You can create a .txt file with all the “knowledge” you want such a GPT to look up before answering — copy in documentation, examples, the Defold API, maybe some source code from your projects. (It looks like you have to ask for this explicitly every time you want it to consult the file.) That way it becomes a better search engine: ask it something and it should answer based on that knowledge, so you avoid situations where it hallucinates and invents functions that don’t exist (this still happens sometimes, so double-check).
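If you have the documentation spread across several files, a small script can merge them into that single knowledge .txt. Here is a minimal sketch — the function name and the source paths are just placeholders, not anything the GPT tooling requires:

```python
from pathlib import Path

def build_knowledge_file(sources, out_path="knowledge.txt"):
    """Concatenate documentation files into one knowledge file,
    with a header line marking where each source begins."""
    parts = []
    for src in sources:
        text = Path(src).read_text(encoding="utf-8")
        parts.append(f"===== {src} =====\n{text.strip()}\n")
    Path(out_path).write_text("\n".join(parts), encoding="utf-8")
    return out_path

# Hypothetical usage — substitute your own doc/API/source files:
# build_knowledge_file(["defold_api.txt", "examples.txt", "my_code.lua"])
```

The header lines are only there so you (and the model) can tell which document a given passage came from.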
You can also “tell it” how it should answer you, e.g. “assume I know Lua, always answer concisely, and explain/elaborate only when I explicitly ask” or something like that.
Btw. I’m also testing out Cursor and VSCode extensions like Codeium or twinny (which uses locally hosted LLMs, so it’s worse than the best LLMs accessed through an API, but it’s the one I managed to run on a laptop with 32 GB RAM). Cursor and Codeium look like the direction programming is going — they read your code, analyse it, can sometimes help you find a bug or a problem, and are really useful for documentation (e.g. I tested this while writing the documentation for Squid, and only a few times did I have to fix things manually afterwards). I still don’t think they’re useful for coding if you’re not aware of what you’re writing, because they might inject nasty bugs you’ll have trouble finding, or give you a solution not suited to your needs — but I do admit they’re somehow getting better and better. I can’t use them for my job, but maybe locally run, isolated, private solutions will be widespread and useful in the future.
I haven’t tested Copilot, because it asks for access to all my repositories, even private ones, and I don’t know if there’s a way to avoid that.