Questions about engine memory usage and scene speed (DEF-2695)


I split the project into layers. Each layer uses about 150-200 MB of memory.



Would you mind sharing your project with me (via the Defold Dashboard) so I can take a look and see if it happens on OSX as well?



I've sent you a PM with a link to an early build. The memory issues are still present.



Thanks. I don’t see anything obviously wrong and the game behaves well on OSX.



What do you mean by "one layer"? Are you talking about GUI layers?



Sorry, I meant one collection. Our level consists of 11 collections/game objects, and a single collection or game object uses 150-200 MB.

BTW, when the project builds, the Editor uses about 6 GB. Is that normal?



I take that back. I took one more look this morning:

You have many tilesources with massive tiles: layer01_01.tilesource has 6 tiles, each being 985x1130 and mostly consisting of transparent pixels. This tilesource alone eats a ton of memory: 985 x 1130 x 6 x 4 bytes ≈ 25 MB, and you have several of a similar size. Some of them seem to consist of low-resolution images of what look like trees or something that have been scaled up to fit the tile size of 985x1130.

You should reconsider this design, as those tilesources will use a lot of memory when loaded. The ground pieces could perhaps still be kept in a tilesource, but in that case you should get rid of all the transparent pixels above each ground piece. The trees could perhaps be stored in a normal atlas at a much lower resolution and then scaled to the appropriate size at runtime instead of being pre-scaled.



@britzl thank you for the idea. We will think about this.



Hi, again!

Thanks again, @britzl, for the idea of deleting transparent pixels from the images. We reduced memory usage from 1 GB to 500 MB, but the problem is still present. I'm ready to send our latest test build for investigation, but I think the problem is related to non-discrete video cards. On machines with discrete video cards, memory usage is about 60-70 MB on the same build, but on notebooks without discrete video cards, or with hybrid video cards where the discrete card is not used, memory usage is about 500-600 MB. Maybe you have a hint on how to resolve the problem.

If you need the build for investigation, I can send it via PM.

PS: We use the latest Defold version.



Well, if the video memory is shared with normal RAM, then I'm not surprised that the massive textures you had took several hundred megabytes of run-time memory. How many atlases and tilesources do you have in your game now, and what are their dimensions?



Some images have Full HD dimensions. We have 25 atlases and 5 tilesources.



How many of them are loaded at a time, and what are their dimensions? Remember that one 4096x4096 atlas takes up 67Mb of memory uncompressed. You should also look into Texture Profiles and make sure to use them to your advantage to reduce memory usage on your target platforms.
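For reference, texture profiles live in a `.texture_profiles` file referenced from game.project. The sketch below is illustrative only (the catch-all path pattern and profile name are made up; field names follow Defold's texture profiles format) and applies WebP compression to everything:

```
path_settings {
  path: "**"
  profile: "Default"
}
profiles {
  name: "Default"
  platforms {
    os: OS_ID_GENERIC
    formats {
      format: TEXTURE_FORMAT_RGBA
      compression_level: BEST
      compression_type: COMPRESSION_TYPE_WEBP
    }
    mipmaps: false
  }
}
```

Note that software compression such as WebP mainly shrinks the bundle on disk; the texture is decompressed when loaded, so to reduce runtime memory you would instead pick a smaller texture format or a hardware-compressed one suited to your target platform.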



I counted the active atlases and tilesources for the level. I use a collection proxy and create some game objects via factories during gameplay. Is the atlas for a not-yet-created game object always loaded with the collection, or only once the object is spawned?

I tried using Texture Profiles, but memory usage did not change. I configured Texture Profiles in Editor 2, so maybe I did it incorrectly. I will try the Editor 1 texture profile configuration.

Do you recommend making one big atlas rather than many small ones?



Hi, again!

It was a long way to resolve the problem (I now understand what my problem was), but I have one request for the engine. My problem was that the high-performance video card was not detected correctly on notebooks with hybrid video cards. Defold does not use the high-performance video card by default (if, for example, the NVIDIA Control Panel is set to auto-select the video card on a hybrid system). If I set it to always use the high-performance video card, then my project uses video card memory instead of notebook RAM, and average memory usage is about 70-150 MB for the large images.

My request: could you add a feature to Defold that auto-selects the high-performance video card on notebooks?



I’m not sure how we decide which graphics card to use in the engine. Maybe @sven or @Mathias_Westerdahl knows?



We currently rely on GLFW to create our window for us on desktop platforms. I cannot say off the top of my head what we need to do in order to make the engine select a better card. But anyways, I’ve created a ticket for it: DEF-2695



I have a gaming laptop with two GPUs, and my guess, after lots of "wtf, why does this game launch on Intel instead of Nvidia", is that it is the GPU driver that tells games apart from regular apps. Also, there's a dropdown in Windows with lots of .exe files that seem to be the executables of games known to Windows.

But I ended up setting everything to run on the Nvidia GPU, since the auto-select option (the default) worked very poorly, with even Civ VI launching on Intel rather than Nvidia.



If you are not lucky enough for the driver to detect the game, you have to use NVAPI. Maybe getting in contact with the driver vendors could get the Defold engine runtimes marked, however that works.

There are hacky ways to force it depending on OS too.



I did this too (enabled "always use NVIDIA GPU"), but our potential players may not be able to do this themselves, so we would need to write instructions for them.



Was that answer meant for me, about using NVAPI? I don't think I can use NVAPI directly; would I need to make a native extension for it?