Render target as a texture resource

I would like to draw something to a render target and then use it as a normal texture resource.

I have already used render targets for some postprocessing; but for postprocessing one binds the render target to a texture unit in the render_script and uses it as a texture in some material (also bound in the render_script).
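For reference, the postprocessing flow I mean looks roughly like this in a render_script (a minimal sketch; the names `self.rt`, `self.scene_pred` and `self.quad_pred` are mine, not from an actual project):

```lua
-- Minimal render_script sketch of the postprocessing flow described above.
function init(self)
    self.scene_pred = render.predicate({ "tile" })
    self.quad_pred = render.predicate({ "postprocess" })
    local color_params = {
        format = render.FORMAT_RGBA,
        width = render.get_window_width(),
        height = render.get_window_height(),
    }
    self.rt = render.render_target("scene", { [render.BUFFER_COLOR_BIT] = color_params })
end

function update(self)
    -- 1) draw the scene into the render target
    render.set_render_target(self.rt)
    render.clear({ [render.BUFFER_COLOR_BIT] = vmath.vector4(0, 0, 0, 1) })
    render.draw(self.scene_pred)
    render.set_render_target(render.RENDER_TARGET_DEFAULT)

    -- 2) bind the RT's color attachment to texture unit 0 and draw a
    --    fullscreen quad whose material samples it
    render.enable_texture(0, self.rt, render.BUFFER_COLOR_BIT)
    render.draw(self.quad_pred)
    render.disable_texture(0)
end
```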

I would like to be able to create a texture resource and use it, say, with a mesh without binding it in the render_script.

I hope this makes sense and that someone has already done this with Defold.


Hmm, is this achievable currently @jhonny.goransson ?

You cannot do it without the render_script.
But you can pass a render target to any material, for whichever predicates you want.
I’ll make an example for you with a normal texture later today.


Here is an example of how to write normals to an RT and then read them in another shader:

I hope this is helpful, if you have additional questions just ask :slight_smile:



There is, however, a misunderstanding. When I wrote “a normal texture” I didn’t mean “a texture for normal data”, I just meant “a regular texture”, like any other texture resource. :slight_smile: My fault, my English is a bit rough here…

I have already used RTs for post-processing. But I would like to create a texture at runtime (using an RT) and then use it for a mesh in the game like any other texture resource. I would greatly prefer not to “pollute” the render_script with game-specific details.

I hope I have been able to better explain my question.


I am afraid the only way to bind an RT is the render_script. The normal texture was just an example, though; you can do the same for a mesh with any other data :slight_smile:

Could you describe your case in more detail?

Thanks again!

I really appreciate your help.

As I said, I know how to use RTs and materials in the render_script; I have already implemented a distortion postprocess. However, the point is: I don’t want code in the render_script for a very specific moment in the game. I would prefer to keep the render_script as agnostic as possible with respect to the game.

Let me give an example of one of my possible use cases. When the player beats a boss, I want to show a waving flag with some data written on it (the time taken to beat the boss, some localized text). So, you see, the texture for this flag cannot, in any reasonable way, be prepared at design time; it must be composed at runtime. On the other hand, to get the waving effect, I can use a shader or animate the vertices of a mesh.

Of course I can write some code in the render_script that draws certain predicates to an RT, binds the RT to a texture unit, binds the material of the mesh, and draws the mesh with its own predicate. All this is clear to me, no problem. BUT I don’t like this approach, since I would be writing code in the render_script that is too specific to that particular result screen after beating a boss. And then, probably, I will want something similar, but maybe not identical, at another specific point in the game, so I will need yet more code in the render_script. I hope I have conveyed the idea…
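Concretely, the render_script code I would rather avoid would look something like this (a sketch; `self.flag_rt` and the predicate names are made up for illustration):

```lua
-- Sketch of the per-frame render_script code described above (assumed names).
-- Draw the flag content (timer text, localized labels) into a dedicated RT...
render.set_render_target(self.flag_rt)
render.clear({ [render.BUFFER_COLOR_BIT] = vmath.vector4(0, 0, 0, 0) })
render.draw(self.flag_content_pred)
render.set_render_target(render.RENDER_TARGET_DEFAULT)

-- ...then bind it to texture unit 0 and draw the waving-flag mesh,
-- whose material samples that unit.
render.enable_texture(0, self.flag_rt, render.BUFFER_COLOR_BIT)
render.draw(self.flag_mesh_pred)
render.disable_texture(0)
```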

Note also that with this approach, the texture for the flag is redrawn (identical to itself) every frame the flag is on screen, which is not optimal.

On the other hand, what I hoped to do is: (1) draw the flag into an RT in the render_script, (2) generate a texture resource from this RT, (3) use this texture resource on a mesh. This way the render_script is a bit more decoupled from the game code, and the RT is not redrawn every frame.

My apologies for the long post. I hope it makes some sense…


Did you consider creating a texture at runtime?
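For context, creating a standalone texture at runtime is already possible with `resource.create_texture`; a rough sketch (the path, sizes, and the `#mesh`/`texture0` names are made up, and this does not by itself connect the texture to an RT):

```lua
-- Gameobject script sketch: create a texture at runtime and assign it to a mesh.
function init(self)
    local w, h = 128, 128
    -- one rgba uint8 stream, w*h pixels (left zeroed here for brevity)
    local tbuf = buffer.create(w * h, {
        { name = hash("rgba"), type = buffer.VALUE_TYPE_UINT8, count = 4 },
    })
    local tparams = {
        width  = w,
        height = h,
        type   = resource.TEXTURE_TYPE_2D,
        format = resource.TEXTURE_FORMAT_RGBA,
    }
    self.my_texture = resource.create_texture("/runtime/flag.texturec", tparams, tbuf)
    -- assign the new texture resource to the mesh component's sampler
    go.set("#mesh", "texture0", self.my_texture)
end
```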

Yes, I read that.

But can I create the texture in the update of the render_script? And if I could, how can I reference the buffer of an RT when creating the texture? Or, the other way around, may I create a texture in advance and then fill its buffer with the data of an RT drawn in the render_script update? But how…

I think we are lacking some small bits in the engine to support this. We need either:

  1. a way of getting the texture handle from a render target + a way of binding it to a resource the same way as we do for buffers (pseudo-code):
-- in render script:
local t_handle = render.get_texture(self.my_rt)
msg.post(..., "my_rt", { texture = t_handle })
-- in some gameobject script:
resource.set_texture("some_path", { transfer_ownership = true }, self.my_texture)
  2. a way of setting a render target attachment from a texture (pseudo-code):
-- in gameobject script
local info = resource.get_texture_info(...)
msg.post(..., { texture = info.texture })
-- in render script
render.render_target({ [render.BUFFER_COLOR_BIT] = { texture = self.my_texture } })

We almost have all the pieces, we just need to figure out the API…


Thanks! I see what you mean! Now I have a confirmation that for the moment this is not possible.

I will change the design, no more waving flags :joy:

But it is something that we should support. I refactored how we represent graphics resources in the engine to be able to do stuff like this, so I might take a look at it when I have some free time :sweat_smile:


Thank you very much!

I have spent the last two years using Unity. Eventually I had learnt not to hate it, and it wasn’t easy. Then, for a very simple project, I went back to Defold and realised that I might as well uninstall Unity and use Defold for the next (real) game too! I prefer Defold to Unity for 2D for many, many reasons!

In these two years, Defold has grown so much. But in Unity, creating a texture resource from an RT was possible, and it would be nice to have that in Defold too.

Anyway, long live Defold :slight_smile:


Sure, I’ll make it happen!