I would like to draw something to a render target and then use it as a normal texture resource.
I have already used a render target for some post-processing; but for post-processing one binds the render target to a texture unit in the render script and uses it as a texture in some material (also bound in the render script).
I would like to be able to create a texture resource and use it, say, with a mesh without binding it in the render_script.
I hope this makes sense and that someone has already done this with Defold.
You cannot do it without the render script. But you can pass a render target to any material, for whichever predicates you want.
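To illustrate the approach described above, here is a minimal render script sketch: draw one predicate into a render target, then bind that target's color buffer as a texture while drawing another predicate. All names (`flag`, `flag_mesh`, sizes) are placeholders, not from an actual project.

```lua
-- Hypothetical render script fragment; "flag" / "flag_mesh" are placeholder material tags.
function init(self)
    local color_params = { format = render.FORMAT_RGBA, width = 512, height = 512 }
    self.flag_rt = render.render_target("flag", { [render.BUFFER_COLOR_BIT] = color_params })
    self.flag_pred = render.predicate({ "flag" })       -- drawn into the render target
    self.mesh_pred = render.predicate({ "flag_mesh" })  -- material that samples the render target
end

function update(self)
    -- draw everything tagged "flag" into the render target
    render.set_render_target(self.flag_rt)
    render.clear({ [render.BUFFER_COLOR_BIT] = vmath.vector4(0, 0, 0, 0) })
    render.draw(self.flag_pred)
    render.set_render_target(render.RENDER_TARGET_DEFAULT)

    -- bind the render target's color buffer to texture unit 0 and draw the mesh
    render.enable_texture(0, self.flag_rt, render.BUFFER_COLOR_BIT)
    render.draw(self.mesh_pred)
    render.disable_texture(0)
end
```

The mesh's material then samples texture unit 0 like any other texture.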
I’ll make an example for you with a normal texture later today.
There is however a misunderstanding. When I wrote “a normal texture” I didn’t mean “a texture for normal data”, I just meant “an ordinary texture”, like any other texture resource. My fault, my English is a bit rough…
I have already used RTs for post-processing. But I would like to create a texture at runtime (using an RT) and then use it on a mesh in the game like any other texture resource. I would greatly prefer not to “pollute” the render script with game-specific details.
I hope I have been able to better explain my question.
As I said, I know how to use RTs and materials in the render script; I have already implemented a distortion post-process. The point, however, is that I don’t want code in the render script for one very specific moment in the game. I would prefer to keep the render script as agnostic as possible with respect to the game.
Let me give an example of one of my possible use cases. When the player beats a boss I want to show a waving flag with some data written on it (the time taken to beat the boss, some localized text). So, you see, the texture for this flag cannot, in any reasonable way, be prepared at design time; it must be composed at runtime. On the other hand, to get the waving effect, I can use a shader or animate the vertices of a mesh.
Of course I can write some code in the render script that draws certain predicates to an RT, binds the RT to a texture unit, binds the material of the mesh and draws the mesh with its own predicate. All this is clear to me, no problem. BUT I don’t like this approach, since I would be writing code in the render script that is too specific to that particular result screen after beating a boss. And then, probably, I will want something similar, but maybe not identical, at another specific point in the game, so I would need yet more code in the render script. I hope I have conveyed the idea…
Note also that with this approach the texture for the flag is redrawn (identical to itself) every frame the flag is on screen, which is not optimal.
What I hoped to do instead is: (1) draw the flag into an RT in the render script, (2) generate a texture resource from this RT, (3) use this texture resource on a mesh. This way the render script is a bit more decoupled from the game code and the RT is not redrawn every frame.
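For the “use it like any other texture resource” part, recent Defold versions do support creating and filling a texture resource at runtime from game code (outside the render script) with `resource.create_texture` / `resource.set_texture`, and assigning it to a mesh with `go.set`. This is a sketch under that assumption; it fills the texture from a CPU-side buffer, which is not quite the RT-to-texture path asked about here (that is the part the engine did not support at the time of this thread). The path `/runtime/flag.texturec` and the component id `#mesh` are placeholders.

```lua
-- Sketch, assuming a Defold version with resource.create_texture / resource.set_texture.
local w, h = 256, 256
local tparams = {
    type   = resource.TEXTURE_TYPE_2D,
    width  = w,
    height = h,
    format = resource.TEXTURE_FORMAT_RGBA,
}
-- "/runtime/flag.texturec" is a made-up path for this example
local tex_id = resource.create_texture("/runtime/flag.texturec", tparams)

-- fill it from a CPU-side buffer (here: plain white; composing the actual
-- flag pixels is up to the game code)
local buf = buffer.create(w * h, { { name = hash("rgba"), type = buffer.VALUE_TYPE_UINT8, count = 4 } })
local stream = buffer.get_stream(buf, hash("rgba"))
for i = 1, w * h * 4 do
    stream[i] = 255
end
resource.set_texture(tex_id, tparams, buf)

-- use it on the mesh like any other texture resource
go.set("#mesh", "texture0", tex_id)
```

None of this touches the render script, which is exactly the decoupling described above; only the “fill it from an RT instead of a buffer” step is missing.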
My apologies for the long post. I hope it makes some sense…
But can I create the texture in the update of the render script? And, if I could, how do I reference the buffer of an RT when creating the texture? Or, the other way around, may I create a texture in advance and then fill its buffer with the data of an RT drawn in the render script update? But how…
It is something that we should support, though. I refactored how we represent graphics resources in the engine to enable stuff like this, so I might take a look at it when I have some free time.
I have spent the last two years using Unity. Eventually I learnt not to hate it, and it wasn’t easy. Then, for a very simple project, I went back to Defold and realised that I might as well uninstall Unity and use Defold for the next (real) game too! I prefer Defold to Unity for 2D for many, many reasons!
In these two years Defold has grown so much. But in Unity creating a texture resource from an RT was possible, and it would be nice to have that in Defold too.
Hello, I am considering Defold for my next project and this feature would be very helpful. Is there any information on when it might be implemented (if it isn’t already)? Thanks!
I think @Pawel might have one or two, but I think we should make an example. Not sure what a good example project would be however, maybe some sort of texture post-processing example?
It’s a bit embarrassing that I’m not making much progress at all with this.
The idea is to have an avatar which can be designed by the user, with loads of different hair/eye/nose/etc combinations. Drawing of the avatar takes place off screen, and would then be rendered to a texture to use in-game.
I’ve set up the avatar design, and created a render target.
In my render script I’ve added the following test code, just before the GUI render takes place (ignore the values – the real numbers will go in once I get it working!)
```lua
-- * AVATAR BOY
if not self.do_test_once then
    self.do_test_once = true
    -- * set projection to the off-screen avatar (boy)
    local avatar_half_x = 340 / 2 -- * avatar width / 2
    local avatar_half_y = 300 / 2 -- * avatar height / 2
    local pos_x = 300             -- * avatar off-screen position x
    local pos_y = -360            -- * avatar off-screen position y
    local window_proj = vmath.matrix4_orthographic(
        pos_x - avatar_half_x, pos_x + avatar_half_x,
        pos_y - avatar_half_y, pos_y + avatar_half_y,
        DEFAULT_NEAR, DEFAULT_FAR)
    render.set_projection(window_proj)
    -- * set up the target for our render and clear it.
    -- render.set_render_target() takes a render target object, not a name string:
    -- self.avatar_boy_rt must be created with render.render_target() in init()
    render.set_render_target(self.avatar_boy_rt)
    render.clear({
        [render.BUFFER_COLOR_BIT] = vmath.vector4(0, 0, 0, 0),
        [render.BUFFER_DEPTH_BIT] = 1,
        [render.BUFFER_STENCIL_BIT] = 0 })
    -- * the actual draw: everything in the gui predicate.
    -- the frustum is passed inside an options table
    render.draw(predicates.gui, { frustum = camera_world.frustum })
    -- * reset the render target
    render.set_render_target(render.RENDER_TARGET_DEFAULT)
end
```
I’m sorry to say I’ve no idea whether this is the correct code, and even if it were, I don’t know how to use the render target as a texture in my GUI.
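For the “use it in my GUI” part, one common pattern is to give the GUI node that should show the avatar a custom material with its own tag, add a matching predicate in the render script, and bind the render target before drawing that predicate. A sketch, assuming the render target `self.avatar_boy_rt` from above and a made-up material tag `avatar_rt`:

```lua
-- in init(): a predicate for GUI nodes whose material is tagged "avatar_rt" (hypothetical tag)
self.avatar_pred = render.predicate({ "avatar_rt" })

-- in update(), where the GUI is drawn: bind the render target's color buffer
-- to texture unit 0 for just this predicate
render.enable_texture(0, self.avatar_boy_rt, render.BUFFER_COLOR_BIT)
render.draw(self.avatar_pred)
render.disable_texture(0)
```

The custom material's fragment shader then samples texture unit 0, so the node displays whatever was rendered into the target; the node itself needs no texture assigned in the editor.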