How to pass scene depth buffer to fragment shader as a texture? (DEF-2699)

I want to use the depth buffer values of a scene for post-processing effects, such as SSAO or scene fog.

Like this kind of thing: http://www.roxlu.com/2014/036/rendering-the-depth-buffer

https://learnopengl.com/#!Advanced-Lighting/SSAO

I should be able to do this?

render.enable_texture(0, self.effect_render_target, render.BUFFER_COLOR_BIT)
render.enable_texture(1, self.effect_render_target, render.BUFFER_DEPTH_BIT)

And then this should work?

uniform lowp sampler2D DIFFUSE_TEXTURE;
uniform lowp sampler2D DEPTH_BUFFER;

Right now I’m getting

Render target does not have a texture for the specified buffer type.
stack traceback:
	[C]: in function 'enable_texture'

When trying

render.enable_texture(1, self.effect_render_target, render.BUFFER_DEPTH_BIT)

I do have a texture set up as a sampler in the material the render target is using.


Is there a sample render script I could look at to see how it’s done correctly?

Here is the render target being set up. What's missing?

local color_params = { format = render.FORMAT_RGBA,
                       width = render.get_window_width(),
                       height = render.get_window_height(),
                       min_filter = render.FILTER_LINEAR,
                       mag_filter = render.FILTER_LINEAR,
                       u_wrap = render.WRAP_CLAMP_TO_EDGE,
                       v_wrap = render.WRAP_CLAMP_TO_EDGE }

local depth_params = { format = render.FORMAT_DEPTH,
                       width = render.get_window_width(),
                       height = render.get_window_height(),
                       u_wrap = render.WRAP_CLAMP_TO_EDGE,
                       v_wrap = render.WRAP_CLAMP_TO_EDGE }

self.effect_render_target = render.render_target("effect_render_target", {
    [render.BUFFER_COLOR_BIT] = color_params,
    [render.BUFFER_DEPTH_BIT] = depth_params })

Sorry, you’re right, we only support color buffers at the moment. I filed an issue for this, DEF-2699

Is that implemented yet? Can we access depth buffer as a texture now?

No, sorry, not many are asking for it so other things keep taking precedence.

People just don’t know what possibilities it opens up, probably.
I want to make a depth of field effect with a simple blur instead of bokeh, for starters.
Without that feature I can still render to another render target to imitate the depth buffer, right?

Yes, of course. You can use render.enable_material to switch to a shader which draws the depth as grayscale or similar.
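
For example, the fragment program of such a "depth" material could simply write the fragment's depth as a grayscale color. This is only a sketch (the file name and material are hypothetical), and you will probably want to linearize or rescale the value to get a readable image:

// depth.fp (hypothetical) -- writes the fragment's depth as grayscale
void main()
{
    // gl_FragCoord.z is the non-linear depth in [0, 1]; with a perspective
    // projection most values end up close to 1.0, so without linearization
    // the output tends to look almost uniformly white
    mediump float d = gl_FragCoord.z;
    gl_FragColor = vec4(d, d, d, 1.0);
}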

Looks like only two textures are allowed per material? I tried passing the original texture, the depth texture and a LUT texture to my post-processing shader, and color grading no longer works…
Is it hard to support more than two textures per material?

That should not be the case. Could you try something safer/simpler to verify that the third texture is actually ignored? The only limiting factor should be the number of texturing units.
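
As a sanity check, here is a minimal sketch of enabling three texture units from the render script before drawing the post-processing predicate. All render target and predicate names are hypothetical; the samplers declared in the post-processing material are matched to the units in declaration order. The LUT is assumed to live in a render target color buffer here only because this form of render.enable_texture takes a render target; a LUT assigned directly as a material texture would not need to be enabled from the render script.

render.enable_texture(0, self.scene_rt, render.BUFFER_COLOR_BIT) -- unit 0: original scene color
render.enable_texture(1, self.depth_rt, render.BUFFER_COLOR_BIT) -- unit 1: depth rendered as color
render.enable_texture(2, self.lut_rt, render.BUFFER_COLOR_BIT)   -- unit 2: color grading LUT
render.draw(self.postprocess_predicate)
render.disable_texture(0)
render.disable_texture(1)
render.disable_texture(2)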

Can the current emphasis on 3D features get this issue solved too? It's needed for a variety of shader effects.

What you need to do is bind a texture to the depth buffer. This is not supported by core GLES 2.0 functionality, which is what we guarantee to support. You can, however, achieve it using the OES_depth_texture extension.
We are currently focusing heavily on both 3D rendering functionality and performance, and part of that is being able to use OpenGL extensions, with functionality to check for support etc.

It will be worked on over the next few sprints. I can't, however, give you a more precise estimate at the moment, other than to say it has very high priority given how much it will bring to the 3D support of Defold.
But other functionality will need to be in place for this to be useful, something that is taken into account and worked on simultaneously.

DEF-2699

Looks like this is still an issue?

Yeah, unfortunately there’s no extension support available in the render scripts that could enable this functionality. I know it has been up for discussion at some point, but I don’t know if there was ever a design drafted for it. Perhaps @Mathias_Westerdahl or @sven has some input on it? I don’t think we can stick to the vanilla ES 2.0 spec forever if we want to move forward with better 3D support, but it’s not up to me to decide :)

Could someone expand on how to do this? Rendering the depth buffer in grayscale would be great for debugging, I think. I’m a bit out of my depth though, if you pardon the pun.

You can render multiple passes to render targets. I assume you do a normal render first, and then a second pass where the material of whatever you want to capture is switched to a sort of depth shader. Then composite those (for example by supplying them as textures to another shader) along with the GUI and whatever other render predicates you use.
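
A minimal render script sketch of that idea, assuming two render targets (self.scene_rt and self.depth_rt), a "depth" material listed in the .render file, and a full-screen quad predicate whose material samples both targets. All names are hypothetical, and the view/projection are assumed to have been set earlier in update():

-- pass 1: normal scene into a color render target
render.set_render_target(self.scene_rt)
render.clear({[render.BUFFER_COLOR_BIT] = vmath.vector4(0, 0, 0, 0), [render.BUFFER_DEPTH_BIT] = 1})
render.draw(self.model_predicate)

-- pass 2: same geometry with the depth material into a second target
render.set_render_target(self.depth_rt)
render.clear({[render.BUFFER_COLOR_BIT] = vmath.vector4(1, 1, 1, 1), [render.BUFFER_DEPTH_BIT] = 1})
render.enable_material("depth")
render.draw(self.model_predicate)
render.disable_material()

-- composite: bind both targets as textures and draw a full-screen quad
render.set_render_target(render.RENDER_TARGET_DEFAULT)
render.enable_texture(0, self.scene_rt, render.BUFFER_COLOR_BIT)
render.enable_texture(1, self.depth_rt, render.BUFFER_COLOR_BIT)
render.draw(self.composite_predicate)
render.disable_texture(0)
render.disable_texture(1)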

This might help, but it’s been a long while since I tested it; beware that it’s old and some of the defaults may have changed.

You list the materials you want to use with this in the .render file.

For example:

render.enable_material("depth")
render.draw(self.model_predicate)
render.disable_material()

Edit: After thinking about your specific use case, all you need to do is add a depth material to your .render file, then have a key in your game which sends a message to @render, and in your render script toggle enabling the depth material when drawing your model predicate. It should be very easy; then tweak the depth fragment program until you get a look that’s useful for your debugging. For getting and using the camera near/far, you’ll want to set them in a constant buffer and then use it when rendering the model predicate, as sketched below.
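
A sketch of the constant buffer part, with hypothetical values and constant names; the "depth" material's fragment program would declare a matching user constant (e.g. u_near_far) and use it to linearize gl_FragCoord.z:

-- camera near/far packed into a vector4 (values here are just examples)
local constants = render.constant_buffer()
constants.u_near_far = vmath.vector4(0.1, 100.0, 0, 0)

render.enable_material("depth")
-- depending on the Defold version, the constant buffer is passed either in an
-- options table or directly as the second argument to render.draw
render.draw(self.model_predicate, {constants = constants})
render.disable_material()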

Example

Thanks a bunch! Looks like what I need. Whenever I mess around with the renderscript, it’s the depth buffer that’s giving me trouble most often. Being able to actually see what’s happening with it is bound to help me in those cases.

Edit: Works like a charm! Still not sure about the “why” when it comes to the glitches I’m experiencing, but at least I can understand the “how” now, if that makes sense.