Depth buffer copy

Hello, I have a 3D scene with 2D sprites. I want to render a sprite to a separate render target and apply some post effects to it (like blur and bloom). But I need to somehow pass the depth buffer from the main render target to the post-effect RT, to apply a depth mask and cut the sprite.
I think the easiest way to do this would be to copy the depth buffer from one RT to another, but I did not find a way to access it.
Please suggest a solution for this, or should I create an issue and request it as a feature?

You can create the depth buffer of your render target with the flags parameter set to render.TEXTURE_BIT:
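For example, something along these lines in the render script (a minimal sketch; the sprite_rt name and the window-sized buffers are assumptions for illustration, not taken from the thread):

    -- Color attachment parameters (placeholder size: the current window size).
    local color_params = {
        format = render.FORMAT_RGBA,
        width  = render.get_window_width(),
        height = render.get_window_height(),
    }
    -- Depth attachment parameters. The flags = render.TEXTURE_BIT part is what
    -- makes the depth buffer usable as a texture later on.
    local depth_params = {
        format = render.FORMAT_DEPTH,
        width  = render.get_window_width(),
        height = render.get_window_height(),
        flags  = render.TEXTURE_BIT,
    }
    self.sprite_rt = render.new_render_target("sprite_rt", {
        [render.BUFFER_COLOR_BIT] = color_params,
        [render.BUFFER_DEPTH_BIT] = depth_params,
    })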

…or, if you are using the render target resource, you can just enable the “depth texture storage” checkbox.

Now that the depth buffer is a texture, you can bind it to a texture unit using render.enable_texture:
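A sketch of what that could look like in the post-effect pass (the texture unit, the self.sprite_rt field and the tex1 sampler name are assumptions for illustration; the sampler also has to be declared in the post-effect material):

    -- Bind the depth attachment of the render target to texture unit 1
    -- so the post-effect material can sample it (e.g. as tex1).
    render.enable_texture(1, self.sprite_rt, render.BUFFER_DEPTH_BIT)

    -- ... draw the post-effect / full-screen pass here ...

    render.disable_texture(1)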

I do not get what to do with this texture after calling render.enable_texture. That means the texture will be passed as a sampler to the shader, not into the render target's depth buffer.

What exactly do you want to do with the depth buffer? Do you need to use hardware depth testing to reject pixels? Otherwise you can just sample from the depth buffer in your post-processing pass and do whatever you need with the values there.

I want to hide the parts of the sprite that are behind 3D objects and render only the visible part of the sprite to the buffer.

Alright, in that case you can attach the depth buffer as a texture in the post-processing pass and use discard to reject the samples.

I don't get how to do that. What should the statement be in this case?

    // Map the clip-space position (already divided by w, i.e. NDC in [-1, 1])
    // to texture coordinates in [0, 1] for sampling the scene depth texture.
    vec2 duv = (var_gl_position.xy + 1.0) / 2.0;

    // Depth of the 3D scene at this pixel, read from the depth texture (tex1).
    float depth = texture2D(tex1, duv).r;

    // If this fragment is further away than the scene, it is occluded; drop it.
    if (gl_FragCoord.z > depth)
    {
        discard;
    }

Seems like that works.

So there is no way to attach the depth buffer from one RT to another? E.g. we do deferred lighting and need to render transparent materials in a separate forward pass after the scene was lit?

You can't change the attachments, but you can copy the depth values in a shader.

You can enable the depth buffer as a texture in a shader since Defold 1.6.0.

Linearised?
So you check your z against the linearised depth value to discard?

You can linearize it manually; that's not something the GPU does automatically.
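For reference (assuming a standard perspective projection, which the thread does not state): with near and far plane distances near and far, a depth-texture sample d can be remapped to NDC with z = 2.0 * d - 1.0 and then linearized to eye-space depth with linear_depth = (2.0 * near * far) / (far + near - z * (far - near)).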

I honestly don't remember what I did for this picture. But I remember that in the end I dropped this idea and used MRT rendering to compute depth as the distance from the camera to the fragment, where I simply wrote the values into one of the textures in a format that was clear to me (used for a TiltShift effect). The depth buffer, I think, is not linear; it was difficult for me to make sense of the final values.

OK, passing the depth as a texture works for me.
If someone wants to try it and is wondering what var_gl_position.xy is in @zendorx's code: it is the clip-space position (gl_Position passed from the vertex program) divided by its W component.
