DX11 depth buffer > distance

How would I map the output of the depth buffer (normalised values) back to vvvv units, so a pixel on a sphere 10.3 units away from me will give a depth value of 10.3?


(i.e. back to view space… I remember reading the depth map is not linear)

ok, it seems you just need to multiply the depth value by the inverse projection transform

so this multiplication works when patched -

vector (0,0,depth value) > Multiply (3d) with the inverse PT

but in a shader I don’t get the same result using -

col = mul(col, InverseProjection); where col = (0,0,depth,1)

The result of this function in the shader is that the blue (z) channel becomes 1…

Any ideas?

When sampling the depth buffer you only need the red channel, so make sure you only read that.

You don't need to multiply by the inverse projection either; you can, but it's more expensive. (Also please note that the math in the built-in perspective transform node is wrong, the near plane is twice the value in the M33 component, go figure why… so in some cases you might also have small issues.)
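For reference, if you do go the inverse-projection route in a shader, you also need the homogeneous divide by w after the multiply, which may be why the patched Multiply and the shader mul gave different results. A minimal sketch, where the texture, sampler and variable names are just assumptions:

Texture2D DepthTex;
SamplerState PointSampler;
float4x4 tPI; // inverse projection matrix, bound from the patch

float3 ViewFromInverseProjection(float2 uv)
{
    float d = DepthTex.Sample(PointSampler, uv).r;    // red channel only
    float2 ndc = uv * float2(2, -2) + float2(-1, 1);  // uv (0..1) -> NDC (-1..1), y flipped
    float4 p = mul(float4(ndc, d, 1), tPI);           // back through the inverse projection
    return p.xyz / p.w;                               // the divide by w is the step that is easy to miss
}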

To get linear depth, with d being your depth buffer value and tP your projection matrix:

float ld = tP._43 / (d - tP._33);
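(For reference: with a standard D3D perspective projection, the stored depth is d = (z * tP._33 + tP._43) / z = tP._33 + tP._43 / z, so solving for z gives the expression above.)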

then to get back to view space:

float3 ViewSpace(float2 ScreenSpace, float linearDepth)
{
    // ScreenSpace is the projected position in NDC (-1..1 on both axes)
    float2 screenSpaceRay = float2(ScreenSpace.x / tP._11,
                                   ScreenSpace.y / tP._22);

    float3 posView;
    posView.z = linearDepth;
    posView.xy = screenSpaceRay.xy * posView.z; // scale the view ray by the linear depth

    return posView;
}
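
For reference, a minimal pixel shader sketch putting the two pieces together, using the ViewSpace function above (the texture, sampler and input names are placeholders, not from the original shader):

Texture2D DepthTex;
SamplerState PointSampler;
float4x4 tP; // projection matrix, bound from the patch

float4 PS(float4 pos : SV_Position, float2 uv : TEXCOORD0) : SV_Target
{
    float d  = DepthTex.Sample(PointSampler, uv).r;    // red channel only
    float ld = tP._43 / (d - tP._33);                  // linear depth in view units
    float2 ndc = uv * float2(2, -2) + float2(-1, 1);   // uv (0..1) -> NDC (-1..1)
    float3 posV = ViewSpace(ndc, ld);                  // view space position
    return float4(posV, 1);
}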

I have this shader around, I'll add it in girlpower, but it should be easy to rewrite as a tfx, good exercise ;)

Please note that the blue channel becoming 1 is often normal, since you need a floating point texture; otherwise any pixel with distance > 1 unit will have blue saturated.

Thanks J

I don't understand the notation with an underscore though, e.g. 'tP._11'

Ah, is it a way of accessing a row/column entry in the matrix?

Yes exactly.

nice, didn't know that