Hey guys,
I’m having some trouble attempting to render the depth buffer of a scene. It appears as though it is still just rendering the color buffer, and I can’t figure out why. I’m using the latest Windows release of Panda3D.
Any ideas? Thanks!
Python:
...
manager = FilterManager(base.win, base.cam)
tex = Texture()
dtex = Texture()
quad = manager.renderSceneInto(colortex=tex, depthtex=dtex)
quad.setShader(Shader.load("depthviz.sha"))
quad.setShaderInput("dtex", dtex)
...
Shader:
void vshader(
    float4 vtx_position : POSITION,
    float2 vtx_texcoord0 : TEXCOORD0,
    uniform float4 texpad_dtex,
    out float4 l_position : POSITION,
    out float2 l_texcoord0 : TEXCOORD0,
    uniform float4x4 mat_modelproj)
{
    l_position = mul(mat_modelproj, vtx_position);
    l_texcoord0 = vtx_position.xz * texpad_dtex.xy + texpad_dtex.xy;
}

void fshader(
    float2 l_texcoord0 : TEXCOORD0,
    uniform sampler2D tex_dtex,
    out float4 o_color : COLOR)
{
    float4 res = tex2D(tex_dtex, l_texcoord0);
    o_color = res;
}
You have to set the filters, as in the shadow sample:
dtex.setMinfilter(Texture.FTShadow)
dtex.setMagfilter(Texture.FTShadow)
Thanks for the quick response. However, doing what you suggested doesn’t seem to help. I have included all the code below; is there anything else that I am missing?
import direct.directbase.DirectStart
from direct.filter.FilterManager import FilterManager
from pandac.PandaModules import *

manager = FilterManager(base.win, base.cam)
tex = Texture()
dtex = Texture()
dtex.setMinfilter(Texture.FTShadow)
dtex.setMagfilter(Texture.FTShadow)
quad = manager.renderSceneInto(colortex=tex, depthtex=dtex)
quad.setShader(Shader.load("depthviz.sha"))
quad.setShaderInput("dtex", dtex)

# camera setup
base.cam.setPos(0, 5, 2)
base.cam.lookAt(Point3(0, 0, 0), Vec3(0, 0, 1))

monkey = loader.loadModel('monkey')
monkey.reparentTo(render)

base.disableMouse()
run()
Eh, sorry, I wasn’t thinking. It has nothing to do with shadows.
It’s the format:
dtex.setFormat(Texture.FDepthComponent)
Okay, I tried that as well, but still no luck. Browsing through the source and docs, the texture format should be assigned automatically by the addRenderTexture call made inside the FilterManager::renderSceneInto function. Is anyone able to set up a quick, simple program that renders the depth buffer to the screen using the FilterManager?
Ok, I found the issue and resolved it.
In the fragment shader I had the parameter "uniform sampler2D tex_dtex", which should have been "uniform sampler2D tex_1". Then in the program I manually added the depth texture to the quad.
My problem was that I assumed that quad.setShaderInput("dtex", dtex) passed the texture to the shader, which I don’t think it does. The manual page at panda3d.org/manual/index.php/Gener … ge_Filters says something along these lines, so it may be an issue with the filter shader generator, or still with my understanding.
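For anyone who runs into the same thing, here is a minimal sketch of the workaround described above. It is untested here and assumes the same depthviz.sha shader with its sampler renamed to tex_1; the TextureStage name "dtex" is arbitrary. The idea is that renderSceneInto already binds the color texture to the quad's first texture stage (tex_0), so manually applying the depth texture on a second stage makes it reachable as tex_1:

```python
import direct.directbase.DirectStart
from direct.filter.FilterManager import FilterManager
from pandac.PandaModules import *

manager = FilterManager(base.win, base.cam)
tex = Texture()
dtex = Texture()
quad = manager.renderSceneInto(colortex=tex, depthtex=dtex)

# Manually bind the depth texture to an extra texture stage on the quad,
# so the shader can access it as tex_1 (the color texture is tex_0).
quad.setTexture(TextureStage('dtex'), dtex)  # stage name is arbitrary
quad.setShader(Shader.load("depthviz.sha"))

run()
```

Whether setShaderInput is supposed to cover this case too is still an open question as far as this thread goes.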