[RayMarching] Lighting works, unless deferred?

Hi! :smiley:

I am playing around with ray marching (oldschool, not sphere tracing!) inside a 3d simplex noise field.

This works as expected. Though when I fill a texture with the coordinates of each pixel
and have a second, different shader do the normal calculation, I get this:

The GLSL code simply samples the noise at surrounding points to gather enough gradients to build a normal, as usual, which I then use to calculate the diffuse value. For that, all I need is the coordinate of the marched noise-pixel in 3d space.
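For reference, the gradient-from-surrounding-samples approach can be sketched in plain Python (the `field` function here is a hypothetical stand-in for the simplex noise sampler, not the actual shader code):

```python
import math

def field(x, y, z):
    # Stand-in for the 3d simplex noise sampler; any smooth scalar field works.
    return math.sin(x) * math.cos(y) + math.sin(z)

def normal_at(p, eps=1e-3):
    # Central differences: sample the field on both sides of p along each
    # axis to estimate the gradient, then normalize it to get the normal.
    x, y, z = p
    gx = field(x + eps, y, z) - field(x - eps, y, z)
    gy = field(x, y + eps, z) - field(x, y - eps, z)
    gz = field(x, y, z + eps) - field(x, y, z - eps)
    length = math.sqrt(gx * gx + gy * gy + gz * gz)
    return (gx / length, gy / length, gz / length)

def diffuse(p, light_dir):
    # Lambertian term: dot product of normal and light direction, clamped at 0.
    n = normal_at(p)
    d = sum(a * b for a, b in zip(n, light_dir))
    return max(d, 0.0)
```

The key point: the only input this needs is the marched surface position, which is exactly what gets written into the intermediate texture.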

That’s what I pass forward into the next shader.

The calculations are exactly the same, yet I get a different result, which suggests I am doing
something wrong when I create the output texture for the first shader.

Any hints?

Here’s the python code:

# Render the scene into a texture and feed it to the second (lighting) shader.
manager = FilterManager(base.win, base.cam2d)
tex = Texture()

self.quad = manager.renderSceneInto(colortex=tex)
self.quad.setShader(Shader.load(Shader.SL_GLSL, vertex="filter_v.glsl", fragment="filter_f.glsl"))

# Fullscreen quad that runs the ray-marching shader itself.
cm = CardMaker("TheWorldonTwoTriangles")
cm.setFrameFullscreenQuad()
self.Planes = NodePath(cm.generate())
self.Planes.setShader(Shader.load(Shader.SL_GLSL, vertex="v.glsl", fragment="f.glsl"))
self.Planes.setHpr(0, 0, 0)
self.Planes.setBillboardPointEye()
self.Planes.reparentTo(render2d)

Please note that the images above are downsampled from 1600x1000.

Thanks! :smiley:

Heh … just found out there’s a renderQuadInto as well,
but when I use that I only get to see the output of the first shader.

I am doing it wroooong, but I don’t know whaaaat! :slight_smile:

The problem is that your normals are passed as 8-bit, that’s why it looks so “steppy”: you only have 256 different values per channel. Either use a 16-bit texture or have a look at normal quantization.

I completely forgot about that! rgba/xyzw are eight bits per channel. -.-

Thank you so much! :smiley: