Hi!
I am playing around with ray marching (old-school fixed-step marching, not sphere tracing!) inside a 3D simplex noise field.
This works as expected. However, when I instead fill a texture with the 3D coordinate of each marched pixel
and have a second, different shader do the normal calculation, I get this:
The GLSL code simply samples the noise at points surrounding the hit position to gather enough gradients to build a normal, as usual, which I then use to compute the diffuse value. All that needs is the 3D coordinate of the marched noise pixel.
That’s what I pass forward into the next shader.
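To make the idea concrete, here is a minimal pure-Python sketch of the gradient-to-normal step (the `field` function is an analytic stand-in for the simplex noise, and `eps` plus the central-difference scheme are illustrative assumptions, not my exact shader code):

```python
import math

def field(x, y, z):
    # Stand-in for the 3D simplex noise: a smooth analytic scalar field.
    return math.sin(x) * math.cos(y) + 0.5 * math.sin(z)

def normal_at(p, eps=1e-3):
    # Central differences: sample the field around p to estimate the
    # gradient, then normalize it to use as the surface normal.
    x, y, z = p
    gx = field(x + eps, y, z) - field(x - eps, y, z)
    gy = field(x, y + eps, z) - field(x, y - eps, z)
    gz = field(x, y, z + eps) - field(x, y, z - eps)
    length = math.sqrt(gx * gx + gy * gy + gz * gz)
    return (gx / length, gy / length, gz / length)

def diffuse(p, light_dir):
    # Lambertian term: clamp(dot(N, L), 0, 1).
    n = normal_at(p)
    d = sum(a * b for a, b in zip(n, light_dir))
    return max(0.0, min(1.0, d))
```

The only input this needs is `p`, the marched hit position, which is exactly what the first shader writes into the texture.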
The calculations are exactly the same, yet I get a different result, so I suspect I am doing
something wrong when creating the output texture for the first shader.
Any hints?
Here’s the python code:
```python
# Post-processing setup: render the scene into "tex", then run the
# normal-calculation filter over it on a fullscreen quad.
manager = FilterManager(base.win, base.cam2d)
tex = Texture()
self.quad = manager.renderSceneInto(colortex=tex)
self.quad.setShader(Shader.load(Shader.SL_GLSL, vertex="filter_v.glsl", fragment="filter_f.glsl"))

# Fullscreen card that runs the actual ray-marching shader.
cm = CardMaker("TheWorldonTwoTriangles")
cm.setFrameFullscreenQuad()
self.Planes = NodePath(cm.generate())
self.Planes.setShader(Shader.load(Shader.SL_GLSL, vertex="v.glsl", fragment="f.glsl"))
self.Planes.setHpr(0, 0, 0)
self.Planes.setBillboardPointEye()
self.Planes.reparentTo(render2d)
```
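One thing I am wondering about is precision: if the intermediate texture ends up in a default 8-bit color format, the coordinates written by the first shader get snapped to 256 levels per channel, which is far coarser than the epsilon used for the gradient samples. A quick pure-Python sanity check of that effect (the `quantize8` helper and the [-2, 2] coordinate range are just my illustrative assumptions, not Panda3D behavior):

```python
def quantize8(v, lo=-2.0, hi=2.0):
    # Simulate storing a float in one 8-bit texture channel mapped to
    # [lo, hi]: the value snaps to one of 256 levels.
    step = (hi - lo) / 255.0
    return lo + round((v - lo) / step) * step

# Two marched hit points that are 0.001 apart in world space...
a, b = 0.300, 0.301
qa, qb = quantize8(a), quantize8(b)
# ...collapse onto the same stored level, because the quantization step
# over a [-2, 2] range is 4 / 255, roughly 0.0157.
print(qa == qb)  # True: both points read back as the same coordinate
```

So if this is what is happening, the second shader would be sampling the noise around stair-stepped positions instead of the true hit points, which would explain getting a different result from identical math.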
Please note that the images above are downsampled from 1600x1000.
Thanks!