TexGen >> shader

Hey,

I wrote a tiny shader to blend some textures together using alphamaps. The shader itself works fine, but I have a problem with the texture coordinate generation: I can’t get the generated coordinates passed to my shader. Without the shader the code works correctly. This is my code:

        self.shader=loader.loadShader('texblend.sha')
        tex0=loader.loadTexture("dirt.png")
        tex1=loader.loadTexture("fungus.png")
        ts0=TextureStage("ts0")
        ts1=TextureStage("ts1")
        self.terrain.setTexGen(ts0,TexGenAttrib.MWorldPosition)
        self.terrain.setTexScale(ts0,self.XScale**-1,self.YScale**-1)
        self.terrain.setTexGen(ts1,TexGenAttrib.MWorldPosition)
        self.terrain.setTexScale(ts1,self.XScale**-1,self.YScale**-1)
        self.terrain.setTexture(ts0,tex0,10)
        self.terrain.setTexture(ts1,tex1,20)
        self.terrain.setShader(self.shader)

No error. Just no working texture coordinates.
Without the self.terrain.setShader line it works great. With that line the model looks the same as it does without texture coordinates. Is there any way to pass them to the shader?

By the way, I use this in my shader, which should be right:

              in float2 l_texcoord0 : TEXCOORD0,

But it only picks up the default texture coords. I tried everything.

Any comments/suggestions?

Just a wild guess: did you assign the vertex texture coordinates (vtx_texcoord0) to the interpolated texture coordinates (l_texcoord0) in the vertex shader part?

void vshader( in  float4 vtx_position  : POSITION,
              in  float2 vtx_texcoord0 : TEXCOORD0,
              out float2 l_texcoord0   : TEXCOORD0,
              .... )
{
    l_texcoord0 = vtx_texcoord0;
...
}

enn0x

I don’t think that’s the problem. I ran into the same thing before, in the shadow mapping shader. Changing the tex scale in the Python script works as you wish without a shader, but once you use a shader, your tex scale is ignored, so you have to apply the scale in the shader too. Every tiny thing like this becomes your own responsibility when you use a shader.
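One way to apply the scale yourself is to pass it in as a shader input, e.g. self.terrain.setShaderInput("texscale", Vec4(1.0/self.XScale, 1.0/self.YScale, 0, 0)), and multiply in the vertex shader. A sketch, assuming Panda3D’s Cg shader conventions (the input name "texscale" is made up here; Panda3D exposes it to the shader as k_texscale):

//Cg
void vshader(in  float4 vtx_position  : POSITION,
             in  float2 vtx_texcoord0 : TEXCOORD0,
             uniform float4 k_texscale,      // set from Python with setShaderInput
             uniform float4x4 mat_modelproj,
             out float2 l_texcoord0   : TEXCOORD0,
             out float4 l_position    : POSITION)
{
    l_position = mul(mat_modelproj, vtx_position);
    // Apply the scale manually -- setTexScale is ignored once a shader is active.
    l_texcoord0 = vtx_texcoord0 * k_texscale.xy;
}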

Right. In particular, setTexGen is your own responsibility if you write a shader, too. Using setTexGen just tells the fixed-function pipeline to implicitly generate texture coordinates, but when you use a shader, you circumvent the fixed-function pipeline. So if you want implicit texture coordinates, you have to write your shader to generate them.
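For the MWorldPosition case above, that means doing the transform yourself in the vertex shader. A minimal sketch, assuming Panda3D’s Cg shader conventions (trans_model_to_world is one of Panda3D’s automatically supplied matrix inputs):

//Cg
void vshader(in  float4 vtx_position : POSITION,
             uniform float4x4 mat_modelproj,
             uniform float4x4 trans_model_to_world,
             out float2 l_texcoord0  : TEXCOORD0,
             out float4 l_position   : POSITION)
{
    l_position = mul(mat_modelproj, vtx_position);
    // Emulate TexGenAttrib.MWorldPosition: use the world-space
    // position of the vertex as its texture coordinate.
    float4 world = mul(trans_model_to_world, vtx_position);
    l_texcoord0 = world.xy;   // pick the plane that suits your terrain
}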

David

And how is that done? I tried multiplying the texcoords in both the vertex and the pixel shader, without success.

In order to generate texture coordinates in the shader, copy the vertex position to the texcoord value.
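For a flat terrain that can be as simple as using the vertex position directly. A complete minimal texblend.sha might look roughly like this (a sketch assuming Panda3D’s Cg conventions; tex_0/tex_1 are the textures on the two stages, and the 0.01 factor is only a stand-in for your actual 1/XScale):

//Cg
void vshader(in  float4 vtx_position : POSITION,
             uniform float4x4 mat_modelproj,
             out float2 l_texcoord0  : TEXCOORD0,
             out float4 l_position   : POSITION)
{
    l_position = mul(mat_modelproj, vtx_position);
    // "Generate" the texture coordinate from the vertex position,
    // scaled into texture space; 0.01 stands in for 1/XScale.
    l_texcoord0 = vtx_position.xy * 0.01;
}

void fshader(in  float2 l_texcoord0 : TEXCOORD0,
             uniform sampler2D tex_0,   // dirt.png   (stage ts0)
             uniform sampler2D tex_1,   // fungus.png (stage ts1)
             out float4 o_color : COLOR)
{
    float4 dirt   = tex2D(tex_0, l_texcoord0);
    float4 fungus = tex2D(tex_1, l_texcoord0);
    // Blend by the second texture's alpha channel.
    o_color = lerp(dirt, fungus, fungus.a);
}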

David

Thanks David, you solved it! I finally have my own blending shader!

One more question: my video card has a maximum of 4 TextureStages. Does this maximum also apply when I only use the TextureStages to pass the textures to the shader?
I mean, can I use more than 4 textures here?