Hi, I'm new to OpenGL and shaders. Also, my math is poor, so matrices are throwing me for a loop.
I'm trying to understand them while working through this task, and bits and pieces are coming together, but it's a lot of things at once, so I'm getting confused.
I'm trying to take one grayscale sf::Texture, and use it as the alpha channel for another sf::Texture.
Okay, easy enough:
gl_FragColor = (texture2D(texture, gl_TexCoord[0].xy) * gl_Color);
gl_FragColor.a = texture2D(mask, gl_TexCoord[0].xy).a;
(where 'mask' is declared as uniform sampler2D mask; and passed into the shader)
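Put together, the whole fragment shader is essentially this (with 'texture' and 'mask' both declared as sampler2D uniforms and bound from the C++ side):

uniform sampler2D texture; // the main texture SFML binds for the sprite
uniform sampler2D mask;    // the grayscale mask, passed in separately

void main()
{
    // sample the main texture and tint it with the vertex color
    gl_FragColor = texture2D(texture, gl_TexCoord[0].xy) * gl_Color;

    // take the alpha from the mask (same coordinates for both, for now)
    gl_FragColor.a = texture2D(mask, gl_TexCoord[0].xy).a;
}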
My difficulties arise because:
A) I want the textures to be able to be mirrored horizontally and/or vertically.
B) I want the textures to be able to be rotated (the mask rotated relative to the main texture, and the mask only rotated in 90-degree increments).
C) I want the mask to stretch itself to conform to the size of the texture if the mask happens to be larger, smaller, or of unequal width/height.
D) I want the two textures to be rotated separately from each other.
For the sake of convenience and experimentation, I'm doing this in a C++ function called 'SfmlDrawTextureWithMask()', where I pass in the textures separately with their rotations and other settings.
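Stripped down to the relevant parts, it looks roughly like this (assuming SFML 2.4+, where sf::Shader::setUniform exists; older versions spell it setParameter, and the uniform names are just mine):

#include <SFML/Graphics.hpp>

// Rough sketch only - the real function also takes rotations, mirroring, and other settings.
void SfmlDrawTextureWithMask(sf::RenderTarget& target,
                             sf::Shader& shader,
                             const sf::Texture& texture,
                             const sf::Texture& mask,
                             const sf::Transform& textureOrientation)
{
    sf::Sprite sprite(texture);

    shader.setUniform("texture", sf::Shader::CurrentTexture); // the sprite's own texture
    shader.setUniform("mask", mask);                          // the extra sampler2D
    shader.setUniform("textureOrientation", sf::Glsl::Mat4(textureOrientation));

    target.draw(sprite, &shader); // the shader rides along in sf::RenderStates
}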
Right now, I'm playing with a vertex shader like this (based on SFML's basic vertex shader example):
uniform mat4 textureOrientation;
//uniform mat4 maskOrientation;
void main()
{
// transform the vertex position
gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
// transform the texture coordinates
gl_TexCoord[0] = gl_TextureMatrix[0] * textureOrientation * gl_MultiTexCoord0;
// forward the vertex color
gl_FrontColor = gl_Color;
}
My problems so far are:
A) 'textureOrientation' isn't stretching the result to fill the texture; it's sampling outside of the texture and just produces the usual bands of border pixels.
I realize that this is because I'm rotating the texture coordinates and not the vertices themselves, and that I can use sf::Transform to rotate the primary sf::Texture by passing it into the second parameter of sf::RenderTarget::draw(). Still, since I want to rotate the mask texture separately, I'm not sure exactly how to compensate for the rotation. I guess I could pass in the mask texture's size, use it to normalize the mask's texture coordinates, and then re-adjust with width and height swapped?
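For instance, my rough guess (untested) is to build the mask's matrix on the C++ side so it rotates the texture coordinates around the center of the [0,1] range instead of around the origin, using the maskOrientation uniform that is currently commented out in my vertex shader:

sf::Transform maskOrientation;
maskOrientation.translate(0.5f, 0.5f)    // move the pivot to the middle of the texture coordinates
               .rotate(90.f)             // rotate in a 90-degree step
               .translate(-0.5f, -0.5f); // move the pivot back

shader.setUniform("maskOrientation", sf::Glsl::Mat4(maskOrientation)); // again assuming setUniform

Is that the right way to keep the coordinates inside the texture, or am I thinking about it backwards?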
B) I don't know how gl_TexCoord[0] is created or passed to the fragment shader. It looks like an array (because of my C++ background), but I don't know the size of the array, so I don't know whether I can just start using gl_TexCoord[1] to hold my mask texture's coordinates or whether that would overflow the buffer.
Why do gl_TexCoord[] and gl_MultiTexCoord0 exist? That makes it seem like OpenGL supports binding multiple textures at once, but SFML only publicly exposes binding one texture at a time. Am I way off track here?
Is gl_TexCoord[1] available for me to use, and if so, do I have to do anything special to 'enable' it? I mean, I can't just use gl_TexCoord[999], can I?
Or to clarify my question: How do I pass my mask's separate texture coordinates from the vertex shader to the fragment shader (keeping in mind that they need to be interpolated for each fragment)?
And how do I get my mask's separate texture coordinates into the vertex shader in the first place? Especially considering that the mask might be only a subrect of a larger texture containing multiple masks. The first texture's tex coords are stored in gl_MultiTexCoord0, right? And they are passed into SFML using the attributes of the vertices - but a secondary mask texture wouldn't have vertices. How can I use SFML (or, failing that, OpenGL) to put my mask texture's texcoords into gl_MultiTexCoord1?
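To show where my head is at, my best guess so far is to give up on gl_MultiTexCoord1 entirely, derive the mask's coordinates from gl_MultiTexCoord0 in the vertex shader, and hand them to the fragment shader through a varying - but I have no idea whether that's valid, or how it interacts with SFML's texture matrix:

// vertex shader (my guess)
uniform mat4 textureOrientation;
uniform mat4 maskOrientation;
varying vec2 maskCoord; // made-up name; hopefully interpolated per fragment

void main()
{
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
    gl_TexCoord[0] = gl_TextureMatrix[0] * textureOrientation * gl_MultiTexCoord0;

    // not sure if gl_TextureMatrix[0] needs to be applied here too,
    // since the mask is a different size than the main texture
    maskCoord = (maskOrientation * gl_MultiTexCoord0).xy;

    gl_FrontColor = gl_Color;
}

// fragment shader (my guess)
uniform sampler2D texture;
uniform sampler2D mask;
varying vec2 maskCoord;

void main()
{
    gl_FragColor = texture2D(texture, gl_TexCoord[0].xy) * gl_Color;
    gl_FragColor.a = texture2D(mask, maskCoord).a;
}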
Please keep your answers simple! I'm new to GLSL and OpenGL.