Hello,
I think I have run into a bug, or at the very least an inconsistency in the behaviour.
The scenario is the following:
- I have a sprite with some kind of drawing on it.
- I want to apply a shader to it, using another texture as a mask.
I was not able to make it work correctly: the mask texture appeared upside down. At first I thought this was because I had forgotten to call .display() on my renderTexture after drawing the mask onto it. That was not the case, so I decided to construct a minimal example, and I found the following behaviour:
When you pass a texture to a shader, the coordinate system differs depending on how the texture was created:
- If the texture was loaded from a file, the top-left corner of the texture is the point (0,0) and the bottom-right corner is the point (1,1).
- If the texture comes from a renderTexture, the bottom-left corner of the texture is the point (0,0) and the top-right corner is the point (1,1).
This means that one of the two coordinate systems is upside down relative to the other.
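For reference, this is the kind of fragment shader involved. This is only a minimal sketch, not the exact code from the attached example; the uniform names `texture` and `mask` are my own, and the y-flip on the last line is the workaround I currently need when `mask` comes from a renderTexture:

```glsl
uniform sampler2D texture; // the sprite's own texture
uniform sampler2D mask;    // the mask, drawn into a render texture

void main()
{
    vec2 coord = gl_TexCoord[0].xy;
    vec4 pixel = texture2D(texture, coord);
    // Workaround: flip the y coordinate because the render texture's
    // coordinate system has (0,0) at the bottom-left instead of the top-left.
    vec4 maskPixel = texture2D(mask, vec2(coord.x, 1.0 - coord.y));
    gl_FragColor = vec4(pixel.rgb, pixel.a * maskPixel.r);
}
```

With a texture loaded from a file, the same shader only works correctly if the flip is removed, which is exactly the inconsistency I am describing.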
Since this example requires several files (the code, the shader and a couple of images), I have packed them into an archive and attached it to the post.
Here is what is displayed in the example:
1. The target sprite
2. The renderTexture mask
3. The renderTexture's texture drawn over the target sprite
4. The texture loaded from a file
5. The texture drawn over the target sprite
Is this behaviour expected? Should one of the coordinate systems be changed so that they are consistent with each other?
I hope the example is clear enough. If you have any questions about it, please ask and I will clarify.
[attachment deleted by admin]