SFML community forums
Help => Graphics => Topic started by: wademcgillis on August 31, 2009, 04:10:38 am
-
GLuint ScreenImage()
{
    sf::Image img;
    img.CopyScreen(App, sf::IntRect(0, 0, 512, 256));
    const sf::Uint8* ptr = img.GetPixelsPtr();
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 256, 0, GL_RGBA, GL_UNSIGNED_INT, ptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glBindTexture(GL_TEXTURE_2D, 0);
    return tex;
}
what am I doing wrong?
-
what am I doing wrong?
GL_UNSIGNED_INT should be GL_UNSIGNED_BYTE, so that you don't depend on the machine's endianness.
The result may also depend on where in your main loop you call this function.
Giving a little more information could be useful (at least, what do you get with your current code?).
-
The program crashes.
-
What does the debugger say?
-
GL_UNSIGNED_BYTE solved the problem. It works fine now. I put GL_UNSIGNED_INT because of Uint8*.
-
It works fine now. I put GL_UNSIGNED_INT because of Uint8*.
And guess how many bits a byte contains... ;)
-
Byte: 8
Short: 16
Float: 32
Int: 32?
-
I think the OpenGL type constants do not make any assumption about size; they just map to the system's native types. So GL_UNSIGNED_INT could be either 32 bits or 64 bits depending on the target machine.