SFML community forums

Help => Graphics => Topic started by: wademcgillis on August 31, 2009, 04:10:38 am

Title: Converting Image to GLuint (a texture)
Post by: wademcgillis on August 31, 2009, 04:10:38 am
Code:

GLuint ScreenImage()
    {
    sf::Image img;
    img.CopyScreen(App,sf::IntRect(0,0,512,256));
    const sf::Uint8* ptr = img.GetPixelsPtr();
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D,0,GL_RGBA,512,256,0,GL_RGBA,GL_UNSIGNED_INT,ptr);
    glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MIN_FILTER,GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D,GL_TEXTURE_MAG_FILTER,GL_LINEAR);
    glBindTexture(GL_TEXTURE_2D, 0);
    return tex;
    }

What am I doing wrong?
Title: Converting Image to GLuint (a texture)
Post by: Laurent on August 31, 2009, 08:20:19 am
Quote
What am I doing wrong?

GL_UNSIGNED_INT should be GL_UNSIGNED_BYTE, so that you don't depend on the machine's endianness.

The result may also depend on where in your main loop you call this function.

Giving a little more information would be useful (at the very least, what do you get with your current code?).
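
A minimal corrected sketch (assuming the same 512x256 capture and the App window from the code above; GL_UNSIGNED_BYTE matches the sf::Uint8 data that GetPixelsPtr() returns):

Code:
GLuint ScreenImage()
{
    sf::Image img;
    img.CopyScreen(App, sf::IntRect(0, 0, 512, 256)); // grab part of the frame buffer
    const sf::Uint8* ptr = img.GetPixelsPtr();        // RGBA, 8 bits per channel

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // One unsigned byte per channel, so GL_UNSIGNED_BYTE rather than GL_UNSIGNED_INT
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 512, 256, 0, GL_RGBA, GL_UNSIGNED_BYTE, ptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glBindTexture(GL_TEXTURE_2D, 0);
    return tex;
}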
Title: Converting Image to GLuint (a texture)
Post by: wademcgillis on August 31, 2009, 03:47:59 pm
The program crashes.
Title: Converting Image to GLuint (a texture)
Post by: Laurent on August 31, 2009, 03:54:36 pm
What does the debugger say?
Title: Converting Image to GLuint (a texture)
Post by: wademcgillis on August 31, 2009, 07:52:08 pm
GL_UNSIGNED_BYTE solved the problem. It works fine now. I put GL_UNSIGNED_INT because of Uint8*.
Title: Converting Image to GLuint (a texture)
Post by: Nexus on September 01, 2009, 01:11:00 pm
Quote from: "wademcgillis"
It works fine now. I put GL_UNSIGNED_INT because of Uint8*.
And guess how many bits a byte contains... ;)
Title: Converting Image to GLuint (a texture)
Post by: wademcgillis on September 01, 2009, 04:14:17 pm
Byte: 8
Short: 16
Float: 32
Int: 32?
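
Those widths are only guaranteed for fixed-size typedefs; a quick sketch to check them (assuming <SFML/Config.hpp> from SFML 1.x is available):

Code:
#include <SFML/Config.hpp>
#include <climits>
#include <iostream>

int main()
{
    // The fixed-size typedefs spell out their width in the name;
    // plain short/int/float widths depend on the compiler and platform.
    std::cout << "sf::Uint8 : " << sizeof(sf::Uint8)  * CHAR_BIT << " bits\n";
    std::cout << "sf::Uint16: " << sizeof(sf::Uint16) * CHAR_BIT << " bits\n";
    std::cout << "sf::Uint32: " << sizeof(sf::Uint32) * CHAR_BIT << " bits\n";
    std::cout << "float     : " << sizeof(float)      * CHAR_BIT << " bits\n";
    return 0;
}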
Title: Converting Image to GLuint (a texture)
Post by: Laurent on September 01, 2009, 04:22:20 pm
I think the OpenGL type constants do not make any assumption about their size; they just map to the system's native types. So GL_UNSIGNED_INT could be either 32 bits or 64 bits depending on the target machine.
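
The GL names are typedefs in the GL header, so the same kind of check works for them too (a sketch; the header path varies by platform):

Code:
// Note: on Windows include <windows.h> before <GL/gl.h>; on Mac OS X the header is <OpenGL/gl.h>.
#include <GL/gl.h>
#include <climits>
#include <iostream>

int main()
{
    // GL_UNSIGNED_BYTE / GL_UNSIGNED_INT tell glTexImage2D to read the
    // pixel data as GLubyte / GLuint, whatever those map to on this machine.
    std::cout << "GLubyte: " << sizeof(GLubyte) * CHAR_BIT << " bits\n";
    std::cout << "GLuint : " << sizeof(GLuint)  * CHAR_BIT << " bits\n";
    std::cout << "GLfloat: " << sizeof(GLfloat) * CHAR_BIT << " bits\n";
    return 0;
}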