See below for a very simplified mockup of my code.
The issue is when I run the code with a computer that only has OpenGL 1.1.
Running the app via Remote Desktop exhibits this behavior because RDP uses an OpenGL 1.1 library.
As you can see in the attached screenshot "screenshot-GOOD.png", the blue rect displays correctly in the upper left corner of the window when run on session 0 with OpenGL > 1.1.
The next screenshot ("screenshot-RDP.png") is the result of me running the code in a remote desktop session. The blue rect is positioned incorrectly, centered vertically in the window.
It's as if the offscreen rect coordinate system is different when OpenGL 1.1 is used.
Maybe there is an obvious answer to this problem, but damned if I can pinpoint what the heck is going on.
Anyone have any ideas?
I don't really care about RDP; the real hang-up is my co-developer's laptop, which only supports OpenGL 1.1 and exhibits the same behavior. The game is just a text game, so we don't need any kind of OpenGL-accelerated graphics. However, we have several TextArea sub-windows (the blue rect), and their positions are all skewed on his laptop, which messes up the proper display of our text and makes development very difficult for him, to say the least :-).
One more detail: when running under RDP in debug mode, I get a bunch of errors/warnings in the console window:
An internal OpenGL call failed in Texture.cpp (146) : GL_INVALID_ENUM, an unacceptable value has been specified for an enumerated argument
An internal OpenGL call failed in Texture.cpp (147) : GL_INVALID_ENUM, an unacceptable value has been specified for an enumerated argument
Another forum post states this is because GL_CLAMP_TO_EDGE is not supported in OpenGL 1.1. It seems to be just a warning, though, and I'm not sure whether it's directly related to the offset-rect issue I'm experiencing.
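For what it's worth, if the warning itself needed silencing, I'd guess the usual guard in raw OpenGL looks something like the sketch below. This is my own sketch, not SFML's internal code; GL_CLAMP_TO_EDGE only exists from OpenGL 1.2 on, so plain 1.1 headers may not even define the constant, and it assumes a texture is already bound to GL_TEXTURE_2D.

#include "SFML/OpenGL.hpp"  // pulls in the platform's gl.h

#ifndef GL_CLAMP_TO_EDGE
#define GL_CLAMP_TO_EDGE 0x812F  // missing from plain OpenGL 1.1 headers
#endif

// Pick a wrap mode the current context actually supports.
void setClampWrapMode()
{
    const GLubyte* version = glGetString(GL_VERSION); // e.g. "1.1.0"
    bool hasClampToEdge = version && !(version[0] == '1' && version[2] < '2');

    GLint wrap = hasClampToEdge ? GL_CLAMP_TO_EDGE : GL_CLAMP;
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, wrap);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, wrap);
}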
Thanks ahead of time for any advice!
#include "SFML/Graphics.hpp"
#include "SFML/Window.hpp"
#include <iostream>
int main(int argc, char** argv)
{
sf::RenderWindow win;
win.create(sf::VideoMode(1000, 500), "My Window", sf::Style::Close);
sf::ContextSettings settings = win.getSettings();
std::cout << "OpenGL Version: " << settings.majorVersion << "." << settings.minorVersion << std::endl;
sf::Font m_font;
if (!m_font.loadFromFile("LiberationMono-Regular.ttf"))
{
return 1;
}
sf::Text m_text;
m_text.setString("Some String");
m_text.setFont(m_font);
m_text.setCharacterSize(15);
sf::IntRect m_rect(0, 0, 800, 350);
sf::RenderTexture m_offscreenRect;
m_offscreenRect.create(m_rect.width, m_rect.height);
m_offscreenRect.clear(sf::Color::Blue);
m_offscreenRect.draw(m_text);
m_offscreenRect.display();
sf::Sprite m_sprite;
m_sprite.setTexture(m_offscreenRect.getTexture());
// Main event loop
while (win.isOpen())
{
// Process events
sf::Event event;
while (win.pollEvent(event))
{
switch (event.type)
{
case sf::Event::Closed:
// Window was closed, exit game
win.close();
break;
}
}
win.clear();
win.draw(m_sprite);
win.display();
}
return 0;
}
Hot damn, I figured it out!
I actually think this is a bug in the SFML source, but please verify!
SFML source, v2.1, Texture.cpp, line 473:
matrix[13] = static_cast<float>(texture->m_size.y / texture->m_actualSize.y);
The above line does integer division on the ratio of size to actualSize. The result is cast to a float, but since both operands are integers the division truncates first, so it is guaranteed never to produce anything but a whole number cast to a float.
When I use OpenGL 1.1, the actualSize is always rounded up to a power of two, which usually ends up being larger than the size (presumably because OpenGL 1.1 has no support for non-power-of-two textures). For example, if I use a size of 30 x 30, the actualSize is 32 x 32; similarly, 200 x 200 gets an actualSize of 256 x 256. I changed the above line to:
matrix[13] = static_cast<float>(texture->m_size.y) / static_cast<float>(texture->m_actualSize.y);
and everything displays flawlessly!
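To make the truncation concrete, here's a tiny standalone check using my own numbers from the mockup above (an 800 x 350 RenderTexture gets padded to 1024 x 512 under OpenGL 1.1):

#include <iostream>

int main()
{
    unsigned int size = 350;        // requested texture height
    unsigned int actualSize = 512;  // rounded up to a power of two under OpenGL 1.1

    // Original SFML 2.1 line: the integer division happens first, then the cast.
    float truncated = static_cast<float>(size / actualSize);                    // 0.0f

    // Fixed line: both operands are converted before dividing.
    float correct = static_cast<float>(size) / static_cast<float>(actualSize);  // ~0.6836f

    std::cout << truncated << " vs " << correct << std::endl;
    return 0;
}

With the original line, matrix[13] ends up as 0 instead of roughly 0.68, which would explain the vertical displacement in the RDP screenshot.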
I'm all set on my end -- I've recompiled the SFML code with the above change. If this turns out to be a bug, let me know if you'd like me to officially file one.
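One more thought for anyone who can't rebuild SFML: a workaround that should sidestep the bug (my own suggestion, not something from the SFML docs) is to request RenderTexture dimensions that are already powers of two, so size and actualSize coincide and the ratio is exactly 1, then show only the region you actually draw into:

#include "SFML/Graphics.hpp"

// Round up to the next power of two so that, under OpenGL 1.1, the texture
// SFML allocates internally has exactly the size we requested.
unsigned int nextPowerOfTwo(unsigned int n)
{
    unsigned int p = 1;
    while (p < n)
        p <<= 1;
    return p;
}

int main()
{
    sf::RenderTexture offscreen;
    if (!offscreen.create(nextPowerOfTwo(800), nextPowerOfTwo(350))) // 1024 x 512
        return 1;

    offscreen.clear(sf::Color::Blue);
    offscreen.display();

    // Only show the 800 x 350 region we care about; the padding stays hidden.
    sf::Sprite sprite(offscreen.getTexture());
    sprite.setTextureRect(sf::IntRect(0, 0, 800, 350));
    return 0;
}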
Cheers,
Scott