Hey everyone,
I am in the process of creating a text-input field for my game. I am saving the text value in an sf::Unicode::UTF32String and displaying it with an sf::String. I am currently running SFML 1.6 because I like stable releases. On to my problem:
It works fine as long as I type "normal" characters: ABCD123$§®üåäö. But on my Mac keyboard I can also type some of the more esoteric characters, such as "†ƒfi", and those are not displayed. I know this is really, really picky and probably nobody will ever use them, but the bug, as I see it, is that I am using a UTF-32 string, which AFAIK should be able to hold every character known to man, yet it doesn't. Here is the code needed to reproduce the problem:
#include <SFML/Graphics.hpp>
#include <cstdlib> // for EXIT_SUCCESS

int main()
{
    // Create the main window
    sf::RenderWindow App(sf::VideoMode(640, 480), "SFML Graphics");

    // The drawable text and the UTF-32 buffer it is built from
    sf::String sprite("Type... ");
    sf::Unicode::UTF32String str;

    while (App.IsOpened())
    {
        // Process events
        sf::Event Event;
        while (App.GetEvent(Event))
        {
            if (Event.Type == sf::Event::Closed)
                App.Close();

            // Append each entered character to the UTF-32 buffer
            if (Event.Type == sf::Event::TextEntered)
            {
                str += Event.Text.Unicode;
                sprite.SetText(str);
            }
        }

        App.Clear();

        // Draw the text
        App.Draw(sprite);
        App.Display();
    }

    return EXIT_SUCCESS;
}
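One thing I could do to narrow it down is to print the raw code point inside the TextEntered handler, to check whether the character at least reaches the UTF-32 string or gets lost before that. Just a sketch of what I have in mind (using printf, nothing SFML-specific):

// needs #include <cstdio> at the top
if (Event.Type == sf::Event::TextEntered)
{
    // Dump the code point that SFML reports for this keystroke
    std::printf("Entered code point: U+%04X\n",
                static_cast<unsigned int>(Event.Text.Unicode));
    str += Event.Text.Unicode;
    sprite.SetText(str);
}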
Why do you think this happens? Is it a bug in my code or a limitation of SFML/C++? I get the same result if I add setlocale(LC_ALL, "").
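For completeness, the setlocale attempt looked roughly like this (putting it at the very start of main is my own guess at where it belongs):

#include <clocale>

int main()
{
    // Switch to the user's locale before doing anything else
    std::setlocale(LC_ALL, "");

    // ... the rest of the program from the snippet above
}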
And the problem is not really that I think I need those characters; it's just that if I am using a UTF-32 string, shouldn't it be able to handle them?