
Author Topic: TextEvent.Unicode not using Encoding  (Read 4493 times)


Tank

  • SFML Team
  • Hero Member
  • *****
  • Posts: 1486
    • View Profile
    • Blog
    • Email
TextEvent.Unicode not using Encoding
« on: February 14, 2009, 03:00:52 am »
Hey,

I didn't really know where to put this thread, but since it's event-related, I'll give it a go here.

The problem I have is that the Unicode member of TextEvent is not encoded; it just holds the pure Unicode value of the entered character. This seems rather useless for use with SFML, since SFML uses UTF-32 everywhere. I couldn't find a helper method to encode a Unicode value to UTF-32, which makes it impossible to get the real character using only SFML.

I checked the sources and found out that you're doing a UTF-8 to UTF-32 conversion, so I thought this should work in general, but it doesn't. For example, if I hit 'ß' on my keyboard, TextEvent.Unicode is 223 (0x00DF), which is the Unicode representation of 'ß', thus not encoded.

Any hints on this?

Laurent

  • Administrator
  • Hero Member
  • *****
  • Posts: 32498
    • View Profile
    • SFML's website
    • Email
TextEvent.Unicode not using Encoding
« Reply #1 on: February 14, 2009, 10:52:26 am »
UTF-32 is the encoding which allows every single Unicode code point to be represented directly. So... it's just ok.

Why did you think it was wrong? And what did you expect?
Laurent Gomila - SFML developer

Tank

TextEvent.Unicode not using Encoding
« Reply #2 on: February 14, 2009, 02:24:51 pm »
You're right. The whole Unicode theory is a bit confusing. ;)

I guess the problem could be that either my code is just wrong, or my terminal is not displaying the proper character because it uses the UTF-8 encoding.

Btw, how about implementing an Append() function for sf::String (or at least sf::Unicode::Text) to append single UTF-32 characters? This would make my workaround (declaring a char[5], copying the Uint32 into it, and setting [4] to \0) unnecessary and SFML a bit more comfortable. Otherwise the TextEntered event is a bit inconvenient to handle.

Laurent

TextEvent.Unicode not using Encoding
« Reply #3 on: February 14, 2009, 02:27:30 pm »
Why don't you use a sf::Unicode::UTF32String? It's the same as std::string, except that it stores Uint32 characters. Then you can give it directly to your sf::String.
Laurent Gomila - SFML developer

Tank

TextEvent.Unicode not using Encoding
« Reply #4 on: February 14, 2009, 07:10:38 pm »
Okay, I spent about 2 hours figuring out how the Unicode implementation works in SFML...

Take a look at these lines of code:
Code: [Select]
void DebugState::HandleConsoleCommand( const sf::Unicode::UTF32String &cmd ) {
    if( cmd == static_cast<sf::Unicode::UTF32String>( sf::Unicode::Text( L"quit" ) ) ) {
        EmitSignal( L"QuitGame" );
    }
}


I think the long line should be self-explanatory. Handling Unicode strings is really a pain. When you take into account that I just wanted to append a UTF-32 character(!) from TextEvent.Unicode to a UTF-32 string, you see that it ended in a horrible nightmare. ;)

My opinion is that SFML should provide a general class that handles all that string stuff, or at least implement a utility function to easily create a UTF32String from a const char* or const wchar_t*.

Or does anybody have tips on how to append one character of type Uint32 to any of the SFML Unicode strings?