SFML community forums

Help => System => Topic started by: mateandmetal on March 11, 2013, 11:50:55 am

Title: String doesn't display accents under Linux
Post by: mateandmetal on March 11, 2013, 11:50:55 am
Hi... I'm trying to display Spanish strings with accents under Linux... with no luck  :(

My GCC version is 4.7.2
Recent SFML 2.0 (compiled by me)
Linux Fedora 18 x64
Codelite IDE

Here is my minimal example:
#include <iostream>
#include <string>
#include <SFML/Graphics.hpp>

using std::cout;
using std::endl;


int main() {


    // Create the main window
    sf::RenderWindow window (sf::VideoMode (800, 600), "String Test");

    // Load font
    sf::Font myFont;
    if (!myFont.loadFromFile("./fuente.ttf")) {
        cout << "error loading font" << endl;
        return -1;
    }


    // Wide string
    const wchar_t *myStringW = L"wide string: áéíóú";
    cout << myStringW << endl;

    sf::Text myTextW (myStringW, myFont, 50);
    myTextW.setPosition(20, 100);

    // ANSI string
    const char *myANSIstring = "ANSI string: áéíóú";
    cout << myANSIstring << endl;

    sf::Text myANSItext (myANSIstring, myFont, 50);
    myANSItext.setPosition(20, 150);


    // Start the game loop
    while (window.isOpen()) {

        // Process events
        sf::Event event;
        while (window.pollEvent(event)) {

            // Close window : exit
            if (event.type == sf::Event::Closed)
                 window.close();
        }

        // Clear screen
        window.clear();

        // Draw
        window.draw(myTextW);
        window.draw(myANSItext);

        // Update the window
        window.display();

    }

    return EXIT_SUCCESS;

}
 

The wide string displays correctly using sf::Text. The ANSI string does not display accents.
Should I always use wide strings under Linux?


Title: Re: String doesn't display accents under Linux
Post by: Laurent on March 11, 2013, 12:12:49 pm
Quote
Should I always use wide strings under Linux?
If you want to avoid problems related to the various encodings involved (source file encoding, compiler encoding, current locale encoding, ...), then yes, definitely.
Title: Re: String doesn't display accents under Linux
Post by: mateandmetal on March 12, 2013, 01:42:43 am
Same behavior under Win7 with MinGW. The wide string works, the ANSI one doesn't  ::)
Thanks Laurent
Title: AW: String doesn't display accents under Linux
Post by: eXpl0it3r on March 12, 2013, 07:09:42 am
What's the encoding of your file, and what language have you set up your OSes with?

Personally, I wouldn't expect the ANSI (couldn't it also just be ASCII?) version to work. ;)
Title: Re: String doesn't display accents under Linux
Post by: Laurent on March 12, 2013, 07:46:43 am
There are some limitations with MinGW: it can only use the current locale. And I think that by default, the current locale is the "C" one (which only knows about ASCII characters).

Try the same, but change the global locale first:

std::locale::global(std::locale(""));
Title: Re: String doesn't display accents under Linux
Post by: mateandmetal on March 14, 2013, 12:20:51 am
My goal is to support:
- Windows and Linux platforms
- English and Spanish languages (maybe more)

Testing the same code on Win7 x64 SP1, Region: Spanish (Argentina), IDE: Code::Blocks 12, file encoding: UTF-8 (if I leave the default Code::Blocks encoding, I can't compile the code).
Inserting the code posted by Laurent at the beginning of the main function:

(http://s24.postimage.org/eb2blzxet/win7_utf8_es_AR_locale.png)


Something interesting I found:
std::wstring VS std::string (http://stackoverflow.com/questions/402283/stdwstring-vs-stdstring)

Quote
1. When I should use std::wstring over std::string?

On Linux? Almost never
:o

Quote
On Windows? Almost always
:o

I know there's a lot of info about encodings; I try to understand it, but it's very confusing  ???
Should I use wide strings and wide file streams to read them from files?
What kind of encoding should I use for my source code files?
Title: Re: AW: String doesn't display accents under Linux
Post by: mateandmetal on March 14, 2013, 12:36:33 am
Quote
Personally I wouldn't expect the ANSI (couldn't it be also just ASCII?) version to work. ;)

Well, using this code (same Win7 OS):
    // Create the main window
    sf::RenderWindow window (sf::VideoMode (800, 600), "String Test");

    // Load font
    sf::Font myFont;
    if (!myFont.loadFromFile("./steelfish_rg.ttf")) {
        cout << "error loading font" << endl;
        return -1;
    }

    // ANSI
    const char *myANSIstring = "Hola áéíóú";
    sf::Text myANSItext (myANSIstring, myFont, 50);

    // Start the game loop
    while (window.isOpen()) {

        // Process events
        sf::Event event;
        while (window.pollEvent(event)) {

            // Close window : exit
            if (event.type == sf::Event::Closed)
                 window.close();
        }

        // Clear screen
        window.clear();

        // Draw
        window.draw(myANSItext);

        // Update the window
        window.display();

    }

    return EXIT_SUCCESS;
 

with Code::Blocks' default system-native encoding:
(http://s1.postimage.org/6lujupsin/ansi.png)

works fine  :o
Title: Re: String doesn't display accents under Linux
Post by: mateandmetal on March 17, 2013, 03:41:22 pm
Quote
Portability, cross-platform interoperability and simplicity are more important than interoperability with existing platform APIs. So, the best approach is to use UTF-8 narrow strings everywhere and convert them back and forth on Windows before calling APIs that accept strings.

UTF-8 Everywhere (http://www.utf8everywhere.org/)

I'm using UTF-8 encoding under Linux, however the SFML text doesn't work as expected:

#include <iostream>
#include <string>
#include <locale>

#include <SFML/Graphics.hpp>

using std::cout;
using std::endl;

int main (int argc, char **argv)
{
        std::locale myLocale ("spanish"); // spanish
        std::locale::global (myLocale); // needed for console output only???
        cout << "Locale name = " << myLocale.name() << endl;

        const char *myANSI = "Hola áéíóú";
        cout << "C style ANSI string = " << myANSI << endl;    
       
        sf::String mySFML_String (myANSI, myLocale);
        std::string standardString = mySFML_String.toAnsiString();
       
        cout << "From SFML to std::string = " << standardString.c_str() << endl;
       
        // Graphics Mode xD
        sf::RenderWindow window (sf::VideoMode(600, 400), "Hello Strings!");
       
        sf::Font myFont;
        if (!myFont.loadFromFile("./DYST.ttf")) {
                cout << "Error loading font file" << endl;
                return -1;
        }
       
        sf::Text myText (mySFML_String, myFont, 50);
       
        // Loop
        while (window.isOpen()) {
               
                sf::Event myEvents;
                while (window.pollEvent(myEvents)) {
                        if (myEvents.type == sf::Event::Closed) {
                                window.close();
                        }
                } // pollEvent
               
                window.clear();
                window.draw(myText);
                window.display();
               
        } // isOpen
       
       
        return 0;
}
 


Screenshot:
(http://s17.postimage.org/cfudnf633/accents_not_working_linux.png)
(Font file supports accents)

I don't know what else I should do...  :-\
Please help
Title: Re: String doesn't display accents under Linux
Post by: Laurent on March 17, 2013, 04:06:03 pm
Since your environment is configured to use UTF-8, your literal string is UTF-8, and you must therefore use the corresponding function to convert it to an sf::String (see the sf::Utf8 class); using the "spanish" locale is wrong.
Title: Re: String doesn't display accents under Linux
Post by: mateandmetal on March 18, 2013, 10:40:10 am
Ok, let me see if I understand

- Should I call Utf8::decode on each character of the ANSI string?
- Does "iterator pointing to the beginning of the input sequence" mean something like std::string::begin?
- Is this a heavy process?
- Do I need to get/change the C/C++ locale?
Title: Re: String doesn't display accents under Linux
Post by: Laurent on March 18, 2013, 11:04:55 am
std::string utf8str = ...;
std::basic_string<sf::Uint32> utf32str;
sf::Utf8::toUtf32(utf8str.begin(), utf8str.end(), std::back_inserter(utf32str));
sf::String sfstr = utf32str;

Note that this code is not optimized: it doesn't preallocate the memory, and it uses an extra UTF-32 string rather than filling the sf::String directly. I'd have to tweak sf::String to allow more optimized code.

And no, it's not heavy.

But if you're only using literal strings, why do you make it so complicated? Just use wide strings.
Title: Re: String doesn't display accents under Linux
Post by: mateandmetal on March 19, 2013, 02:24:39 am
Thanks for the example code, I will test it.

Quote
But if you're only using literal strings, why do you make it so complicated? Just use wide strings.

Because I thought it wasn't necessary (at least under Linux).
Ok, thanks again Laurent!
Title: Re: String doesn't display accents under Linux
Post by: Laurent on March 19, 2013, 06:35:50 am
Nothing is mandatory, but using Unicode directly rather than dealing with locales is always much easier.