
Author Topic: std::bad_alloc is thrown by sf::Font::findGlyphRect in specific circumstances  (Read 2901 times)


typeman

Trying to use sf::Text to render a specific character with a specific font, character size, and outline thickness causes the program to allocate hundreds of MBs of memory and finally throw an std::bad_alloc exception. The font I'm using is GNU Unifont, available at http://unifoundry.com/unifont.html (version 10.0.04), with character size 16 and an outline thickness of 2.0. The character that causes the crash is uppercase 'S'; lowercase 's' renders normally. The font itself does not appear to be the problem: the Windows font preview handles it properly, and BMFont is able to render it as well.

The weird thing is that changing either the size or the outline thickness causes the program to allocate normally (at least for 'S'; different settings may cause other glyphs to have issues, I'm not sure).

First, my setup:
OS: Windows 10
GPU: nVIDIA GTX 960
Compiler: Visual C++ 2017
SFML: 2.4.2 unmodified, dynamically linked, self-compiled

Minimal example program below:

#pragma comment(lib, "sfml-system-d.lib")
#pragma comment(lib, "sfml-audio-d.lib")
#pragma comment(lib, "sfml-window-d.lib")
#pragma comment(lib, "sfml-graphics-d.lib")

#include <SFML/Window.hpp>
#include <SFML/Graphics.hpp>

int main()
{
        sf::RenderWindow wnd;
        wnd.create(sf::VideoMode(800, 600), "Test Window");

        sf::Font unifont;
        if (!unifont.loadFromFile("unifont-10.0.04.ttf"))
                return 1; // bail out if the font fails to load

        sf::Text text;
        text.setFont(unifont);
        text.setString("S");
        text.setOutlineThickness(2.0f);
        text.setCharacterSize(16);
        text.setPosition(2.0f, 2.0f);

        while (wnd.isOpen())
        {
                sf::Event event;
                while (wnd.pollEvent(event))
                        if (event.type == sf::Event::Closed)
                                wnd.close();

                wnd.clear(sf::Color(128, 128, 128));
                wnd.draw(text); // This line throws, when sf::Text::draw tries to get the glyphs from sf::Font
                wnd.display();
        }

        return 0;
}

Doing some poking around, it looks like the line causing the invalid allocation is:

newImage.create(textureWidth * 2, textureHeight * 2, Color(255, 255, 255, 0));

in SFML's Font.cpp, in sf::Font::findGlyphRect. It looks to me like findGlyphRect is being passed invalid width and height parameters (in my case they were over 0x7000000), so the while loop that doubles the font atlas texture's size never satisfies its exit condition and keeps doubling, allocating larger and larger pixel buffers until the allocator fails.

Tracing back further, in Font.cpp's sf::Font::loadGlyph:

// Convert the glyph to a bitmap (i.e. rasterize it)
FT_Glyph_To_Bitmap(&glyphDesc, FT_RENDER_MODE_NORMAL, 0, 1);
FT_Bitmap& bitmap = reinterpret_cast<FT_BitmapGlyph>(glyphDesc)->bitmap;
// ==bitmap now contains invalid values==

[...]

int width  = bitmap.width;
int height = bitmap.rows;
// ==width and height now contain invalid values==

I'm not familiar enough with FreeType to discern whether this is an SFML bug or a FreeType bug (or indeed user error), so I'm hesitant to submit a GitHub issue.

Let me know if I've left out anything that would be helpful.
« Last Edit: July 10, 2017, 03:59:54 am by typeman »
