
Author Topic: Font destruction causes OpenGL error when Text is drawn in separate thread


Hapax

If a text is drawn to a window in a separate thread, an OpenGL error is reported when the font is destroyed.
The error is reported from SFML's Texture.cpp (when the sf::Font is destroyed).

Note that, even if the thread is completely finished and destroyed, the font destruction still causes the error.

The error:
(screenshot of the reported OpenGL error; image not included here)
The line referred to in Texture.cpp:
glCheck(glDeleteTextures(1, &texture));
https://github.com/SFML/SFML/blob/2.4.1/src/SFML/Graphics/Texture.cpp#L104

A complete and minimal example that causes the error:
#include <SFML/Graphics.hpp>
#include <cstdlib> // EXIT_SUCCESS
#include <thread>

void threadFunction(const sf::Text& text)
{
    sf::RenderWindow window(sf::VideoMode(800, 600), "");
    window.draw(text);
}

int main()
{
    {
        sf::Font font;
        font.loadFromFile("resources/fonts/arial.ttf");
        sf::Text text("Text", font);
        //text.getLocalBounds(); // uncomment to stop error
        {
            std::thread thread(threadFunction, text); // note: 'text' is copied into the thread
            thread.join();
        } // thread completed and destroyed here
    } // font destroyed here - OpenGL error
    sf::sleep(sf::seconds(1.f));
    return EXIT_SUCCESS;
}

Any ideas as to why this happens? Any solutions?
Should the font be so seemingly linked to the window?

The strange thing - as you may have noticed in the code - is that if you call getLocalBounds() on the text object (which subsequently calls ensureGeometryUpdate), this error no longer occurs. This could be considered a workaround (or "solution"), but I'm still curious as to what causes the problem in the first place.

Note that this exact example was written and tested (and the screenshot taken) using v2.4.1. However, the problem also occurs with v2.4.0 and v2.4.2.

FRex

(This theory is totally incorrect; something else happens, something Windows-specific.)

I'm really not sure of the details and it's too late for me to read the code properly or search around right now, but I think this is about right. ;D

I think it has to do with spooky action at a distance between contexts and threads.
Another 'solution' is just creating an sf::Context in the first line of your main() and letting it stay there, thus creating the context explicitly in that thread. Creating a thread that just runs forever and only creates an sf::Context would actually work too (new a std::thread and give it a function that only creates a local sf::Context on the stack and then does while(1);).
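A minimal sketch of that first variant (only the sf::Context line is the point; fonts, texts and threads stay as in the original example):

#include <SFML/Window.hpp> // sf::Context

int main()
{
    // Explicitly create (and keep alive) a context in the main thread, so the
    // number of live contexts never drops back to zero while the program runs.
    sf::Context context;

    // ... fonts, texts and threads as in the original example ...

    return 0;
}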

Contexts are per thread, and fonts are loaded into textures lazily. If you call getLocalBounds(), it'll end up building the text's vertices and loading the glyphs to get their sizes, kernings and so on. That causes a context to be implicitly created behind the scenes in the main thread.

If you don't do that, then the first call that causes textures to be loaded is the draw call in the thread; the context it happens in is created and destroyed by the sf::RenderWindow (or rather its base sf::Window part) in the thread.

So I'm not sure if this is a thing... the texture (the GL one referred to by that unsigned ID) that the sf::Texture in the font's pages holds gets sort of orphaned?

I think the problem is you reach 0 contexts between creating the texture and destroying it.

Without your fix it goes like this:
1. context in thread is created (1)
2. texture is created
3. context in thread is destroyed (0)
4. texture is destroyed (an implicit context is made, but we reached 0, so all GL state was thrown away and the stored unsigned ID is no longer valid)

With the fix it goes like this:
1. context in main is created (1)
2. texture is created
3. context in thread is created (2)
4. context in thread is destroyed (1)
5. texture is destroyed (the count never reached 0 between its creation and destruction, so the ID is still valid)
6. context in main is destroyed (0)

Similarly with a runaway thread that just creates a context, or with an explicit context created in main. The point is to never get to total GL shutdown (0 contexts) and orphan resource IDs, because then the C++ destructors will try freeing them in a fresh context unrelated to all the previous ones.
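A sketch of the runaway-thread variant, for illustration (note: as it turns out further down this thread, this variant doesn't actually help on Windows):

#include <SFML/System.hpp>
#include <SFML/Window.hpp>
#include <thread>

int main()
{
    // A detached thread that owns an sf::Context and never exits, so the total
    // number of live contexts (in this theory) never reaches zero.
    std::thread keeper([]
    {
        sf::Context context;
        while (true)
            sf::sleep(sf::milliseconds(100)); // idle forever
    });
    keeper.detach();

    // ... rest of the program ...

    return 0;
}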

I recall there was an overhaul of context handling a while back too. Before that, I think there was a global hidden context that kind of leaked, and that prevented problems like these. But I'm not sure about that right now either. Maybe you could find something on the forums.

There is some mention of context and thread management in here: https://www.sfml-dev.org/tutorials/2.4/window-opengl.php

This might help with understanding threading + GL contexts: https://developer.apple.com/library/content/documentation/GraphicsImaging/Conceptual/OpenGL-MacProgGuide/opengl_threading/opengl_threading.html

And I recall that it often gets problematic, especially when stuff starts flying around different threads. I'm not sure how relevant this is in the Win10 era (though I doubt multi-threaded GL, which they labelled legacy years ago, is some sort of holy grail for Microsoft programmers in the face of Vulkan and DX11/12). This is from a five-year-old interview:
Quote
John Carmack - This was explicitly to support dual processor systems. It worked well on my dev system, but it never seemed stable enough in broad use, so we backed off from it. Interestingly, we only just found out last year why it was problematic (the same thing applied to Rage’s r_useSMP option, which we had to disable on the PC) – on windows, OpenGL can only safely draw to a window that was created by the same thread. We created the window on the launch thread, but then did all the rendering on a separate render thread. It would be nice if doing this just failed with a clear error, but instead it works on some systems and randomly fails on others for no apparent reason.

Hapax

I can follow the logic you described; it makes sense.

The confusing part for the SFML user is that the sf::Font can legally be in a no-context state and still be destroyed correctly. Only after it has been used (drawn via a text) does it then complain about not having a context. Do you think it would be possible for sf::Font to drop its requirement for a context (whether that means releasing a stored texture or whatever it is) when it realises it no longer has any contexts?

FRex

Funnily enough, there is no bug on Linux.
The description above is wrong.
It's not about orphaning the GL resource: sf::Texture inherits from GlResource, and that keeps a GlContext around. This is something to do with shared contexts, and it's Windows-specific.

This function gives me the error "Failed to activate OpenGL context: The handle is invalid.":
https://github.com/SFML/SFML/blob/master/src/SFML/Window/Win32/WglContext.cpp#L216

There was a topic mentioning this already:
https://en.sfml-dev.org/forums/index.php?topic=21102.msg150734#msg150734

I'm still looking into it (and I think I have rather fresh drivers), but it's probably up to Laurent or binary1248 now.


I got the latest SFML from GitHub and I'm now working in WglContext. The issue is with a shared context that is made on a spawned thread and destroyed in the main one; it also turns out that a runaway thread doesn't work (I don't remember if I was too tired or if it really did work on the old version of SFML I had).
When wglMakeCurrent fails, I try calling WindowFromDC on m_deviceContext and it returns 0x0 (so no window), despite the DC having been created from m_window.
Calling IsWindow on m_window also returns 0 (false).

The DestroyWindow(m_window) call in WglContext::~WglContext(), which only runs if m_ownsWindow is true, also returns 0 because it fails with "Invalid window handle.". So clearly something happens to the hidden window created by CreateWindowA in WglContext::createSurface.
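For illustration, the checks described above boil down to something like this standalone helper (a sketch: checkHandles is a made-up name, and hdc/hwnd stand in for WglContext's m_deviceContext and m_window members):

#include <windows.h>
#include <cstdio>

// Mirror of the diagnostic calls described above.
void checkHandles(HDC hdc, HWND hwnd)
{
    HWND owner = WindowFromDC(hdc); // returned NULL (no window) in the failing
                                    // case, despite hdc being created from hwnd
    BOOL alive = IsWindow(hwnd);    // also returned 0 (FALSE) in the failing case
    std::printf("WindowFromDC: %p, IsWindow: %d\n", (void*)owner, alive);
}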

Hapax

Ah, so even though the window is created and destroyed in the separate thread, the font (and therefore its texture) keeps a context open? Is this the same context or its own special one? And then, when the font is destroyed in the main thread, does the font/texture's context that was created in the other thread become a "shared context" problem?

I would guess that SFML's Texture inheriting from GlResource might mean there isn't anything that can be done - at least not without redesigning GlResource.

It might just be worth forcing it to create the font's context in the main thread. A bit hacky, but it might be the only way to go.

FRex

I'm not sure what it is. On Linux it works. It's something to do with window management on Windows, as I said above. I can't comment more on it; I'm not experienced with WinAPI. Maybe Laurent could help if he sees this. ;D But either way, it's quite a niche problem.

binary1248 (SFML Team)

This seems quite fishy to me...

I'm currently away on holiday and only have a reliable OS (Linux) on my laptop, so I'll have to take a look at this when I get home in a week.

You didn't mention whether you are running this on an Nvidia, AMD, or even Intel GPU. Nvidia is known to employ questionable "optimizations" that cause weird behaviour which we have had to work around before, so I wouldn't be surprised if this was the case again. Intel used to have pretty broken drivers, but that has improved with their newer families of graphics hardware.

Hapax

I have missed out a few details, actually; sorry about that.

Same problems on two separate devices: a desktop with Windows 7 and a laptop/tablet with Windows 10.
Both compiled with VS2015 using the pre-built downloadable binaries.
Happens in debug mode but shows no error in release mode (as expected, since the error checking isn't active there).
The desktop uses an nVidia card and the portable device uses an (apparently) integrated Intel card.
Consistent across all versions of SFML 2.4 so far (up to patch version 2).

Anything else you want to know, feel free to ask.

FRex

I'm running it on Windows 10 on an MSI laptop, fully up to date, built from the latest master from GitHub, also with VS2015. It happened with the prebuilt binaries too; I can't recall whether the runaway thread helped there or whether I was mistaken. It happens regardless of whether it runs on Intel or Nvidia (I can choose in the right-click menu): I have a two-card setup where I can pick what runs on which, with an integrated Intel GPU in my i7 and a discrete GTX 950M as a separate chip.

dabbertorres

I've noticed this as well, on Windows 7 thru 10, on Intel and Nvidia cards.

It's easy to miss or forget about, since it most often happens (in my experience) when the process is terminating, so it's kind of an "oh well".

FRex

It happens right when the Font is destroyed, not at the end of the process. You can add while(1); before the return and it'll still error out.

dabbertorres

Of course, I was intending to give a possible reason as to why it isn't very noticeable. At least for myself!

binary1248 (SFML Team)

So I had a look at this problem, and as I expected, it was due to the broken nature of WGL on Windows.

For some unexplained reason, if you create a context in a secondary thread, share it with another context and destroy the second context, the first context will never be activate-able again in any thread. Go figure...

You already figured out the workaround: cause the shared context to be created in the primary thread, and this problem doesn't happen. It might not be obvious when the shared context is created, but considering that almost everybody creates their main window in their primary thread, this shouldn't be too big of a showstopper. Creating the window in the main thread is already a limitation present on OS X, as documented in the "Opening and managing a SFML window" tutorial. People who write code that is meant to work on OS X as well will have inadvertently worked around this issue already, so I don't think it warrants its own workaround code on Windows.
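Applied to the original example, that workaround could look like this (a sketch: any GlResource created first in the primary thread, e.g. an sf::Context or the main window itself, causes the shared context to be created there):

#include <SFML/Graphics.hpp>
#include <cstdlib>
#include <thread>

void threadFunction(const sf::Text& text)
{
    sf::RenderWindow window(sf::VideoMode(800, 600), "");
    window.draw(text);
}

int main()
{
    sf::Context context; // created in the primary thread before anything else

    sf::Font font;
    font.loadFromFile("resources/fonts/arial.ttf");
    sf::Text text("Text", font);
    std::thread thread(threadFunction, text);
    thread.join();

    return EXIT_SUCCESS; // font destruction no longer triggers the OpenGL error
}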

Hapax

Thanks for looking into why it occurs. Your description makes a lot of sense.

I found a possibly related occurrence: if you create a window entirely in a separate thread (destroying it in that thread) and wait for the thread to fully complete, a newly created window in the main thread will not show. The workaround there could be to create a window in the main thread first, or even just before the thread finishes. Just to be clear, in this case the context isn't (clearly, at least) shared; the window is fully destroyed in the external thread and the new one is created in the main thread. However, with it being such a bad design to start with, it's not something that - as you said - should be "fixed" for Windows users.
I mention this only as a side note.

Thanks again.

I do have a simple question: is there a simple way to create a context? Does creating a render texture in the main thread just magically clear all these problems?
Note that I don't intend to use the render texture "idea"; I'm simply curious.

binary1248 (SFML Team)

Quote
I do have a simple question: is there a simple way to create a context? Does creating a render texture in the main thread just magically clear all these problems?
Note that I don't intend to use the render texture "idea"; I'm simply curious.
If you just want to have some kind of context for e.g. transferring OpenGL texture data in some thread, there is always sf::Context... sf::Context itself is a GlResource, so it will cause the shared context to be created, just like all other GlResource objects. If you are already creating GlResource objects, then they will cause the shared context to be created. sf::Font is an exception because it can own sf::Textures (which are GlResources), but only once it actually gets around to lazily loading its glyphs, as you found out.
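For example, something like this (a sketch: uploadTexture is just a made-up name, and the worker thread's sf::Context gives it a current OpenGL context of its own):

#include <SFML/Graphics.hpp>
#include <functional>
#include <thread>

// Hypothetical worker that transfers pixel data to a texture from its own thread.
void uploadTexture(sf::Texture& texture, const sf::Image& image)
{
    sf::Context context;   // sf::Context is a GlResource: constructing it makes a
                           // context current here (and ensures the shared context exists)
    texture.update(image); // upload the pixels on this thread
}

int main()
{
    sf::Image image;
    image.create(64, 64, sf::Color::Red);

    sf::Texture texture;
    texture.create(64, 64); // a GlResource created in the main thread

    std::thread worker(uploadTexture, std::ref(texture), std::cref(image));
    worker.join();
    return 0;
}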