SFML community forums

General => General discussions => Topic started by: Tank on March 13, 2012, 10:01:17 pm

Title: sf::Texture crashes when used in a static object
Post by: Tank on March 13, 2012, 10:01:17 pm
A lot of SFGUI users experience crashes when their programs exit. This happens because sf::Texture is destroyed after sf::Context has already been destroyed: the texture's cleanup tries to acquire a lock on a mutex that is static and has been destroyed, too.

binary1248 put together a quick example (and also discovered the cause). You HAVE TO compile it with MSVC, because the crash doesn't happen with GCC (surprise, surprise ;)).

Here's binary1248's original post; I'm just appending it here (source: http://sfgui.sfml-dev.de/forum/post316.html#p316):

binary1248 from here on

Code: [Select]
#include <SFML/Graphics.hpp>
#include <memory> // std::tr1::shared_ptr (VS2008 SP1)

class Singleton {
public:
  static void instanciate() {
    instance.reset( new Singleton );
  }
private:
  // Static instance: destroyed during static deinitialization, after main().
  static std::tr1::shared_ptr<Singleton> instance;

  // Destroyed together with the instance, at a point where the GL
  // context machinery may already be gone.
  sf::Texture texture;
};

std::tr1::shared_ptr<Singleton> Singleton::instance;

int main() {
  Singleton::instanciate();

  return 0;
}


Compiles, and crashes on exit, on Windows 7 with VS2008.

To quote a famous TV personality I find entertaining: Well THERE'S your problem.

Explanation:

In Visual C++, static variables are destroyed in a specific order, which obviously differs from GCC's. When destroying the sf::Texture, SFML tries to make sure it has a valid context to destroy it with. This in turn creates a new context, because the previous one was already destroyed. Creating a context requires locking a mutex (i.e. entering a critical section on Windows) that is declared static inside createContext. A critical section must be initialized before use and destroyed once you're done with it. Because the Mutex is static, the critical section it contains was already destroyed at the end of the application, and MSVC doesn't realize it is accessing a static variable it has already destroyed, so it doesn't initialize a new one. EnterCriticalSection then crashes because it is passed an invalid CRITICAL_SECTION object.
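
Boiled down to the essentials, the failure mode looks like this (a minimal sketch with made-up names; this is not SFML's actual code):

Code: [Select]
#include <cstdio>

struct Mutex {
    ~Mutex() { std::puts("mutex destroyed"); } // runs during static deinitialization
    void lock() { /* EnterCriticalSection on a dead CRITICAL_SECTION -> crash */ }
};

void createContext() {
    static Mutex mutex; // constructed on first call, destroyed at exit
    mutex.lock();       // in reverse order of construction
}

struct Texture {
    ~Texture() { createContext(); } // a GL resource needs a context to die cleanly
};

Texture globalTexture; // constructed before main(), i.e. before the mutex

int main() {
    createContext(); // first call constructs the function-local mutex
    return 0;
}
// At exit: the mutex (constructed last) is destroyed first, then
// ~globalTexture runs and calls createContext() on the dead mutex. Boom.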

There you go. Another case of MSVC and SFML playing nice together.

I'm too lazy to report this somewhere, so if anybody could mention this or link Laurent to this post, I would be very, very grateful.
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 13, 2012, 10:52:37 pm
I'd say that doing things outside main(), especially things that involve resources & other APIs, is very dangerous and should be avoided.
Don't replicate the same mistakes as SFML, write better code :P

Seriously, I'd be glad to remove all the global stuff that I use, but unfortunately I haven't found a clean replacement yet.

Quote
Another case of MSVC and SFML playing nice together

This behaviour conforms to the standard, as far as I know. The order of destruction of global objects is undefined across translation units, and using a destroyed object is undefined too.
Title: sf::Texture crashes when used in a static object
Post by: binary1248 on March 14, 2012, 12:41:10 am
And what makes it worse is the fact that objects from different translation units, and possibly different executable modules, rely on each other's existence to function properly.

I normally avoid using static variables because they are so... unpredictable... but because I needed a singleton object there was no other way. That being said, I guess singletons aren't compatible with the way SFML handles its GL contexts?

If you ask me, the true problem here is tracking the dependencies of static objects among each other. Because their destruction order is undefined they have to be locked from spontaneous destruction much like how the context is now (waiting for the last resource to be destroyed).

One thing has puzzled me since I figured this out: why does the context get destroyed before the sf::Texture does? The whole problem is that a new context has to be created after the application has exited, just so that the sf::Texture, which was created with the previous context, can be destroyed. It seems a bit strange.

Perhaps it is because faulty mutex locking is also causing the reference counting in GLResource to break? Either way I think the sf::Mutex class has to be made more robust to support static instances better.

EDIT:

After a bit more research I found these useful links:
http://blogs.msdn.com/b/oldnewthing/archive/2004/03/08/85901.aspx
http://www.parashift.com/c++-faq-lite/ctors.html Articles 10.14 to 10.18 are of interest.

Now hopefully anybody reading this realizes that playing with static variables is downright deadly. That said, having a static Mutex in WglContext.cpp is like asking for a fully loaded revolver in Russian roulette.
Title: sf::Texture crashes when used in a static object
Post by: Lee R on March 14, 2012, 01:19:42 am
Quote from: "binary1248"
[...]but because I needed a singleton object there was no other way.[...]


There is always another way ;) Use the service locator pattern instead.
Title: sf::Texture crashes when used in a static object
Post by: binary1248 on March 14, 2012, 01:28:24 am
Quote from: "Lee R"
Quote from: "binary1248"
[...]but because I needed a singleton object there was no other way.[...]


There is always another way ;) Use the service locator pattern instead.


I'm always up for something new. The question is: will Tank give the green light? :P

Even more research:

According to the reply in this http://stackoverflow.com/questions/3143180/how-to-do-static-de-initialization-if-the-destructor-has-side-effects-and-the-ob, compilers destroy static variables in the reverse order of their initialization. Logic tells me that because the Mutex in WglContext.cpp is only constructed after the WglContext itself, that static Mutex is also destroyed before the WglContext is. Upon creation of another WglContext it isn't reconstructed, because it was already created at the start of the application, and is therefore just assumed to still be valid?

There must be a subtle difference between how GCC and MSVC handle static variables, and there are two possible scenarios: either both don't conform to the standard, or one does and the other doesn't. Either way, I think GCC is being smarter about static vars than MSVC.
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 08:14:10 am
Quote
I normally avoid using static variables because they are so... unpredictable... but because I needed a singleton object there was no other way

Can't you call a cleanup function at the end of main()? You're not forced to let the instance be destroyed automatically after main(); you can kill it manually and force predictable and correct behaviour.
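
For example (Singleton::destroy() is a hypothetical addition to the example above that just resets the static shared_ptr):

Code: [Select]
int main() {
    Singleton::instanciate();

    // ... run the application ...

    Singleton::destroy(); // hypothetical: instance.reset(), so the sf::Texture
                          // dies while the GL context machinery is still alive
    return 0;
}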

Quote
One thing has puzzled me since I figured this out: Why does the context even get destroyed before the sf::Texture does?

I have to look at my code again, but I don't think it's the expected behaviour. But it would be the same even if the context was still there: it would have to be destroyed after that, and destroying a context involves the same stuff (mutex) as creating one.

Quote
That said, having a static Mutex in WglContext.cpp is like asking for a fully loaded revolver in Russian roulette.

Only if you do dangerous things in your own code ;)
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 14, 2012, 08:39:26 am
Quote
I'm always up for something new. The question is: will Tank give the green light?

Fixes get instant green light. ;)

Quote
fully loaded revolver in Russian roulette.

ymmd

To me it seems both sf::Context and sfg::Renderer are show-stoppers here. I know that the automatic construction of both is to ease the usage of both libraries, but if that leads to problems, I think it's really not worth it.

SFML uses a specific class (sf::Window and derivatives) that everyone needs in order to render stuff. Therefore managing the context yourself would be quite easy, I guess:

Code: [Select]
int main() {
  sf::Context context;
  sf::RenderWindow window( ..., context ); // hypothetical ctor taking an explicit context
}


That way it's guaranteed that the context dies after everything else, at the cost of just one more line of code. If you really want the dangerous behaviour, you could provide another ctor that doesn't take a context reference and constructs the static context (just like it's done now), but if you ask me, it's better that people write that extra line than to chase maximum convenience.

Of course the example implies that it's even possible to work with an sf::Context reference, I haven't checked SFML's sources.

As for SFGUI I don't have an idea yet; the simplest would be init() and cleanup() functions, but that kind of thing is usually unnecessary in C++ and is also annoying.
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 09:01:05 am
Quote
SFML uses a specific class (sf::Window and derivatives) that everyone needs in order to render stuff. Therefore managing the context yourself would be quite easy, I guess

That wouldn't make things better for people who destroy sf::Textures after the main() ;)
And for people who don't, everything already works fine with the current code.
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 14, 2012, 11:01:48 am
Doh (http://www.youtube.com/watch?v=khSIYmTzt6U), that's true. ;)
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 14, 2012, 01:20:30 pm
Alright, did some brainstorming with binary1248 and came up with one possible solution.

Diagram first:
(http://www.abload.de/img/proxy5ea74.png)

Let's take sf::Texture as an example:
An sf::Texture object is created and you call loadFromImage() on it. sf::Texture then calls sf::Context::create_resource() to create the appropriate texture resource and gets back a proxy object (or even better: turn sf::Context into a factory that creates objects like sf::Texture).

The resource proxy holds a weak pointer to the real resource, which is owned by sf::Context. That means that when the Context gets destroyed, all resources are destroyed too, and the proxies' references are reset to nullptr (sf::Texture has to check for this case, of course).

If sf::Texture gets destroyed before sf::Context, then ResourceProxy's destructor will unregister the resource at sf::Context.

This can be done simply with shared_ptr/weak_ptr, or with a custom implementation of reference counting, and especially weak references, to detect when a resource has become invalid.
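
A rough sketch of the idea with std::tr1 smart pointers (all class names here are made up; none of this is actual SFML code):

Code: [Select]
#include <memory> // std::tr1::shared_ptr / weak_ptr on MSVC; GCC needs <tr1/memory>
#include <vector>

struct TextureResource { unsigned int glHandle; };

class Context {
public:
    std::tr1::weak_ptr<TextureResource> createResource() {
        std::tr1::shared_ptr<TextureResource> resource( new TextureResource );
        m_resources.push_back( resource ); // the Context owns the real resource
        return resource;                   // callers only get a weak reference
    }
private:
    std::vector< std::tr1::shared_ptr<TextureResource> > m_resources;
}; // when the Context dies, every TextureResource dies with it

class ResourceProxy {
public:
    explicit ResourceProxy( const std::tr1::weak_ptr<TextureResource>& resource ) :
        m_resource( resource ) {}

    void use() {
        std::tr1::shared_ptr<TextureResource> resource = m_resource.lock();
        if( resource ) {
            // still alive: safe to touch resource->glHandle
        }
        // else: the Context is already gone, skip the GL cleanup entirely
    }

private:
    std::tr1::weak_ptr<TextureResource> m_resource;
};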

Another option (which should be considered independently of the crash bug, too!) is removing the shared context stuff, which would also remove the need to wait for static resources to be destroyed. Instead of a shared context, sf::Window should completely own and manage one.

I don't know all the benefits, but even for multiple windows I don't think it's worth all the trouble.
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 01:58:24 pm
All I can say is that it is a very complicated issue. Probably the most complicated one in SFML. There are many things to consider, and solutions that might appear simple are generally not covering all the use cases.

To tell you what's good and what's wrong with your solution, I'd have to dive in my code again, and I'm afraid I won't have the time to do that right now. What I suggest is that you create an issue on the tracker, that we continue to put as many details and ideas there, and I'll get back to it seriously after SFML 2.0 is released.
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 14, 2012, 02:23:41 pm
The issue has existed since sf::Font::GetDefaultFont() was introduced. Imho it's not a good idea to release a 2.0 knowing something like that exists.

Implementing resource control is surely not trivial and takes some time, but dropping shared contexts could be done quite easily. Yes, it would remove the possibility of creating multiple windows, but seriously, who *really* needs that? Also, it would no longer be possible to use the context in separate threads, but that's something everybody should avoid *always* (it isn't compatible across all platforms and often, if not always, leads to strange behaviour/crashes).

So until you've got more time on your hands for SFML, can't you just disable shared contexts for SFML 2.0? It would at least solve a lot of problems in a lot of places (SFGUI & FlexWorld, to name just two bigger projects that ran into this issue quite early).

And while we're at it: Didn't you want to think about sf::Font::getDefaultFont() returning a copy of a new sf::Font instead of a static version?
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 02:32:58 pm
How do you load textures in a separate thread if I remove shared contexts?

Quote
Imho it's not a good idea to release a 2.0 knowing something like that exists.

Hopefully this can be fixed without changing the public API, so it doesn't block me from releasing 2.0.
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 02:38:44 pm
And multiple windows is an important feature that many people use, I don't want to remove it.
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 14, 2012, 02:44:07 pm
Quote
How do you load textures in a separate thread if I remove shared contexts?

You don't. Believe me I ran into so many problems regarding exactly that, that I decided to drop the idea of using a context in another thread than the "primary" thread.

What you can do is load the image data in a separate thread. But texture creation should still be done in one single thread.

Also think about how the interaction with the driver/GPU works: there's one bus, and it isn't parallel in any way. That means you gain no benefit at all from preparing your textures in a separate thread.
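
To illustrate the split, here's a minimal sketch against the SFML 2 API (the flag/mutex plumbing is just one way to do it):

Code: [Select]
#include <SFML/Graphics.hpp>
#include <SFML/System.hpp>

sf::Image image;
bool      imageReady = false;
sf::Mutex imageMutex;

void loadImage() { // worker thread: disk I/O and decompression only, no GL calls
    sf::Image temp;
    temp.loadFromFile( "texture.png" );

    sf::Lock lock( imageMutex );
    image      = temp;
    imageReady = true;
}

int main() {
    sf::RenderWindow window( sf::VideoMode( 640, 480 ), "Loading" );

    sf::Thread thread( &loadImage );
    thread.launch();

    sf::Texture texture;
    bool textureReady = false;

    while( window.isOpen() ) {
        sf::Event event;
        while( window.pollEvent( event ) )
            if( event.type == sf::Event::Closed )
                window.close();

        {
            sf::Lock lock( imageMutex );
            if( imageReady && !textureReady ) {
                texture.loadFromImage( image ); // GL upload in the main thread only
                textureReady = true;
            }
        }

        window.clear();
        window.display();
    }

    return 0;
}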

Quote
And multiple windows is an important feature that many people use, I don't want to remove it.

Out of curiosity: Who? I really can't think of a single use-case where multiple windows couldn't be emulated by OpenGL.
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 03:03:59 pm
Quote
Believe me I ran into so many problems regarding exactly that, that I decided to drop the idea of using a context in another thread than the "primary" thread.

The behaviour when using contexts in multiple threads is well defined, and as far as I can tell, it has always worked perfectly so far. Many users do that.

Quote
What you can do is load the image data in a separate thread.

That would add unnecessary complexity and verbosity to the user code.

Quote
Also think about how the interaction with the driver/GPU works: there's one bus, and it isn't parallel in any way. That means you gain no benefit at all from preparing your textures in a separate thread.

If you have 1 texture it's pretty useless, for sure. But if you want to load 1000 textures, you'll be happy to be able to run your game loop smoothly while the textures are being loaded in the background.

Quote
Out of curiosity: Who? I really can't think of a single use-case where multiple windows couldn't be emulated by OpenGL.

I guess you'll have to search the related posts, because I can't remember them :P
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 14, 2012, 03:19:46 pm
Quote
The behaviour when using contexts in multiple threads is well defined, and as far as I can tell, it has always worked perfectly so far. Many users do that.

Is that an assumption or a fact? If it's a fact, then I'm probably much worse at this than I thought, because I wasn't able to get it working with SFML on either Windows or Linux, with Nvidia and AMD graphics cards, in several configurations.

Some setups crashed, some did not display anything. And I wrote minimal examples that did nothing other than load a texture in a separate thread with context switching -- also protected by a mutex. It DOES lead to problems.

And again, it does not help you with "smooth" texture loading. The texture still goes through the same bus; there is no special, separate texture bus.

Quote
That would add unnecessary complexity and verbosity to the user code.

It adds complexity, yes. But it's not unnecessary, it's needed to make it work flawlessly. And besides that, it's not complicated (complicated != complex). In a separate thread you call prepareTexture() or whatever, and in the main thread you finalize them, as easy as pie.

Quote
If you have 1 texture it's pretty useless, for sure. But if you want to load 1000 textures, you'll be happy to be able to run your game loop smoothly while the textures are being loaded in the background.

As soon as the bus is used, you lose every benefit. I really do know that, but you can write an example to prove it either way: load 1000 textures in one context and 1000 textures in another context (or even another thread, doesn't matter). FPS will drop in both cases, because the bus is used equally. And make sure not to count the time needed for sf::Image::loadFromFile(); that's not related to textures in any way.

Quote
I guess you'll have to search the related posts, because I can't remember them

Probably because there are just too few of them, or they're unimportant. ;-) So if anyone is reading this and needs multiple windows, please scream.
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 03:35:39 pm
Quote
Is that an assumption or a fact?

It's a fact. Regular use cases work fine; you might run into trouble only if you do complicated things (but in this case it's probably not SFML which is causing the problems), or you use crappy graphics drivers.
I'd be very interested to see your most minimal/simple example that doesn't work as expected.

Quote
It adds complexity, yes. But it's not unnecessary, it's needed to make it work flawlessly.

It already works flawlessly ;)
And like I said, many users already do it, so it would be silly to change that.

Quote
And besides that, it's not complicated (complicated != complex). In a separate thread you call prepareTexture() or whatever, and in the main thread you finalize them, as easy as pie.

I think it's already complicated for what it is -- a little more than what you said: you have to store all the intermediate images, consuming unnecessary memory, and map them to some unique ID (which may not be a filename) so that you can assign them to their matching sf::Texture.
But anyway, let's talk about this later, only if we agree to go in this direction ;)

Quote
As soon as the bus is used, you lose every benefit

All that I'm saying is that it's much easier to scatter the loading of 1000 textures among multiple frames if it's done in a parallel thread, than if it's done in the same one. And avoid a big freeze of the application when you do it.

Quote
And make sure not to count the time needed for sf::Image::loadFromFile(); that's not related to textures in any way.

Ok, but it adds even more complexity to your solution. We're still comparing the global task of preloading textures smoothly ;)

Quote
Probably because there are just too few of them, or they're unimportant.

Seriously, am I the kind of guy to implement such a big feature if it is useless? :twisted:
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 03:40:15 pm
By the way... can you remind me how disabling shared contexts and multiple windows would help to solve the initial problem? Why are we discussing these details?
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 14, 2012, 04:20:17 pm
Quote
but in this case it's probably not SFML which is causing the problems

That's right, it's generally multi-threading and shared contexts with OpenGL.

Here are two very interesting links:

- http://www.opengl.org/wiki/OpenGL_and_multithreading
- http://developer.amd.com/gpu_assets/GDC2005_OpenGL_Performance.pdf

In the PDF, start at slide 33.

Quote
I'd be very interested to see your most minimal/simple example that doesn't work as expected.

Here you go: http://pastebin.com/bMeGSRpa

Keep in mind what I said before: it only fails for some OS/GPU combinations! I didn't have any problems myself; it was 2 out of 12 users who had problems when I loaded textures in parallel (both Win7 64-bit, AMD GPU, latest drivers at that time).

Quote
you have to store all the intermediate images, consuming unnecessary memory, and map them to some unique ID (which may not be a filename) so that you can assign them to their matching sf::Texture.

First of all, you do not consume unnecessary memory: you consume the same amount of memory as loadFromFile() does, just for a shorter time. When loading images in parallel and finalizing the textures in an update() step, yes, memory is held for a longer time – some milliseconds maybe. That's definitely not an issue.

Also, you do not need to map anything; the details are hidden in your class design. In FlexWorld, for example, as soon as a texture is requested, a pointer to the same object is held – whether it has the texture initialized already or not. When it gets ready, it's used. If not, you just see nothing. But as this happens only for a couple of milliseconds, it's not noticeable.

That of course depends on your OOP skill. If you use only SFML's classes, then yes, you have to keep mappings somewhere. But for anything bigger than Pacman or other small projects, you'll likely have some sort of external resource management that takes care of such things.

Quote
And avoid a big freeze of the application when you do it.

Again: You can't avoid freezes that happen due to texture preparation (unless you're using pixel buffers, but that's another topic). Freezes are mostly the result of time spent on hard-disk access and decompression. So if that's the bottleneck, that's what I have to avoid – not the upload of textures to the GPU, which can't be made faster.

Quote
Seriously, am I the kind of guy to implement such a big feature if it is useless?

Normally not, this is why I'm surprised you want to keep it, even if it means having issues. ;-) Seriously, I just searched the forums for "multiple windows" and found some threads. Not a single thread revealed what they're used for except "Can I do it?".

Besides that: multiple windows are of course still possible; the critical part is the shared context. If each window has its own isolated context, everything's fine, no issues. So what I'm really asking for is dropping the shared context; sorry for being unclear.

Quote
By the way... can you remind me how disabling shared contexts and multiple windows would help to solve the initial problem? Why are we discussing these details?

I rechecked a recent stacktrace of an SFGUI crash. Take a look at it:

Code: [Select]
ntdll.dll!77d9a2ce()    
    [Frames below may be incorrect and/or missing, no symbols loaded for ntdll.dll]    
    Label.exe!sf::priv::MutexImpl::Lock()  Line 52 + 0xc bytes    C++
    Label.exe!sf::Mutex::Lock()  Line 62    C++
    Label.exe!sf::Lock::Lock(sf::Mutex & mutex)  Line 39    C++
    Label.exe!sf::priv::WglContext::CreateContext(sf::priv::WglContext * shared, unsigned int bitsPerPixel, const sf::ContextSettings & settings)  Line 315 + 0x10 bytes    C++
    Label.exe!sf::priv::WglContext::WglContext(sf::priv::WglContext * shared)  Line 59    C++
    Label.exe!sf::priv::GlContext::New()  Line 142 + 0x28 bytes    C++
    Label.exe!`anonymous namespace'::GetInternalContext()  Line 87 + 0x5 bytes    C++
    Label.exe!sf::priv::GlContext::EnsureContext()  Line 135 + 0x7 bytes    C++
    Label.exe!sf::GlResource::EnsureGlContext()  Line 83    C++
    Label.exe!sf::Texture::~Texture()  Line 98    C++
    Label.exe!sfg::Renderer::~Renderer()  Line 45 + 0x36 bytes    C++
    Label.exe!sfg::Renderer::`scalar deleting destructor'()  + 0x16 bytes    C++
    Label.exe!sfg::ReferenceCount<sfg::Renderer>::Dispose()  Line 61 + 0x1f bytes    C++
    Label.exe!sfg::ReferenceCountBase::RemoveStrongReference()  Line 34 + 0xf bytes    C++
    Label.exe!sfg::StrongReferenceCount::~StrongReferenceCount()  Line 84    C++
    Label.exe!sfg::SharedPtr<sfg::Renderer>::~SharedPtr<sfg::Renderer>()  + 0x19 bytes    C++
    Label.exe!`dynamic atexit destructor for 'sfg::Renderer::m_instance''()  + 0xd bytes    C++
    Label.exe!doexit(int code, int quick, int retcaller)  Line 567    C
    Label.exe!exit(int code)  Line 393 + 0xd bytes    C
    Label.exe!__tmainCRTStartup()  Line 284    C
    Label.exe!mainCRTStartup()  Line 189    C
    kernel32.dll!76cb339a()    
    ntdll.dll!77d79ef2()    
    ntdll.dll!77d79ec5()


You could of course also try to work around the resource destruction by checking whether there's still a valid context and, if not, skipping the cleanup in Texture::~Texture. But imho the context management is somewhat messed up. ;)
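
Something along these lines (HasValidContext() is purely hypothetical; SFML exposes no such query):

Code: [Select]
// Sketch only, not SFML's actual destructor:
Texture::~Texture()
{
    if( priv::GlContext::HasValidContext() ) // hypothetical check
    {
        EnsureGlContext();
        glDeleteTextures( 1, &myTexture );
    }
    // else: deliberately leak the GL handle; the driver reclaims
    // everything at process exit anyway
}
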
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 04:53:37 pm
Quote
That's right, it's generally multi-threading and shared contexts with OpenGL

... with crappy drivers. The OpenGL specification itself is very clear about multi-threading.
I insist that the problems are caused by the drivers, not by the programmer, his design choices, or the APIs he uses.

Quote
Here you go: http://pastebin.com/bMeGSRpa

Really? The thread instance is local to the if{} block, therefore the main thread blocks until it's finished ;)
And you don't need to explicitly create sf::Context instances anymore in SFML 2; you can remove it from your load_texture function.

And what were the errors with this code? Crash, or wrong results?

Quote
First of all, you do not consume unnecessary memory: you consume the same amount of memory as loadFromFile() does, just for a shorter time. When loading images in parallel and finalizing the textures in an update() step, yes, memory is held for a longer time – some milliseconds maybe. That's definitely not an issue.

It could be an issue. In the threaded version, images are loaded in memory one after the other. With your solution, it's all at the same time.
But I agree that in the real world it's unlikely to be a problem, since there will always be more system RAM than video RAM.

Quote
In FlexWorld, for example, as soon as a texture is requested, a pointer to the same object is held – whether it has the texture initialized already or not. When it gets ready, it's used. If not, you just see nothing. But as this happens only for a couple of milliseconds, it's not noticeable.

So you had to write a dedicated class just for that?

Quote
You can't avoid freezes that happen due to texture preparation

I prefer to have 1000 1ms freezes, rather than a single 1sec freeze.

Quote
Normally not, this is why I'm surprised you want to keep it, even if it means having issues.

The feature of allowing multiple windows doesn't itself create new issues. It's context sharing that does, and that's an orthogonal concept (I can disable it while keeping multiple windows).
(ok, I just saw that you say it below)

Quote
Seriously, I just searched the forums for "multiple windows" and found some threads. Not a single thread revealed what they're used for except "Can I do it?"

Arf, I guess I'll have to spend some time to find something relevant then.

Quote
I rechecked a recent stacktrace of an SFGUI crash. Take a look at it

It would crash without the sharing thing. And even if I totally remove my (totally) messy context management, creating a context after main() still crashes on ATI drivers, so the only thing that would really solve the problem is to make sure that everything's destroyed before main() returns. So, basically, adding a sf::cleanup() function or similar. But then I would have other issues (in the .Net binding for example).
Title: sf::Texture crashes when used in a static object
Post by: binary1248 on March 14, 2012, 05:15:45 pm
Quote from: "Tank"
Quote
And avoid a big freeze of the application when you do it.

Again: You can't avoid freezes that happen due to texture preparation (unless you're using pixel buffers, but that's another topic). Freezes are mostly the result of time spent on hard-disk access and decompression. So if that's the bottleneck, that's what I have to avoid – not the upload of textures to the GPU, which can't be made faster.


Quote
I prefer to have 1000 1ms freezes, rather than a single 1sec freeze.


Not to mention, nobody said multiple threads would all get the same amount of execution time from the OS scheduler. Sure, the ideal would be a perfectly fair scheduling strategy, but as we all know, operating systems have become very complex nowadays, and scheduling is a science in itself, so you really can't count on multiple threads in the same process having the same priority.

Take, for example, playing your favorite game at maximum detail to fully load the system. You then start a compression job that takes a while and also uses all CPU cores. You would expect the FPS in the game to drop and stay stable at a certain reduced value while the compression takes place. What you instead observe is that it fluctuates noticeably, maybe even producing a bit of stutter in the game for a while. Sure, this example is about scheduling across multiple processes, but the same applies to threads within a single process too.

Quote
It would crash without the sharing thing. And even if I totally remove my (totally) messy context management, creating a context after main() still crashes on ATI drivers, so the only thing that would really solve the problem is to make sure that everything's destroyed before main() returns. So, basically, adding a sf::cleanup() function or similar. But then I would have other issues (in the .Net binding for example).


The problem now is that the contexts are being played with even after main() returns (this even includes creating a new one). The context(s) can be created either before or, preferably, during execution of main(), and should only be created when absolutely necessary (in almost all cases, when the first GL resource is created). Constructing a new context after main() returns is just asking for trouble, and it is the root of all the issues discussed here. That, in turn, according to our understanding, is due to the management of these shared contexts going wrong somewhere. Microsoft and Khronos might give the illusion that using shared contexts is well defined, as well as the whole deinitialization-order business. But in reality, we all know that you can't rely purely on standards to get something working.

Sure, you can wait for AMD to fix their driver, or for Microsoft to come up with a new way to do old things (which is never going to happen; they hate OpenGL, don't they?), but what if AMD doesn't want to fix this issue? Are you going to let programs crash forever, or proactively look for a solution (a workaround) for this known "behavior"?
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 06:16:21 pm
Quote
Not to mention, nobody said multiple threads would all get the same amount of execution time from the OS scheduler. Sure, the ideal would be a perfectly fair scheduling strategy, but as we all know, operating systems have become very complex nowadays, and scheduling is a science in itself, so you really can't count on multiple threads in the same process having the same priority.

Yeah, but basically, if you set your loading thread to a low priority, you can expect it to run without disturbing the main thread.
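
On Windows, that could look like this (sf::Thread exposes no priority API, so you'd drop down to Win32 inside the thread function):

Code: [Select]
#include <windows.h>

void loadResources()
{
    // Hint to the scheduler that this thread may yield to the main/render thread.
    SetThreadPriority( GetCurrentThread(), THREAD_PRIORITY_BELOW_NORMAL );

    // ... load images / create textures here ...
}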

What's the point of all these technical discussions? Loading textures in threads works as expected, many users do it and are happy with it. All other solutions are more or less worse, in my opinion. And there's nothing to fix there, so where do we go? ;)

Quote
proactively look for a solution

I constantly do, and sometimes I find a solution that improves the global behaviour. But I haven't found the perfect solution yet, so I'll continue to work on it, yes.
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 14, 2012, 06:44:49 pm
Quote
Really? The thread instance is local to the if{} block, therefore the main thread blocks until it's finished

:) I wrote it up from memory without compiling it. You get the idea.

Quote
And what were the errors with this code? Crash, or wrong results?

Both. One user reported not seeing anything with those loaded textures (and that one was really hard to tackle; a glDisable( GL_TEXTURE_2D ) revealed the *bug*), the other one experienced crashes in the driver.

Quote
It could be an issue. In the threaded version, images are loaded in memory one after the other. With your solution, it's all at the same time.

The amount of memory is still the same. I don't think you'll run out of memory just because you're fetching some images in one cycle.

Quote
So you had to write a dedicated class just for that?

Yes, or rather a proxying class for managing resources in general (to provide garbage collection and safe loading from separate threads: textures, but also 3D models that get uploaded into VBOs).

Quote
creating a context after main() still crashes on ATI drivers, so the only thing that would really solve the problem is to make sure that everything's destroyed before main() returns. So, basically, adding a sf::cleanup() function or similar. But then I would have other issues (in the .Net binding for example).

cleanup() isn't necessary imho, look at the proxy pattern I mentioned some posts before (it's just one possible solution of course, but it makes resource management automatic and you don't need ugly init/cleanup functions).

What kind of issues do you get in the .NET binding?

Quote
What's the point of all these technical discussions? Loading textures in threads works as expected, many users do it and are happy with it.

I think it points out some flaws. I already told you 2-3 times that loading textures in threads does not always work. Many users may do it, but how many of them distribute their software on a larger scale? And what percentage of their users will report the error instead of just trying something else?

I don't report the crash or the multi-threading issues for fun; they're real show-stoppers on my side. I can avoid some of them, primarily those that are under my responsibility.

You know the problems (and some solutions), so I guess it's up to you to decide if you're going to fix the problems or live with crashes on some machines. I can still add a "cleanup()" workaround to make everything work. ;-)
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 07:14:52 pm
Quote
I wrote it up from memory without compiling it. You get the idea.

Yeah, but I'd prefer to work on the exact code that produces the errors.

Quote
The amount of memory is still the same.

?
Anyway let's not waste too much time on this. This is a non-issue in real life.

Quote
cleanup() isn't necessary imho, look at the proxy pattern I mentioned some posts before

Yep, I'll definitely give it a try later. But I'm already pretty sure that it's not 100% perfect.

Quote
What kind of issues do you get in the .NET binding?

Some objects are destroyed by the garbage collector after main(), and in a separate thread (the main thread has already terminated!). And I have absolutely no control over this behaviour.

Quote
I already told you 2-3 times that loading textures in threads does not always work

Since it globally works for me, I prefer to consider that as bugs that need to be fixed, rather than a global technical issue that must be worked around. So I'll first test your small code, probably asking other people to test it as well, and see what we get from there :)

Quote
it's up to you to decide if you're going to fix the problems or live with crashes on some machines

Of course I'll try to fix things. It's just that it's complicated, and takes time.

Quote
I can still add a "cleanup()" workaround to make everything work.

Yeah, you should try to find a workaround on your side, because:
- nobody knows when I'll have the time to work on this issue and find a solution
- the solutions that you find for your own code might help me ;)
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 14, 2012, 07:25:45 pm
Quote
the solutions that you find for your own code might help me

I see. ;)

Quote
Yeah, but I'd prefer to work on the exact code that produces the errors.

Just put the thread variable in a more persistent scope, that's it.

Thanks for taking the time.
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 08:50:41 pm
I've found a potential issue with this code in SFML, the fix is easy but I need to check something first. After that I'll create a new topic to make people test :)

By the way, I totally forgot about this project: sfeMovie (https://github.com/Yalir/sfeMovie)

It shows two interesting things:
- loading textures in a separate thread is totally mandatory in this use case
- it seems to work fine too, however I don't know how many people use it
Title: sf::Texture crashes when used in a static object
Post by: binary1248 on March 14, 2012, 09:14:09 pm
Quote from: "Laurent"

It shows two interesting things:
- loading textures in a separate thread is totally mandatory in this use case
- it seems to work fine too, however I don't know how many people use it


When loading a texture in a separate thread, I agree with you that it would be wise to do all the CPU-driven tasks in that thread. But ultimately, when you call glTexSubImage2D or any similar GL function, the driver will queue the command in the same queue as if you were making all GL calls from a single thread.

The question is: who does the concurrency management? The driver, which as you know is buggy as hell, and WGL, which is so awesomely documented? Or you yourself, who can make sure that everything gets done as you expect and that no overhead is incurred where it isn't necessary?

I can't imagine this being too hard a task. Simply wrap your context in a class that interfaces to OpenGL, and lock the context with a mutex whenever a thread wants to perform some GL magic. As it is now, I can imagine the same thing happening anyway, just not in SFML but somewhere in the operating system.

In the case of sfeMovie I personally would have made use of some advanced GL buffer objects instead of constantly updating a texture all the time, but that's just my preference.
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 14, 2012, 10:56:24 pm
Since OpenGL contexts are thread-local resources, no concurrency can happen until the calls are processed by the driver, right before being sent to the graphics card. So I don't know what you mean by "[...] lock the context with a mutex when a thread wants to perform some GL magic"; can you elaborate on this?

Quote
In the case of sfeMovie I personally would have made use of some advanced GL buffer objects

For sure it would be better with PBOs, but I don't know if the difference would be that significant.

Quote
instead of constantly updating a texture all the time, but that's just my preference.

Whatever solution you choose, you have to constantly upload the pixel data to a texture anyway. Or am I missing an awesome OpenGL technique? :D
Title: sf::Texture crashes when used in a static object
Post by: binary1248 on March 15, 2012, 12:44:31 am
Quote from: "Laurent"
Since OpenGL contexts are thread-local resources, no concurrency can happen until the calls are processed by the driver, right before being sent to the graphics card. So I don't know what you mean by "[...] lock the context with a mutex when a thread wants to perform some GL magic"; can you elaborate on this?


The idea is to have only a single context in your application and do the synchronization between threads yourself. You would have to make sure that only one thread has the context set as active at a time, and therefore lock a mutex every time you want to make it current somewhere. After the GL call is done, you make it non-current again and release the lock. This is the multiple-threads-single-context model.

As shown in the PDF Tank linked to, single-thread-single-context is also possible, and even recommended by many professionals. In that case you queue your GL commands for a single "GL worker thread" to process. Because a queue is FIFO, it has implicit fairness built in, and there is no worry about one thread starving another of execution time. This is probably also how commands are queued and sent to the GPU. Since that solution is a bit more involved, the first one would be a nice middle ground between what SFML uses now and the optimal case.
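
A minimal sketch of the first model against the SFML 2 API (GlContextGuard is a made-up helper, not an SFML class):

Code: [Select]
#include <SFML/System.hpp>
#include <SFML/Window.hpp>

class GlContextGuard {
public:
    explicit GlContextGuard( sf::Context& context ) :
        m_context( context )
    {
        s_mutex.lock();
        m_context.setActive( true );  // only one thread owns the context at a time
    }

    ~GlContextGuard()
    {
        m_context.setActive( false ); // release the context before unlocking
        s_mutex.unlock();
    }

private:
    sf::Context&     m_context;
    static sf::Mutex s_mutex;
};

sf::Mutex GlContextGuard::s_mutex;

// Usage, from any thread:
// {
//     GlContextGuard guard( sharedContext );
//     // ... GL calls ...
// } // context released; the next thread may take it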

Quote from: "Laurent"
Whatever solution you choose, you have to constantly upload the pixel data to a texture anyway. Or am I missing an awesome OpenGL technique? :D


Once again Mr. Song has written an excellent article about this matter: http://www.songho.ca/opengl/gl_pbo.html

Note also I was referring to the SFML sf::Texture and not the OpenGL texture object.
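
For reference, the core of the PBO streaming technique from that article looks roughly like this (raw GL 2.1-style code, GLEW assumed for the buffer-object entry points):

Code: [Select]
#include <GL/glew.h>
#include <cstring>

// Stream one frame of RGBA pixels into 'texture' through a pixel buffer object.
void streamFrame( GLuint pbo, GLuint texture, const unsigned char* pixels,
                  int width, int height )
{
    glBindBuffer( GL_PIXEL_UNPACK_BUFFER, pbo );

    // Orphan the old buffer contents so the driver doesn't have to stall.
    glBufferData( GL_PIXEL_UNPACK_BUFFER, width * height * 4, 0, GL_STREAM_DRAW );

    void* ptr = glMapBuffer( GL_PIXEL_UNPACK_BUFFER, GL_WRITE_ONLY );
    if( ptr )
    {
        std::memcpy( ptr, pixels, width * height * 4 ); // CPU-side copy
        glUnmapBuffer( GL_PIXEL_UNPACK_BUFFER );
    }

    glBindTexture( GL_TEXTURE_2D, texture );
    glTexSubImage2D( GL_TEXTURE_2D, 0, 0, 0, width, height,
                     GL_RGBA, GL_UNSIGNED_BYTE, 0 ); // source = bound PBO, offset 0

    glBindBuffer( GL_PIXEL_UNPACK_BUFFER, 0 ); // back to ordinary client-memory uploads
}
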
Title: sf::Texture crashes when used in a static object
Post by: Nexus on March 15, 2012, 05:43:55 pm
Tank, I have already used multiple windows. And I also plan to use resource loading in a separate thread, exactly because of the smooth continuation of the main thread Laurent mentioned.

And I don't consider it a good idea to remove useful SFML features to make a questionable design (singletons) in your own project work. Are you sure you can find no other way? You already mentioned init() and cleanup() should work; what about a RAII object encapsulating them? I think a central SFGUI object instantiated once in main() is acceptable, especially if it covers more functionality, like the core of your GUI hierarchy or whatever. And it's definitely better than an ever-present bug at program exit.
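
Such a wrapper could be as small as this (sfg::init()/sfg::cleanup() are hypothetical; SFGUI would have to provide them):

Code: [Select]
struct SfguiScope {
    SfguiScope()  { sfg::init(); }    // hypothetical
    ~SfguiScope() { sfg::cleanup(); } // runs before main() returns,
                                      // while the GL contexts still exist
};

int main() {
    SfguiScope sfgui; // guarantees orderly teardown

    // ... create the window, build the GUI, run the application ...
}
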
Title: sf::Texture crashes when used in a static object
Post by: Tank on March 15, 2012, 07:11:29 pm
Quote
I have already used multiple windows

Mind telling us for what?

Quote
And I also plan to use resource loading in a separate thread, exactly because of the smooth continuation of the main thread Laurent mentioned.

I really feel like I'm talking to walls here. ;-) You can still load your resources in a separate thread; I already gave an example of how to do it. And finally, as stated a million times already: you do not benefit from creating the textures in a separate thread. It's only the loading and decompressing of image data that causes freezes/FPS drops.

Just answer the following question: if creating textures in a separate thread means that your game crashes on 1% of your users' computers, would you consider choosing another option, or would you "just keep it that way"?

Please read the articles I linked in a previous post, or even do a Google search on OpenGL multi-threading and shared contexts. It's anything but recommended, for good reasons.

Quote
And I don't consider it a good idea to remove useful SFML features to make a questionable design (singletons) in your own project work.

I already stated that removing multiple windows is not what I want to achieve; what I want is to remove shared contexts. You can use as many windows as you like, and you won't run into any problems as long as they all have separate contexts. But when they're shared, you have to be careful in many regards, and that's easier with a single context in your application. In my opinion there's no advantage to using shared contexts.

Did you take a look at SFGUI's renderer to see why it's a singleton? It's a singleton because it's a global interface to OpenGL. You could write free functions for most of the features the renderer provides; however, it also manages resources like texture caches and OGL handles, which is why it's primarily an object.

I really don't think that having a singleton like that is an issue at all. The problem is that resources are not managed by SFML's contexts, so we can run into unpredictable behaviour quite easily. That can and should be avoided, in my eyes.

And since you're talking about RAII: the RAII of SFML's resources doesn't pay attention to the context's lifetime. So if you argue that SFGUI should carry an init()/cleanup() pair, then SFML must have one too, because it's possible to keep resources alive after the parent object gets destroyed.

Quote
And it's definitely better than an ever-present bug at program exit.

Yeah, but now SFGUI users have to do an ugly

Code: [Select]
sfg::SFGUI foobar;

to work around the crashes – but not all of them; the GetDefaultFont() crash is still there, and you can't do anything about it (except linking statically, which seems to "fix" it, but I call that a fix that randomly works).
Title: sf::Texture crashes when used in a static object
Post by: Nexus on March 16, 2012, 06:47:06 pm
I have used a window as a splash screen while a game was loading. When the actual game window opens, the splash screen remains for a short time to show that loading succeeded. I know it's a rather specific use case, but you wanted people to scream if they'd already used it ;)

You say we can load resources in other threads, but that we shouldn't create textures there. But then you have to split up and complicate the logic (the other thread loads the pixels; the main thread checks whether they're loaded yet and creates the texture when they're ready). Are you sure the time needed to create the textures isn't significant?

And I agree that the RAII object isn't very beautiful; I just mentioned it because you said init()/cleanup() would technically work -- that is, it's still more beautiful than the C way ;)
Title: sf::Texture crashes when used in a static object
Post by: eXpl0it3r on March 17, 2012, 01:52:27 am
Quote from: "Laurent"
I've found a potential issue with this code in SFML, the fix is easy but I need to check something first. After that I'll create a new topic to make people test :)


Is that already available somewhere?
I'm the one with the crappy GPU, so I've experienced the problem of creating textures in a separate thread first-hand. I also helped Tank figure it out over 1-2h 'remote debug sessions'. :lol:

Also consider this about 'crappy' drivers: HP and other vendors do not let you use the native graphics driver, but instead provide their own tweaked one. So even if AMD put out a shiny new driver that solved every thinkable problem, many, many users wouldn't be able to use it (until the vendors with the upper hand push out their own version).
I had to learn that the hard way (= had to reinstall Windows completely).

Btw, I find this thread interesting and amusing at once, keep it up! :P
No, seriously, I like that such problems are being addressed and discussed openly. I know it's already time-consuming enough to maintain and advance a complete cross-platform multimedia library next to a (full-time) job (respect for that!), but since I guess we have some really smart guys on the SFML board, it would also be nice to see some other in-depth topics on SFML and the stuff that is coming / being worked on at the moment, rather than keeping everything to oneself and revealing it at the end. :wink:


Too tired to spellcheck my English... :wink:
Title: sf::Texture crashes when used in a static object
Post by: Laurent on March 17, 2012, 11:25:57 am
Quote
Is that already available somewhere?

It's more complicated than I thought :)

You can test this code: http://pastebin.com/axEarezn

(both in debug and release -- I've seen differences between them)
(you must link to the OpenGL library)