Ok, after some tests there seems to be something strange with some of the destructors and the OpenGL contexts. Let's start with some sample code.
This first example causes a memory leak. I also included the destructor (from ObjectBase) to show which destructor is being called.
static void Main(string[] args)
{
    for (int c = 0; c < 10000; c++)
    {
        SFML.Graphics.Texture text = new SFML.Graphics.Texture(@"...file path...");
    }
}

~ObjectBase()
{
    Dispose(false);
}
Now here is an example that does not have a memory leak, with the destructor for reference.
static void Main(string[] args)
{
    for (int c = 0; c < 10000; c++)
    {
        SFML.Graphics.Texture text = new SFML.Graphics.Texture(@"C:\Users\Zachariah Brown\Pictures\96134.jpg");
        text.Dispose();
    }
}

public void Dispose()
{
    Dispose(true);
    GC.SuppressFinalize(this);
}
This code has the same memory leak regardless of whether you use a Texture, Font, or RenderTexture. The difference between the two destructors is the Dispose(bool) call. Now when we break into that Dispose call we see the following.
private void Dispose(bool disposing)
{
if (myCPointer != IntPtr.Zero)
{
Destroy(disposing);
myCPointer = IntPtr.Zero;
}
}
protected abstract void Destroy(bool disposing);
The memory leak happens in the implementation of the Destroy(bool) call. Let's look at Texture's Destroy implementation.
protected override void Destroy(bool disposing)
{
if (!myExternal)
{
if (!disposing)
Context.Global.SetActive(true);
sfTexture_destroy(CPointer);
if (!disposing)
Context.Global.SetActive(false);
}
}
Look at that Context.Global.SetActive(bool) call. Now if we look back at the example that does not cause the memory leak, you will notice that Destroy is called with this.
Destroy(true);
Since Destroy is called with true, we make zero calls to the OpenGL context. The memory leak comes from calling SetActive(bool) on the OpenGL context, more specifically when Context.Global.SetActive(false); is called. As soon as you comment out the SetActive(bool) calls, the first example that originally leaked memory no longer leaks.
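For anyone following along, the user-side workaround is to dispose deterministically so the finalizer (and its SetActive calls) never runs. A minimal sketch, assuming the SFML.Net API shown above (the file path is a placeholder, as in the first example):

    // Texture implements IDisposable, so a using-block calls Dispose(),
    // which calls Dispose(true) and GC.SuppressFinalize(this).
    // The finalizer therefore never runs, so Destroy(false) -- and with it
    // the Context.Global.SetActive(...) calls -- is skipped entirely.
    static void Main(string[] args)
    {
        for (int c = 0; c < 10000; c++)
        {
            using (var text = new SFML.Graphics.Texture(@"...file path..."))
            {
                // use the texture here
            } // Dispose(true) runs here -- the deterministic, non-leaking path
        }
    }

This is equivalent to the explicit text.Dispose() call in the second example, just exception-safe.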
Now this brings up a few questions (aimed at Laurent).
- Why is SetActive(false); leaking memory? I took from this thread (http://en.sfml-dev.org/forums/index.php?topic=10309.0) that the OpenGL contexts are only supposed to leak memory when multiple threads come into play.
- Why do the destructor and Dispose() call Dispose(bool) differently? Shouldn't the unmanaged objects be disposed of in the same way regardless of whether they go out of scope or Dispose is called explicitly?
On a side note, Laurent, I also took a look at the way the .NET bindings handle contexts, and they only ever use a single global context.
public static Context Global
{
    get
    {
        if (ourGlobalContext == null)
            ourGlobalContext = new Context();
        return ourGlobalContext;
    }
}

private static Context ourGlobalContext = null;
So if the bindings only ever use one context, why is there even a need to call SetActive(bool) on the global context?
This is the standard way of doing it, look at the MSDN example.
I don't remember why it is done this way, but when I implemented it (many years ago!) it was very clear
Ok, don't change the boolean value in the call, but I think you misunderstand the use of that boolean. Take a look at the following quote, taken from here (http://stackoverflow.com/questions/538060/proper-use-of-the-idisposable-interface).
- disposing == true: the method has been called directly or indirectly by a user's code. Managed and unmanaged resources can be disposed.
- disposing == false: the method has been called by the runtime from inside the finalizer, and you should not reference other objects. Only unmanaged resources can be disposed.
The value simply determines whether you should also dispose of managed objects. Unmanaged objects (raw pointers [IntPtr]) are always to be disposed of in the same way. What the SFML .NET bindings do is read that disposing value and, if it is false, call SetActive(true) and SetActive(false). The disposing value will be false if the managed object's finalizer is called.
~ObjectBase()
{
Dispose(false);
}
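For reference, the full pattern the quote describes looks roughly like this. This is a generic sketch of the standard .NET dispose pattern with hypothetical names, not SFML.Net's actual code; the point is that the unmanaged cleanup is identical on both paths, and only managed cleanup is gated on disposing:

    public class ResourceHolder : IDisposable
    {
        private IntPtr myCPointer; // hypothetical unmanaged handle

        public void Dispose()
        {
            Dispose(true);              // user-initiated: clean up everything
            GC.SuppressFinalize(this);  // the finalizer is no longer needed
        }

        ~ResourceHolder()
        {
            Dispose(false);             // GC-initiated: unmanaged only
        }

        protected virtual void Dispose(bool disposing)
        {
            if (disposing)
            {
                // safe to touch other managed objects here
            }
            // unmanaged cleanup happens on BOTH paths, the same way
            if (myCPointer != IntPtr.Zero)
            {
                // FreeNativeHandle(myCPointer); // hypothetical native call
                myCPointer = IntPtr.Zero;
            }
        }
    }
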
The question is, Laurent: why are you making calls to the OpenGL context when the garbage collector (the same as when the object goes out of scope) cleans up, instead of the user (through an explicit Dispose call)? The OpenGL calls come from checking the value of that disposing boolean and, if it is false, calling SetActive(true) and SetActive(false).
Sorry I didn't read the full discussion; are you sure of that? How did you check? And where is the leak exactly? OpenGL drivers are known to cause leaks (either true or false positives, it depends).
In the example where the leak happens, I let the Texture go out of scope. This causes the finalizer to run, which calls Dispose(false). Then, in the Destroy function, if Dispose was called with a false value, you call SetActive(true) and SetActive(false). See the Destroy function below. Now if I comment out the calls to SetActive, there is no longer a memory leak. I see the memory leak in Task Manager (I know...): the memory usage continues to go through the roof and eventually SFML crashes because it can't load the texture.
But let's say it's my drivers that are leaking when SetActive is called; my point above still stands: why make the destructor behave differently for unmanaged memory depending on where the memory cleanup is coming from?
protected override void Destroy(bool disposing)
{
if (!myExternal)
{
if (!disposing)
Context.Global.SetActive(true);
sfTexture_destroy(CPointer);
if (!disposing)
Context.Global.SetActive(false);
}
}
Either way, the destructor should make the same calls to the OpenGL context.
Trust me, if it's done this way, it's not just to create a memory leak :P
Not to bloat this thread any more, but I would love to learn more about OpenGL contexts, especially what effect SetActive(bool) has on freeing resources.