I'm using SFML in a scenario where I need to spawn multiple windows (never simultaneously). SFML is used inside a video player control within a wider application, and whenever a user plays a video, a new SFML window is created.
However, this seems to create a memory leak, and I suspect it's related to the OpenGL contexts being created. I've attached both C# and C++ versions to demonstrate the issue.
private RenderWindow SFMLWindow = null;

private void RenderLoop()
{
    //SFMLWindow = new RenderWindow(new VideoMode(400, 300), "SFML doesn't work(s)");
    while (true)
    {
        SFMLWindow = new RenderWindow(new VideoMode(800, 600), "hello");
        SFML.Graphics.Texture texture = new Texture(1920, 1080);
        SFML.Graphics.Sprite sprite = new Sprite(texture);
        SFMLWindow.Draw(sprite);
        SFMLWindow.Display();
        SFMLWindow.Clear();
        System.Threading.Thread.Sleep(1000);
        SFMLWindow.Close();
        SFMLWindow.Dispose();
        System.Threading.Thread.Sleep(1000);
        texture.Dispose();
        sprite.Dispose();
    }
}
int _tmain(int argc, _TCHAR* argv[])
{
    sf::RenderWindow window;
    while (true)
    {
        window.create(sf::VideoMode(800, 600), "leak");
        sf::Texture texture;
        texture.create(1920, 1080);
        sf::Sprite sprite(texture);
        window.draw(sprite);
        window.display();
        Sleep(1000);
        window.clear();
        window.close();
        Sleep(1000);
    }
}
Strangely enough, memory profiling tools haven't detected a leak, but memory is definitely increasing and never dropping.
I ran the C++ test, with both sleep statements reduced to 500ms, for 2 minutes.
Initially, memory (working set) was at 136MB. After 2 minutes, it had increased to 593MB, a growth rate of about 3.8MB per second. The texture size in that run was 1200x600, which works out to roughly 2.74MB of texture data allocated per one-second iteration.
Running the exact same test, except with the window.draw(sprite) line commented out, resulted in the working set staying constant at about 42MB.
Hopefully the following code sample will hint at the issue:
int _tmain(int argc, _TCHAR* argv[])
{
    sf::RenderWindow window;
    while (true)
    {
        window.create(sf::VideoMode(600, 300), "SFML works! ");
        sf::Texture texture;
        texture.create(1200, 600);
        sf::Sprite sprite(texture);
        window.draw(sprite);
        window.display();
        Sleep(500);
        sf::Texture smallTexture;
        smallTexture.create(1, 1);
        sf::Sprite smallSprite(smallTexture);
        window.clear();
        window.draw(smallSprite);
        window.display();
        window.close();
        Sleep(500);
    }
}
This code produces no noticeable leak, since the last texture drawn is only 1x1.
What seems to be happening is that the memory backing the last texture drawn to the window is never released, for whatever reason. Drawing a tiny texture last would work for my purposes, since nobody will notice such a small leak, but it's an ugly hack regardless.
To put the final nail in the coffin, even this minimal code sample will cause memory usage to constantly rise:
#include <windows.h>
#include <GL/gl.h>

int main() {
    auto window = CreateWindowA( "STATIC", "", WS_POPUP | WS_DISABLED, 0, 0, 1, 1, NULL, NULL, GetModuleHandle( NULL ), NULL );
    ShowWindow( window, SW_HIDE );
    auto deviceContext = GetDC( window );

    PIXELFORMATDESCRIPTOR descriptor;
    ZeroMemory( &descriptor, sizeof( descriptor ) );
    descriptor.nSize = sizeof( descriptor );
    descriptor.nVersion = 1;
    descriptor.iLayerType = PFD_MAIN_PLANE;
    descriptor.dwFlags = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    descriptor.iPixelType = PFD_TYPE_RGBA;
    descriptor.cColorBits = 32;
    descriptor.cDepthBits = 24;
    descriptor.cStencilBits = 8;
    descriptor.cAlphaBits = 0;

    auto format = ChoosePixelFormat( deviceContext, &descriptor );
    SetPixelFormat( deviceContext, format, &descriptor );

    // Loop never exits; the cleanup below is unreachable but kept for completeness.
    while( true ) {
        auto context = wglCreateContext( deviceContext );
        wglDeleteContext( context );
    }

    ReleaseDC( window, deviceContext );
    DestroyWindow( window );
}
Since the loop only makes wgl calls, the leak has to be in wgl or in the driver itself.
I tried this code sample on my home machine and, like the previous one, it didn't leak there. It's concerning that this turns out to be a driver issue - it almost certainly means I'll have to take a different approach. Thanks for your help in shedding light on the culprit.