There is. Play with the private functions in GlContext.cpp and make public what you need.
I looked at it, and it's an abstract base class with only one private function and two protected functions:
protected:

    GlContext();

    static int evaluateFormat(unsigned int bitsPerPixel, const ContextSettings& settings, int colorBits, int depthBits, int stencilBits, int antialiasing);

    ContextSettings m_settings; ///< Creation settings of the context

private:

    void initialize();
On Linux the implementation is GlxContext, so I opened GlxContext.hpp and GlxContext.cpp. I'm focusing on the destructor, since I'm using this testing code:
#ifdef DEBUG
#include <iostream>
#endif
#include <SFML/Graphics.hpp>
#include <chrono> // for std::chrono::seconds (was missing)
#include <thread>

int main()
{
    // The drawer thread, active during the entire program's lifetime, constantly drawing //////
    std::thread([]()
    {
        sf::RenderWindow wnd(sf::VideoMode(300, 200, 32), "Title");
        wnd.setFramerateLimit(1);
        while (true)
        {
            wnd.clear();
            wnd.display();
        }
    }).detach();
    ////////////////////////////////////////////////////////////////////////////////////////////

    // Simulation of switching states //////////////////////////////////////////////////////////
    while (true)
    {
        std::thread([]()
        {
            sf::RenderTexture rt;
            std::this_thread::sleep_for(std::chrono::seconds(1));
        } // rt should be destroyed here, but its effect on the memory remains.
        ).join();
    }
    ////////////////////////////////////////////////////////////////////////////////////////////

    return 0;
}
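To make the growth visible, I watch the process's resident set size between iterations with a small helper (Linux-only sketch reading VmRSS from /proc/self/status; the name printRss and the crude parsing are just mine for this test):

    #include <fstream>
    #include <iostream>
    #include <string>

    // Print the process's resident set size; call it once per iteration of the
    // state-switching loop and watch whether the number keeps climbing.
    void printRss()
    {
        std::ifstream status("/proc/self/status");
        std::string line;
        while (std::getline(status, line))
        {
            if (line.compare(0, 6, "VmRSS:") == 0)
            {
                std::cout << line << std::endl;
                return;
            }
        }
    }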
And whenever a window is active, the RenderTexture's destructor does NOT properly clean up! Comment out the window thread and it won't leak. With the window, it does leak! If we un-thread the state-switcher, it also does not leak. So I decided to take a closer look at the destructor:
GlxContext::~GlxContext()
{
    // Destroy the context
    if (m_context)
    {
        if (glXGetCurrentContext() == m_context)
            glXMakeCurrent(m_display, None, NULL);
        glXDestroyContext(m_display, m_context);
    }

    // Destroy the window if we own it
    if (m_window && m_ownsWindow)
    {
        XDestroyWindow(m_display, m_window);
        XFlush(m_display);
    }

    // Close the connection with the X server
    CloseDisplay(m_display);
}
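To rule out GLX or the driver, one thing that can be tried is the same deactivate → destroy → close-display sequence in a standalone loop, outside SFML entirely (a sketch, not SFML's code; it only mimics the teardown order above, compile with -lX11 -lGL and watch VmRSS between iterations):

    #include <X11/Xlib.h>
    #include <GL/glx.h>
    #include <unistd.h>

    int main()
    {
        for (;;)
        {
            Display* display = XOpenDisplay(nullptr);
            if (!display)
                return 1;

            int attribs[] = { GLX_RGBA, GLX_DEPTH_SIZE, 24, GLX_DOUBLEBUFFER, None };
            XVisualInfo* visual = glXChooseVisual(display, DefaultScreen(display), attribs);
            if (!visual)
                return 1;

            // Throwaway 1x1 window so the context can actually be made current
            XSetWindowAttributes wa;
            wa.colormap     = XCreateColormap(display, RootWindow(display, visual->screen),
                                              visual->visual, AllocNone);
            wa.border_pixel = 0;
            Window window = XCreateWindow(display, RootWindow(display, visual->screen),
                                          0, 0, 1, 1, 0, visual->depth, InputOutput,
                                          visual->visual, CWColormap | CWBorderPixel, &wa);

            GLXContext context = glXCreateContext(display, visual, nullptr, True);
            glXMakeCurrent(display, window, context);

            // Same order as GlxContext::~GlxContext()
            if (glXGetCurrentContext() == context)
                glXMakeCurrent(display, None, nullptr);
            glXDestroyContext(display, context);
            XDestroyWindow(display, window);
            XFlush(display);

            XFreeColormap(display, wa.colormap);
            XFree(visual);
            XCloseDisplay(display);

            sleep(1); // check VmRSS between iterations
        }
    }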
The teardown order itself seems alright. Next we have the base class's virtual destructor:
GlContext::~GlContext()
{
    // Deactivate the context before killing it, unless we're inside Cleanup()
    if (sharedContext)
        setActive(false);
}
An old foe? I showed earlier that setActive causes memory growth, and now memory grows even when the object is destroyed. Does it have something to do with this? What about the functions in the GlxContext destructor? Is XFlush skipped when the window is not owned? Am I even on the right track here?
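For reference, the kind of loop I used to demonstrate the setActive growth looks roughly like this (a sketch; whether and how fast the growth shows up may depend on the driver):

    #include <SFML/Window/Context.hpp>

    int main()
    {
        sf::Context context; // creates and activates a GL context of its own

        for (;;)
        {
            // Toggling activation in a loop was enough to make RSS creep upward
            // on my setup; pair this with the printRss() helper from above.
            context.setActive(false);
            context.setActive(true);
        }
    }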
Anyway, I removed the contents of ~GlContext(); however, when rebuilding my testing scenario I now get a compile error pointing at WindowStyle.hpp, line 40. What do I do?