So, for some reason, RenderTextures are incredibly slow to create in our C# project. It's so slow that the delay from creating a single one caused dropped frames, which is not nice. Resizing the window (and recreating all 11 render textures that needed updating) basically hung for a good 5 seconds straight.
So I compiled CSFML in debug mode to get some PDB files, attached Visual Studio's profiler, aaaand...
See, if this was a more complicated call stack I would've just admitted defeat or asked for help. Except it's literally one function causing all the horrible CPU usage. I was immediately suspicious that the results of that call should be cacheable, so I went and removed the selectBestPixelFormat call from createSurface and made it use the same pixel format as the main context.
Resizing the window became about as instant as Windows supports and there were no issues with render textures anymore.
I pushed the change to my fork on GitHub; see it here:
https://github.com/PJB3005/SFML/commit/d960c4d670cd169e5f45d95731e5bbde1571ae9e. It's pretty hacky and I never intended for it to go anywhere beyond me going "wow, that worked" (and it genuinely did work). Problem is, I can totally see this breaking if something passes context settings that are different from what the shared context was created with.
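For anyone who doesn't want to dig through the commit, this is roughly the shape of the hack (a simplified sketch, not the actual diff; the function name and signature here are made up):

```cpp
// Rough sketch of the idea behind the hack: instead of running the expensive
// selectBestPixelFormat() for every new surface, ask Windows which pixel
// format the shared context's DC already uses and apply that one directly.
#include <windows.h>

void setSurfacePixelFormatLikeSharedContext(HDC deviceContext, HDC sharedDeviceContext)
{
    // The shared context already went through the expensive selection once.
    int format = GetPixelFormat(sharedDeviceContext);

    PIXELFORMATDESCRIPTOR descriptor;
    descriptor.nSize    = sizeof(descriptor);
    descriptor.nVersion = 1;
    DescribePixelFormat(sharedDeviceContext, format, sizeof(descriptor), &descriptor);

    // This is exactly the part that breaks if someone requests ContextSettings
    // different from what the shared context was created with.
    SetPixelFormat(deviceContext, format, &descriptor);
}
```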
I tried to recreate the bottleneck in a test project and it's a lot less noticeable there, although it's definitely still present. The test project is just a basic window doing a hue shift over time using clear(), plus 10 render textures that get recreated whenever the window is resized. Resizing without my change is still fast in that project, but it's measurably slower if you compare the two.
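For reference, the test project boils down to something like this (a C++ sketch of the same thing rather than the actual C# project; the hue conversion is just a quick approximation):

```cpp
#include <SFML/Graphics.hpp>
#include <array>
#include <cmath>

// Quick-and-dirty hue (0-360) to RGB conversion, just enough for a visible color cycle.
sf::Color hueToColor(float h)
{
    float x = 1.f - std::fabs(std::fmod(h / 60.f, 2.f) - 1.f);
    float r = 0.f, g = 0.f, b = 0.f;
    if      (h <  60.f) { r = 1.f; g = x;   }
    else if (h < 120.f) { r = x;   g = 1.f; }
    else if (h < 180.f) { g = 1.f; b = x;   }
    else if (h < 240.f) { g = x;   b = 1.f; }
    else if (h < 300.f) { r = x;   b = 1.f; }
    else                { r = 1.f; b = x;   }
    return sf::Color(static_cast<sf::Uint8>(r * 255),
                     static_cast<sf::Uint8>(g * 255),
                     static_cast<sf::Uint8>(b * 255));
}

int main()
{
    sf::RenderWindow window(sf::VideoMode(800, 600), "RenderTexture resize test");

    std::array<sf::RenderTexture, 10> renderTextures;
    for (auto& rt : renderTextures)
        rt.create(800, 600);

    sf::Clock clock;
    while (window.isOpen())
    {
        sf::Event event;
        while (window.pollEvent(event))
        {
            if (event.type == sf::Event::Closed)
                window.close();
            else if (event.type == sf::Event::Resized)
            {
                // Recreating every render texture at the new size is what hits
                // the pixel format selection over and over.
                for (auto& rt : renderTextures)
                    rt.create(event.size.width, event.size.height);
            }
        }

        // Hue shift over time using clear().
        window.clear(hueToColor(std::fmod(clock.getElapsedTime().asSeconds() * 60.f, 360.f)));
        window.display();
    }
}
```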
I don't know what we did to cause the OpenGL driver to become this much slower at choosing pixel formats, but it's definitely the NVidia driver (GTX 970 in case that's relevant) and not SFML's fault.
So now I'm wondering whether there's something I'm doing wrong, or whether it'd be a good idea for SFML to implement some proper caching of pixel formats to prevent this bottleneck. I can try implementing it properly myself, though I don't know C++ well (that hack was the first C++ code I've ever written outside of UE4).
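To be clear about what I mean by caching, something along these lines (just a sketch of the idea, not SFML code; the key struct and the slow selection function are made-up stand-ins, and it assumes all windows are on the same display device so the cached format index stays valid):

```cpp
#include <windows.h>
#include <map>
#include <tuple>

// Stand-in for the sf::ContextSettings fields that influence format selection.
struct FormatKey
{
    unsigned int bitsPerPixel;
    unsigned int depthBits;
    unsigned int stencilBits;
    unsigned int antialiasingLevel;
    bool sRgb;

    bool operator<(const FormatKey& other) const
    {
        return std::tie(bitsPerPixel, depthBits, stencilBits, antialiasingLevel, sRgb)
             < std::tie(other.bitsPerPixel, other.depthBits, other.stencilBits,
                        other.antialiasingLevel, other.sRgb);
    }
};

// Stand-in for the expensive driver query. Here it just defers to
// ChoosePixelFormat so the sketch compiles; the real selection is more involved.
int selectBestPixelFormatSlow(HDC deviceContext, const FormatKey& key)
{
    PIXELFORMATDESCRIPTOR descriptor = {};
    descriptor.nSize        = sizeof(descriptor);
    descriptor.nVersion     = 1;
    descriptor.dwFlags      = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    descriptor.iPixelType   = PFD_TYPE_RGBA;
    descriptor.cColorBits   = static_cast<BYTE>(key.bitsPerPixel);
    descriptor.cDepthBits   = static_cast<BYTE>(key.depthBits);
    descriptor.cStencilBits = static_cast<BYTE>(key.stencilBits);
    return ChoosePixelFormat(deviceContext, &descriptor);
}

int selectBestPixelFormatCached(HDC deviceContext, const FormatKey& key)
{
    // Only ever ask the driver once per distinct combination of settings.
    static std::map<FormatKey, int> cache;

    auto it = cache.find(key);
    if (it != cache.end())
        return it->second;

    int format = selectBestPixelFormatSlow(deviceContext, key);
    cache[key] = format;
    return format;
}
```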