SFML community forums
Help => General => Topic started by: eigenbom on April 23, 2014, 03:45:54 am
-
Does anyone know how to programmatically force Windows to select the better GPU in a dual-graphics system? (e.g., a laptop with an NVIDIA chipset and Intel integrated graphics)
My SFML-based game seems to use the integrated graphics unless I explicitly right-click the executable and choose "run with nvidia graphics". Running with the good graphics improves the frame rate dramatically, so I'd like to know if there's a way to make this the default for anyone running the game. (Also, ideally I don't want to bundle an installer with the game.)
-
Usually the user sets the default GPU to run stuff on, not the programmer.
I haven't used Windows in a while, so I'm not 100% sure, but I believe you do it through the NVIDIA settings.
-
Thanks for the reply.
Yeah, I've manually set my own system to use the fast GPU for my game, and most games that I run use it by default -- though typically the indie games I run don't. I suspect there's some program you run from an installer to tell the NVIDIA card your game needs it, but I was hoping there was an environment variable or some other method that avoids an installer (at least for the beta versions).
-
You usually get a system-wide setting somewhere, not just a per-application one.
However, if there is such an option, it won't be cross-platform and SFML doesn't provide it, so you might want to ask somewhere else (e.g. Stack Overflow).
-
No worries, thx. I'm surprised no-one has encountered this before -- a forum search shows nothing. Even running the SFML demos will use the integrated gfx, so by default will run a lot slower for most people -- but I suppose they aren't really stress tests so maybe nobody notices? In any case, I'll research this further on other sites.
-
> No worries, thx. I'm surprised no-one has encountered this before -- a forum search shows nothing. Even running the SFML demos will use the integrated gfx, so by default will run a lot slower for most people -- but I suppose they aren't really stress tests so maybe nobody notices? In any case, I'll research this further on other sites.
lol not stress testing? ??? You should see how many sprites are on my screen at any given time. :)
-
>> No worries, thx. I'm surprised no-one has encountered this before -- a forum search shows nothing. Even running the SFML demos will use the integrated gfx, so by default will run a lot slower for most people -- but I suppose they aren't really stress tests so maybe nobody notices? In any case, I'll research this further on other sites.
> lol not stress testing? ??? You should see how many sprites are on my screen at any given time. :)
I meant that the SFML demos aren't stress tests. I realise many people are doing intensive things with SFML, which makes it strange that this issue doesn't seem to have come up before.
-
> No worries, thx. I'm surprised no-one has encountered this before -- a forum search shows nothing. Even running the SFML demos will use the integrated gfx, so by default will run a lot slower for most people -- but I suppose they aren't really stress tests so maybe nobody notices?
Or not everyone has two GPUs and those who have might not all have that issue. ;)
On how many different setups did you test?
-
>> No worries, thx. I'm surprised no-one has encountered this before -- a forum search shows nothing. Even running the SFML demos will use the integrated gfx, so by default will run a lot slower for most people -- but I suppose they aren't really stress tests so maybe nobody notices?
> Or not everyone has two GPUs and those who have might not all have that issue. ;)
> On how many different setups did you test?
Many laptops are dual-graphics these days. I think MacBook Pros switch automatically whenever an application uses OpenGL, but unfortunately Windows doesn't seem to work the same way. NVIDIA switches whenever it detects a DirectX or CUDA call, but not an OpenGL call; however, there's an NVIDIA library you can use to create an application profile. In case anyone stumbles across this problem, I'll leave these links here:
http://docs.nvidia.com/gameworks/content/gameworkslibrary/coresdk/nvapi/group__drsapi.html
http://www.opentk.com/node/3144
-
> I think MacBook Pros switch automatically whenever an application uses OpenGL
Yes, on Mac this is automatically handled by the OS.