
Author Topic: Weird Vsync CPU consumption?  (Read 2905 times)


sknnywhiteman

  • Newbie
  • Posts: 38
Weird Vsync CPU consumption?
« on: July 17, 2012, 04:01:32 am »
So, when I'm running my game with Vsync enabled, rendering uses all of one core even though I'm sleeping for 16 ms at a time. I get a solid 60 FPS and everything is smooth, but when I turn Vsync off it uses little to no CPU, and I found the reason why after searching the forums a little:
http://en.sfml-dev.org/forums/index.php?topic=7732.msg51443
Basically, if you don't want to read it: the creator of that thread found that the fix is to open the nVidia Control Panel and turn threaded optimization off. I tried it, and it works perfectly. They also figured out that the problem doesn't occur on AMD cards.
My question is (when I went into the control panel, the setting was on "Auto"): is there some way we can state that we want it turned off for nVidia users?
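For reference, a minimal SFML 2 sketch of the kind of loop in question (the window size, title and drawing are placeholders): vsync is enabled on the window, and the alternative, setFramerateLimit(), which makes SFML itself sleep between frames instead of relying on the driver, is shown commented out.

#include <SFML/Graphics.hpp>

int main() {
    sf::RenderWindow window( sf::VideoMode( 800, 600 ), "Vsync test" );

    // Let the driver sync to the monitor refresh...
    window.setVerticalSyncEnabled( true );
    // ...or let SFML sleep between frames instead (don't enable both at once):
    // window.setFramerateLimit( 60 );

    while( window.isOpen() ) {
        sf::Event event;
        while( window.pollEvent( event ) ) {
            if( event.type == sf::Event::Closed )
                window.close();
        }

        window.clear();
        // ... draw the game here ...
        window.display();  // with vsync on, any driver-side waiting happens in here
    }

    return 0;
}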

binary1248

  • SFML Team
  • Hero Member
  • Posts: 1405
  • I am awesome.
Re: Weird Vsync CPU consumption?
« Reply #1 on: July 17, 2012, 05:35:09 am »
After a bit of reading and a lot of good will, I've come to the conclusion that the majority of nVidia users report they are better off disabling this "Threaded Optimization" completely. Why this is called an optimization, nobody knows. Allegedly it is supposed to magically make older single-threaded games run faster on multi-core CPUs, but it comes at the cost of breaking just about everything else, including SFML, as you have experienced. I wouldn't mess around with detecting whether an nVidia GPU is present and setting things up differently; it just becomes a headache later on when you are testing code that relies on this split behavior. As I always like to say: keep it simple, fast and portable.

In case you are really, really keen on finding out whether the user has an nVidia GPU and are willing to break everything there is to break:
#include <SFML/Graphics.hpp>
#include <SFML/OpenGL.hpp>
#include <cstring>
#include <iostream>

int main() {
    // An sf::Context is needed so that an OpenGL context is active before calling glGetString().
    sf::Context context;

    // GL_VENDOR names the driver vendor, e.g. "ATI Technologies Inc." or "NVIDIA Corporation".
    // glGetString() can return a null pointer if no context is active, so check before using it.
    const char* vendor_string = reinterpret_cast<const char*>( glGetString( GL_VENDOR ) );

    if( vendor_string && std::strstr( vendor_string, "ATI" ) ) {
        std::cout << "You are using an ATI graphics card!\n";
    }

    return 0;
}
Just change the "ATI" to "NVIDIA" (the comparison is case-sensitive) or whatever works for you. I have an AMD card, so I couldn't test it myself. As always with these kinds of things, YMMV.
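If you did go down that route, one possible way to use the check (just a sketch, not something from this thread; the "NVIDIA" substring and the 60 FPS cap are assumptions) would be to fall back to setFramerateLimit() instead of vsync when an nVidia driver is detected:

#include <SFML/Graphics.hpp>
#include <SFML/OpenGL.hpp>
#include <cstring>

int main() {
    sf::RenderWindow window( sf::VideoMode( 800, 600 ), "Vendor check" );

    // The window already owns an active OpenGL context, so glGetString() is usable here.
    const char* vendor = reinterpret_cast<const char*>( glGetString( GL_VENDOR ) );
    bool is_nvidia = vendor && std::strstr( vendor, "NVIDIA" );

    if( is_nvidia ) {
        // Let SFML sleep between frames instead of relying on the driver's vsync wait.
        window.setFramerateLimit( 60 );
    } else {
        window.setVerticalSyncEnabled( true );
    }

    while( window.isOpen() ) {
        sf::Event event;
        while( window.pollEvent( event ) )
            if( event.type == sf::Event::Closed )
                window.close();

        window.clear();
        window.display();
    }

    return 0;
}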
« Last Edit: July 17, 2012, 05:37:45 am by binary1248 »
SFGUI # SFNUL # GLS # Wyrm <- Why do I waste my time on such a useless project? Because I am awesome (first meaning).

Jove

  • Full Member
  • Posts: 114
Re: Weird Vsync CPU consumption?
« Reply #2 on: July 17, 2012, 10:07:30 am »
Quote
I found the reason why after searching the forums a little:
http://en.sfml-dev.org/forums/index.php?topic=7732.msg51443

I now have an ATI card and it still does this, but only when using the Windows 7 Basic theme and running in windowed mode.

I don't think the reported CPU usage is correct anyway; I remember pushing the number of objects to silly proportions and the frame rate stayed at a solid 60 the whole way with no slowdown.

It's a non-problem, really. Perhaps annoying if you're on the lookout for real CPU spikes in your own code, but like I said, you can change the desktop theme for that.
{much better code}

sknnywhiteman

  • Newbie
  • Posts: 38
Re: Weird Vsync CPU consumption?
« Reply #3 on: July 17, 2012, 03:02:14 pm »
I'm not too worried about it, but it's definitely an inconvenience, because I have no clue how much CPU my loop takes when the rendering is taking all of it. :(
Well, I'll just ignore it, I guess. Thanks for the help! :D
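If it helps, one way to see how much time your own loop takes regardless of what the driver does during the vsync wait (just a sketch; the window setup and the per-frame printout are placeholders) is to time everything up to display() with an sf::Clock:

#include <SFML/Graphics.hpp>
#include <SFML/System.hpp>
#include <iostream>

int main() {
    sf::RenderWindow window( sf::VideoMode( 800, 600 ), "Timing test" );
    window.setVerticalSyncEnabled( true );

    sf::Clock clock;

    while( window.isOpen() ) {
        clock.restart();

        sf::Event event;
        while( window.pollEvent( event ) )
            if( event.type == sf::Event::Closed )
                window.close();

        window.clear();
        // ... update the game and draw it here ...

        // Everything measured so far is "your" work for this frame.
        sf::Time busy = clock.getElapsedTime();

        window.display();  // the vsync wait (and any driver spinning) happens in here

        // Printing every frame floods the console; in practice you'd average it over a second or so.
        std::cout << "frame work: " << busy.asMicroseconds() << " us\n";
    }

    return 0;
}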

 
