SFML community forums
Help => Graphics => Topic started by: AIR on September 30, 2016, 03:59:21 am
-
I'm using SFML to develop a program for an experiment, which requires extremely precise timing.
I usually write these experimental programs using a graphics toolbox in Matlab. Using this graphics toolbox, the program loops once for every monitor refresh. For example, on a 60Hz monitor, each loop runs for precisely 16.7ms.
Unfortunately, when using SFML, the timing between iterations of the game loop is much more variable; some iterations can take as long as 40ms, even when I'm not presenting anything but a blank screen. When this happens, the game loop tends to compensate by making the next iteration considerably faster. In my case, however, that compensation is more of a hindrance than a help: the fast iteration finishes before the next monitor refresh, which essentially means that whatever I was going to display during that frame is missed completely.
I've used "window.setVerticalSyncEnabled(true)" in order to sync the game loop to the monitor refresh rate. I've also found that using fullscreen mode helps a little. Yet the problem persists despite these measures.
Is there anything else I can do to reduce variability in the duration of my gameloop iterations?
One suggestion I've seen is to split updating and drawing into separate threads; would this result in a more consistent framerate?
Someone has also suggested to me that the inconsistency is actually in the clock rather than the flip time, i.e. that the timing is being inconsistently reported. I was planning to use the clock to correct for missed frames by basing the display on the elapsed time between flips rather than on the number of frames, but this won't work if the reported clock time is inaccurate... Does anyone know how accurate the clock timing is?
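For what it's worth, SFML's sf::Clock is (as far as I know) a thin wrapper over the OS high-resolution timer, e.g. QueryPerformanceCounter on Windows, so its resolution is typically well below a millisecond. A quick, non-authoritative way to probe what your machine actually delivers is to spin on a monotonic clock until the reported time changes; here std::chrono::steady_clock stands in for sf::Clock:

```cpp
#include <chrono>
#include <cstdio>

// Probe the smallest observable tick of the monotonic clock by
// spinning until the reported timestamp actually changes.
long long measure_tick_ns() {
    using clock = std::chrono::steady_clock;
    const auto t0 = clock::now();
    auto t1 = clock::now();
    while (t1 == t0) t1 = clock::now();   // busy-wait for the next tick
    return std::chrono::duration_cast<std::chrono::nanoseconds>(t1 - t0).count();
}
```

On most modern desktops this reports well under a microsecond, in which case clock precision is unlikely to be the cause of 40ms spikes.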
Thanks for the help!
-
I don't have a direct answer for your questions but can you explain your experiment a little more? Sometimes additional details may help people find alternative solutions to your problems.
Does the logic in your program really need to rely on the speed of your game loop? Could you instead use something like a fixed timestep for the program's logic? Or do your loops genuinely need to run at a regular cadence for some other reason to do with your experiment?
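To illustrate the fixed-timestep idea: the loop accumulates real frame time and advances the logic in constant dt chunks, so the simulation clock stays exact even when individual frames stutter. This is only a minimal sketch; the frame durations are fed in as plain numbers rather than measured, and the update/render calls are left as comments:

```cpp
#include <vector>

// Fixed-timestep sketch: however irregular the frame times, logic
// advances in constant dt increments. Returns how many logic updates
// a given sequence of frame durations (in seconds) produces.
int run_fixed_timestep(const std::vector<double>& frame_times, double dt) {
    double accumulator = 0.0;
    int updates = 0;
    for (double frame : frame_times) {
        accumulator += frame;            // real time that has passed
        while (accumulator >= dt) {      // consume it in fixed chunks
            accumulator -= dt;
            ++updates;                   // update(dt) would go here
        }
        // render() would go here, optionally interpolating by accumulator/dt
    }
    return updates;
}
```

Note how a single 40ms frame simply produces two or three updates in a row, rather than corrupting the simulation's notion of time.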
It sounds like your Matlab applications had V-sync enabled. In SFML, when using V-sync, make sure you aren't also using setFramerateLimit(). What kind of precision do you need from the clocks? I think SFML's clocks use the most precise timer the operating system can provide, but of course this still won't be perfect.
You mention that you are worried about iterations of your game loop finishing before the next monitor refresh because you may miss out on displaying some things. Can you explain why you need to be so precise? For example, when playing a game, a user wouldn't notice if something was displayed one frame too late.
-
To give you a little more detail: I work in cognitive neuroscience with a technique called frequency tagging. We oscillate the luminance of our stimuli at specific frequencies, which induces a brain response to those stimuli at the same frequency. Any variability in timing adds noise to the flicker frequency, which in turn adds even more noise to the neural response.
I could use a fixed timestep to update the luminance at longer intervals, but ultimately the results will be better with a more clearly defined sinusoidal signal. I use a 144Hz monitor, so ideally I would update the stimulus luminance all 144 times per second, with the same magnitude of change at each step.
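As a sketch of what that ideal looks like, a frame-indexed flicker derives its phase from the frame count, giving identical per-frame phase steps, but only for as long as no frame is ever missed. The 12 Hz tagging frequency used here is just an illustrative value:

```cpp
#include <cmath>

// Frame-indexed flicker: luminance for frame n on a refresh_rate-Hz
// monitor, tagging frequency f. Stays on-frequency only if no frames
// are dropped, since phase is tied to the frame count, not wall time.
double luminance_for_frame(int n, double f, double refresh_rate) {
    const double pi = 3.14159265358979323846;
    const double phase = 2.0 * pi * f * n / refresh_rate;
    return 0.5 * (1.0 + std::sin(phase));   // normalized to 0..1
}
```

At 12 Hz on a 144Hz display, the phase advances by exactly 2π·12/144 = π/6 per frame, so every drawn frame contributes an evenly spaced sample of the sinusoid.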
I’ve been using V-sync without setFramerateLimit(), so I’m on top of that at least.
My current workaround is to update the luminance according to the elapsed duration since the last iteration of the game loop. This means that the change in luminance can be variable, but the flicker should at least approximate a sinusoid at the correct frequency. For this workaround to work, I would need the clocks to be accurate to at least the millisecond level. Do you know what sort of precision I could expect on a Windows 7 system?
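That workaround can be sketched as a pure function of elapsed time: the phase comes from the clock rather than the frame count, so a dropped frame distorts the sampled waveform but does not shift the underlying tagging frequency. In the real loop you would presumably pass in something like clock.getElapsedTime().asSeconds() from an sf::Clock started at stimulus onset; the 12 Hz frequency here is again just illustrative:

```cpp
#include <cmath>

// Time-based flicker: luminance as a function of elapsed seconds and
// tagging frequency f. A missed frame skips a sample but leaves the
// frequency of the underlying sinusoid intact.
double luminance_at(double elapsed_seconds, double f) {
    const double pi = 3.14159265358979323846;
    return 0.5 * (1.0 + std::sin(2.0 * pi * f * elapsed_seconds));
}
```

The trade-off versus the frame-indexed version is that sample spacing becomes irregular under jitter, but the spectral peak stays at f, which is what matters for frequency tagging.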
It’s been suggested to me that for really precise timing I should be using Linux or even MS-DOS (:/). Any thoughts on whether the change might be worth it?
-
Are you using an Nvidia GPU? If so, create a profile for your program (or modify the global setting) and set "Threaded Optimization" to "Off", then see whether that changes anything. This seems to be a common issue with their drivers and programs using legacy OpenGL (which SFML uses so far). Unfortunately there doesn't seem to be any easy fix other than switching to modern OpenGL (which is planned).
Also, I recently noticed some random (and still not pinpointed) stuttering that was somehow caused by the Windows joystick API and only disappeared after I had a gamepad connected for a moment.