Hey guys,
I've always used sf::Clock to get the delta time between frames to allow for framerate-independent movement. On a new project of mine, however, I've encountered some odd behaviour:
sf::Clock frameClock;
double timeLast = 0.1;
while (window.isOpen())
{
    // This code works correctly.
    const double timeNow = frameClock.getElapsedTime().asMilliseconds() / 16.66666667;
    const double delta = timeNow - timeLast;
    timeLast = timeNow;
    // ...
}
sf::Clock frameClock;
while (window.isOpen())
{
    // This code works incorrectly; "delta" is all over the place and my sprites
    // move at visibly different speeds at different times.
    const double delta = frameClock.restart().asMilliseconds() / 16.66666667;
    // ...
}
This is very strange, considering I've used the latter method in other projects where it worked correctly. The only difference is that in those projects, frameClock is a member of a class.
I looked at the source of sf::Clock on git, and it appears to do exactly what the former method does. I tried to debug with VS2012, but I don't have PDB files for SFML and I'm not sure how to acquire them, although I presume one has to compile from source.
For my entire main.cpp, see http://pastebin.com/xK46pjyC.
First, you should use sf::Time objects instead of doubles. Since SFML has a dedicated class to measure time, this not only makes your code easier to read (fewer conversions), but also more type-safe.
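For example, a minimal sketch (update() here is just a placeholder for whatever consumes the frame time):

sf::Clock frameClock;
while (window.isOpen())
{
    // Keep the frame time as sf::Time; convert to seconds only at the point of use.
    const sf::Time delta = frameClock.restart();
    update(delta);
}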
Second, if you want to scale time, do it directly before passing it to the game logic. And use float; you don't need the extra precision anyway.
sf::Time delta = clock.restart();
gameLogic(scale * delta.asSeconds());
But in order to account for a special metric in your world, I would recommend scaling the velocity of your game objects rather than the time. Time passes at the same rate, no matter how big or small your world units are.
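For example, a minimal sketch (worldScale, velocity and sprite are placeholder names, not part of your code):

const float worldScale = 64.f;                      // hypothetical: pixels per world unit
const sf::Vector2f velocity(2.f * worldScale, 0.f); // sprite moves 2 world units per second

sf::Time delta = clock.restart();
sprite.move(velocity * delta.asSeconds()); // the time step itself stays unscaled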