I'm making a simple shoot 'em up using SFML, applying what I'm learning from the "SFML Essentials" book, and in this book I learned that movement should be dependent on time rather than framerate, to ensure a consistent experience across different hardware.
Now, I have a general understanding of how both the Clock and Time classes work, but when it comes to applying them to enemy movement, I'm not sure exactly how to do that, and the book simply didn't linger on this issue long enough for me to understand, focusing more on animating sprites rather than moving them...
There are two ways I thought of when it comes to approaching this, but I'm not sure of either of them:
1. I include all the move functions inside an if statement, so that the movement is only executed if the time elapsed between this frame and the one before it is above/below a certain threshold, but I don't know what the condition should be, or which unit of time to use:
sf::Time delta_time = clock.restart();
/*Which one of these three do I use?*/
float dt_elapsed = delta_time.asSeconds();
float dt_elapsed = delta_time.asMilliseconds();
float dt_elapsed = delta_time.asMicroseconds();
if (/*What would be the condition here?*/) {
    background.move(sf::Vector2f(0, 0.1 * dt_elapsed));
    enemy.move(sf::Vector2f(0, 0.15));
}
This does work fine if dt_elapsed stores time in milliseconds and the condition is something like if (dt_elapsed >= 200), but the higher that number is, the slower the movement becomes, and I don't know what the right value would be.
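Concretely, the "working" version I'm describing looks like this (assuming clock is the sf::Clock and background/enemy are the sprites declared earlier; 200 is just a threshold I picked, in milliseconds):

sf::Time delta_time = clock.restart();
float dt_elapsed = delta_time.asMilliseconds(); // elapsed frame time, in milliseconds
if (dt_elapsed >= 200) { // only move when at least 200 ms have passed since the last restart
    background.move(sf::Vector2f(0, 0.1 * dt_elapsed));
    enemy.move(sf::Vector2f(0, 0.15));
}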
2. I multiply the chosen movement speed by the elapsed frame time, which is actually what's suggested in the book:
sf::Time delta_time = clock.restart();
/*Which one of these three do I use?*/
float dt_elapsed = delta_time.asSeconds();
float dt_elapsed = delta_time.asMilliseconds();
float dt_elapsed = delta_time.asMicroseconds();
background.move(sf::Vector2f(0, 0.1 * dt_elapsed));
enemy.move(sf::Vector2f(0, 0.15 * dt_elapsed));
Again, same problem: which unit of time do I use? Depending on whether I use seconds, milliseconds or microseconds, the shape in question either barely moves or zooms across the screen (the book actually suggests I use seconds, but it doesn't dwell on this specific issue for as long as I was hoping).
I could multiply the speed value by the elapsed time divided by some value x if I'm using milli-/microseconds, or I could multiply the speed value itself by x, making it much bigger if I'm using seconds (we're talking values somewhere between 150 and 200 instead of 0.15 or 0.11, but is that well optimized?). In either case I still don't know what x should be, and I don't want to just increase/decrease it until I get something that looks functional.
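To make sure I understand the scaling, here's a sketch of what I mean, using the 150 / 0.15 figures from above: the unit only rescales the speed constant, so both calls below should move the enemy by (roughly) the same amount each frame, since 1 second is 1000 milliseconds:

sf::Time delta_time = clock.restart();
// 150 pixels per second, multiplied by the frame time in seconds...
enemy.move(sf::Vector2f(0, 150.f * delta_time.asSeconds()));
// ...should equal 0.15 pixels per millisecond multiplied by the frame time in
// milliseconds (aside from asMilliseconds() truncating to whole milliseconds):
// enemy.move(sf::Vector2f(0, 0.15f * delta_time.asMilliseconds()));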
TL;DR: How do I use the elapsed time between frames to obtain consistent movement speed across different hardware, while making sure it's well optimized?