Currently I call App.GetFrameTime() whenever I move a sprite around on the screen.
App is an sf::RenderWindow object.
For moving on the x axis, I have a float that I add to x, with its sign depending on the direction the sprite is supposed to go. It looks something like:
offset = App.GetFrameTime() * 0.25f;
And this works fine! It seems consistent across all of my test cases. However, for jumping I'm currently trying:
grav -= ((App.GetFrameTime() * 1.5f) + 0.5f) / factor;
Then, every iteration, I add grav to y if the character isn't touching the ground.
This isn't very consistent: occasionally the character jumps very high, since App.GetFrameTime() is constantly changing. Initially I was trying:
grav -= ((jMax * 3) + 0.5f) / factor;
//jMax = 1.3f;
That worked fine for me, but in other test cases the jump is very tiny.
I cap the FPS at around 400 frames per second. The game runs very smoothly for me, but for some of my friends it runs at around 100 FPS and the jump is very tiny. The x-axis movement with the offset variable defined above seems fine, though.
What could I do to possibly make it consistent for everyone? Thanks for any help!