Seriously... don't let your game run at 15000 FPS
I love performance, and I always want to see how my changes affect it, so I usually don't cap the fps.
However, there are problems even at lower frame rates.
For example, let's say I use the delta value for calculating my physics.
At 100 fps, the delta is 10 ms.
At 91 fps, the delta is 11 ms.
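To make that concrete, here is a quick C sketch (not engine code, and I'm only guessing at the rounding method, so it shows both rounding and truncation) comparing the exact frame time with the integer ms value:

```c
/* A minimal sketch (not engine code): compare the exact frame time at a few
 * frame rates with what you get after rounding or truncating to whole ms.
 * The rounding method is an assumption, so both variants are shown. */
#include <stdio.h>

int main(void) {
    const double fps_values[] = {91, 97, 100, 103, 104, 105, 106};
    const int count = sizeof fps_values / sizeof fps_values[0];

    for (int i = 0; i < count; ++i) {
        double exact_ms     = 1000.0 / fps_values[i];  /* true frame time */
        int    rounded_ms   = (int)(exact_ms + 0.5);   /* round to nearest ms */
        int    truncated_ms = (int)exact_ms;           /* decimals simply cut off */
        printf("%4.0f fps: exact %.2f ms, rounded %d ms, truncated %d ms\n",
               fps_values[i], exact_ms, rounded_ms, truncated_ms);
    }
    return 0;
}
```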
Now say it's a multiplayer game, and two computers are connected: one running at 103 fps and one at 97.
I don't know how the value is rounded, but if it's rounded to the nearest whole number, both computers will run their physics with a delta of 10 ms.
The faster computer (103 fps) will run the calculations about 6% more often, giving different results.
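Here is a rough sketch of that two-computer case, assuming both machines end up with the same rounded 10 ms delta (the speed value is made up purely for illustration):

```c
/* A rough sketch of the two-computer case. Both machines see the same rounded
 * delta of 10 ms, but the faster one steps the physics more often, so their
 * positions drift apart. The speed value is made up for illustration. */
#include <stdio.h>

int main(void) {
    const double speed = 1.0;         /* units per millisecond of delta */
    const int rounded_delta_ms = 10;  /* both 103 fps and 97 fps round to 10 ms */

    /* After one real second, each machine has done one physics step per frame. */
    double pos_fast = 103 * speed * rounded_delta_ms;  /* 1030 units */
    double pos_slow =  97 * speed * rounded_delta_ms;  /*  970 units */

    printf("fast: %.0f units, slow: %.0f units (%.1f%% apart)\n",
           pos_fast, pos_slow, 100.0 * (pos_fast - pos_slow) / pos_slow);
    return 0;
}
```

After just one second the positions are already about 6% apart, and that gap keeps growing every second.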
This is a huge problem for me, and I can't see a good reason to make the frametime less accurate.
Also, a change of just one fps (say from 104 to 105) can be enough to shift the integer delta by a whole millisecond, which changes the speed of the calculations by roughly 10%.
That assumes it rounds to the nearest whole number, but the same problem applies if the decimals are simply cut off; only the fps where the jump happens moves.
In any case, the value is way too inaccurate at the moment.
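Here is a sketch of the effective game speed you get from an integer-ms delta, again just assuming round-to-nearest; the exact fps where the jump happens depends on how the engine actually rounds:

```c
/* A sketch of the effective game speed you get from an integer-ms delta,
 * assuming round-to-nearest. 1.000 means the simulation matches real time;
 * the exact fps where the jump happens depends on how the engine rounds. */
#include <stdio.h>

int main(void) {
    for (int fps = 95; fps <= 106; ++fps) {
        double exact_ms   = 1000.0 / fps;
        int    rounded_ms = (int)(exact_ms + 0.5);
        /* simulated ms advanced per real second, divided by 1000 */
        double game_speed = fps * rounded_ms / 1000.0;
        printf("%3d fps -> delta %2d ms -> game speed %.3f\n",
               fps, rounded_ms, game_speed);
    }
    return 0;
}
```

Around the rounding boundary, a single extra fps flips the delta by a full millisecond and the game speed jumps by about 10%.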
[edit]
I don't know if I explained it well enough, so I made a quick picture:
The blue line shows how my physics behave at different frame rates with the new integer ms value.