The problem is that you're comparing relative values, not absolute ones.
The only relevant information is the number of milliseconds an operation takes to execute.
Let's say a small operation (an event, or drawing one sprite) takes 1 ms. At 1000 FPS your frame already takes 1 ms, so that operation doubles the frame time and cuts you down to 500 FPS, a huge relative difference. At 50 FPS your frame takes 20 ms, so the same 1 ms barely registers. The percentage isn't the same, even though it's the exact same operation taking the exact same amount of time.
You can't say that adding one sprite will always lower your FPS by 10%. But you can say that it will always eat 1 ms from your game loop.
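Just to make the arithmetic concrete, here's a small standalone snippet (the baseline frame times are made-up example values, not measurements from any real game) showing how the same fixed 1 ms cost translates into very different FPS drops:

```cpp
#include <iostream>

int main()
{
    const float extraMs = 1.f; // cost of the added operation, in milliseconds

    // Baselines of 1 ms (1000 FPS) and 20 ms (50 FPS)
    for (float baseMs : {1.f, 20.f})
    {
        float before = 1000.f / baseMs;
        float after  = 1000.f / (baseMs + extraMs);
        std::cout << before << " FPS -> " << after << " FPS\n";
    }
}
```

The output goes from 1000 to 500 FPS in the first case, but only from 50 to about 47.6 FPS in the second, which is the whole point about measuring in milliseconds rather than frames per second.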
As long as I know what the value is relative to, I also know how many ms the added operation takes.
In any case, let's drop that discussion.
I'm curious, though: how did SFML compute the float value in the old implementation?
I guess I could write my own class that mimics it, since I still want to show an FPS counter in my game (with higher precision than the 59 fps, 63 fps, 67 fps steps that integer milliseconds give).
[edit] Nevermind, I'll check the source for the old version.
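For what it's worth, a minimal sketch of such a class could look like this, assuming SFML 2.x where sf::Clock::restart() returns an sf::Time and asSeconds() gives a float; the class name and structure here are just made up for illustration:

```cpp
#include <SFML/System.hpp>

// Hypothetical float-based FPS counter using sf::Clock directly,
// avoiding the integer-millisecond rounding.
class FpsCounter
{
public:
    // Call once per frame; returns the instantaneous FPS as a float.
    float update()
    {
        float dt = m_clock.restart().asSeconds();
        return (dt > 0.f) ? 1.f / dt : 0.f;
    }

private:
    sf::Clock m_clock;
};
```

Averaging the result over several frames would give a steadier readout than the raw per-frame value.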