From what I know vSync simply syncs the frame rate to the refresh rate of the screen (or something like that) so it's a good frame rate
First part: Yes. Second part: Maybe? I believe the primary goal of vsync is to keep the program's buffer swaps in sync with the monitor's refresh, which prevents screen tearing. That's not quite the same thing as having a "good" frame rate, though it does tend to work out that way.
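For reference, "turning vsync on" usually just means asking the swap chain to wait for the monitor's vertical blank before presenting. A minimal sketch, assuming a GLFW + OpenGL setup (the question doesn't say what API is in use; SDL, DXGI, Vulkan, etc. all have their own equivalent):

```cpp
// Minimal sketch, GLFW + OpenGL assumed: vsync is requested via the swap
// interval, i.e. how many vertical blanks to wait between buffer swaps.
// This is only a request -- driver or OS settings can still override it.
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    GLFWwindow* window = glfwCreateWindow(1280, 720, "vsync demo", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }
    glfwMakeContextCurrent(window);

    glfwSwapInterval(1);   // 1 = wait for one vblank per swap (vsync on)
    // glfwSwapInterval(0); // 0 = swap immediately (vsync off, tearing possible)

    while (!glfwWindowShouldClose(window)) {
        // ... render the frame ...
        glfwSwapBuffers(window); // with interval 1, this blocks until the next vblank
        glfwPollEvents();
    }
    glfwTerminate();
    return 0;
}
```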
Whether vsync "works" thus depends on timing properties of both the graphics card
and the monitor(s), in addition to all the usual stuff like your program and the OS and the user's perception. I would be shocked if there's a reliable, platform-independent way of judging this that works well enough that none of your users will ever feel the need to flip a vsync setting in the options menu themselves. As a random example: Chrome uses vsync by default, but I still see screen tearing in streamed online video every day. I have no idea why this is the case, but I'm sure it's far too complicated for me to understand.
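If you really want to try detecting it anyway, about the best I can imagine is a timing heuristic: time a bunch of swaps and compare the average against the monitor's reported refresh period. Rough sketch below, again assuming GLFW; this is exactly the kind of check that compositors, variable-refresh displays, and driver overrides will happily fool, so treat the result as a hint at best.

```cpp
// Rough heuristic sketch (GLFW assumed): time a handful of buffer swaps and
// compare the average frame interval to the monitor's reported refresh period.
// Compositors, variable refresh rates, and driver overrides can all fool this.
#include <GLFW/glfw3.h>
#include <cmath>

bool swapsLookVsynced(GLFWwindow* window, int samples = 120) {
    const GLFWvidmode* mode = glfwGetVideoMode(glfwGetPrimaryMonitor());
    if (!mode || mode->refreshRate <= 0) return false;   // no usable refresh info
    const double expected = 1.0 / mode->refreshRate;     // e.g. ~16.7 ms at 60 Hz

    double start = glfwGetTime();
    for (int i = 0; i < samples; ++i) {
        // Render nothing; we only care how long the swap itself waits.
        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    double average = (glfwGetTime() - start) / samples;

    // Within ~20% of the refresh period: swaps are probably throttled by vsync.
    return std::fabs(average - expected) < 0.2 * expected;
}
```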
Even if there were a way to detect this, vsync can have downsides (input lag, big performance drops if the frame timing lands just wrong, etc.) whose impact you simply can't measure for every setup, so you'd still want it to be an option users can disable if it doesn't work well for them. Triple buffering adds yet another layer of complications and trade-offs.
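Which is why I'd just store it as a user setting and re-apply it whenever it changes, rather than trying to decide on the user's behalf. Sketch only, GLFW again and made-up names:

```cpp
// Sketch of the "just make it an option" approach (GLFW assumed, names made up):
// keep the user's preference in a settings struct and re-apply it on change.
#include <GLFW/glfw3.h>

struct GraphicsSettings {
    bool vsync = true;   // default on, but the user gets the final say
};

// Call with a current OpenGL context whenever the option is toggled.
void applyGraphicsSettings(const GraphicsSettings& settings) {
    glfwSwapInterval(settings.vsync ? 1 : 0);
}
```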