Well, I love my framerates even more than graphics, so I've always made sure I'm getting at least 60fps on my 60Hz monitors. When I get a new game, I record the framerates and adjust the graphics settings until I see a minimum of around 80fps, which gives me a buffer so I never fall below 60fps. That means I'm averaging well over 100fps in most games, and because of it I get constant ~16.7ms frame times with a max variance of about ±0.2ms. It also means my heavily OCed Strix 1080, running about 10% faster than stock, can only run Fortnite at medium settings, but I don't mind.
If I get a 144Hz monitor, then I'll want 144fps, so I guess I'd be looking at a minimum framerate of around 170fps and constant frame times of about 7ms. I might even get lower variance than the ±0.2ms I see now.
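Just to make the arithmetic explicit (and sanity-check my own numbers), a frame time in milliseconds is just 1000 divided by the framerate; the 80fps and 170fps floors are only the buffers I picked, nothing standard:

```python
# frame_time_ms = 1000 / fps; the fps floors here are my own targets.
for hz, fps_floor in [(60, 80), (144, 170)]:
    refresh_ms = 1000 / hz        # time between refreshes at this rate
    budget_ms = 1000 / fps_floor  # worst-case render time at my fps floor
    print(f"{hz} Hz: {refresh_ms:.1f} ms per refresh, "
          f"{fps_floor} fps floor leaves {budget_ms:.1f} ms to render a frame")
```

So at 60Hz every frame has to be done inside 16.7ms and my 80fps floor means even the slowest frame takes 12.5ms; at 144Hz the window shrinks to 6.9ms and a 170fps floor means 5.9ms.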
I thought the main purpose of FreeSync and G-Sync was to reduce frame time variance and avoid things like stutter?
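Here's a toy sketch of my mental model of the stutter part, in case it helps (all the render times are made up, and I'm assuming the simple double-buffered V-Sync case; happy to be corrected):

```python
# Toy model, not a benchmark: one render hiccup (20 ms in a stream of
# 14 ms frames) under classic V-Sync on a 60 Hz panel vs. a VRR panel
# that can refresh whenever the frame is ready.
import math

TICK_MS = 1000 / 60  # fixed 60 Hz refresh interval, ~16.7 ms

render_ms = [14, 14, 14, 20, 14, 14]  # hypothetical GPU render times

def presents_vsync(render_ms, tick_ms):
    # Double-buffered V-Sync: the next render starts at the previous
    # flip, and a finished frame waits for the next refresh tick.
    t_present, out = 0.0, []
    for r in render_ms:
        finish = t_present + r
        t_present = math.ceil(finish / tick_ms) * tick_ms
        out.append(t_present)
    return out

def presents_vrr(render_ms):
    # VRR: the panel refreshes as soon as the frame is done
    # (assuming the rate stays inside the monitor's VRR window).
    t, out = 0.0, []
    for r in render_ms:
        t += r
        out.append(t)
    return out

def intervals(ts):
    return [round(b - a, 1) for a, b in zip(ts, ts[1:])]

print("v-sync:", intervals(presents_vsync(render_ms, TICK_MS)))
# v-sync: [16.7, 16.7, 33.3, 16.7, 16.7]  <- the hiccup doubles to 33 ms
print("vrr:   ", intervals(presents_vrr(render_ms)))
# vrr:    [14.0, 14.0, 20.0, 14.0, 14.0]  <- the hiccup stays a 6 ms blip
```

If that model is right, a fixed refresh rounds one slow frame up to a whole extra refresh, while VRR just shows it a few milliseconds late, which is exactly why I'm confused about my case, where the variance is tiny to begin with.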
At ~16.7ms frame times I can just tell that I'm not watching true video, which does bother me, but I imagine that more than halving the frame times will go a long way toward convincing my brain that what I'm looking at is true video. So why do I need something to fix a tiny 0.2ms variance that I can't even notice, or am I missing something?