I've especially heard people citing a very fast game like UT. I agree that UT is fricken fast, and 30fps will not cut it for smooth gameplay. But this is how I understand the rest: the video card produces a frame and all its elements are stored in the card's frame buffer, which is to say a non-static portion of memory capable of holding more than one frame at a time. The monitor pulls a new frame from the buffer on each refresh signal, so with a vertical refresh of 60Hz that's one frame roughly every 16.7ms, and the image is "painted" onto the screen in a raster scan.
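To make the maths concrete, here's a tiny toy loop (Python, with made-up numbers; not any real engine or driver) showing the point I'm getting at: however fast the card renders, only whatever happens to be sitting in the buffer at each refresh tick ever reaches the screen. The 60Hz and 200fps figures are just assumptions for illustration.

```python
import time

REFRESH_HZ = 60                     # assumed display refresh rate
FRAME_INTERVAL = 1.0 / REFRESH_HZ   # ~16.7 ms between refreshes


def render_frame(n):
    """Stand-in for the card producing frame n into the buffer."""
    return f"frame {n}"


def run(seconds=1.0, gpu_fps=200):
    buffer = None
    rendered = displayed = 0
    next_refresh = time.monotonic() + FRAME_INTERVAL
    deadline = time.monotonic() + seconds

    while time.monotonic() < deadline:
        # The card keeps rendering as fast as it can (~200 fps here),
        # overwriting the buffer each time.
        buffer = render_frame(rendered)
        rendered += 1
        time.sleep(1.0 / gpu_fps)

        # On each refresh tick the monitor scans out whatever is current.
        if time.monotonic() >= next_refresh:
            displayed += 1
            next_refresh += FRAME_INTERVAL

    print(f"rendered {rendered} frames, displayed {displayed} "
          f"(capped by the {REFRESH_HZ} Hz refresh)")


run()
```

Run it and you get roughly 200 frames rendered but only about 60 ever displayed, which is why I struggle to see where the extra frames would show up on a 60Hz screen.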
All this mumbo jumbo said, I can't see that there would be a noticeable difference like stated, unless you're using a CRT with refresh rates capable of exceeding 100Hz. Not that I'm saying you're wrong, just rambling what I know and think. I've actually spent some time searching for a definitive answer on this and come up far short. People claim they can see the difference between 100 and 200; no-one argues the technical side, only the limits of human perception, and I can't understand it.