Playing games at 75Hz sucks!

TonyBAMF said:
The only thing Vsync does is cap the fps your card can produce to the same number as the monitor's Hz.

Seems to kill my gameplay; it always makes it look as if the game is stuttering/lagging. But I'm on an ancient gfx card, so it might have issues with a newer monitor/drivers.
 
4W4K3 said:
Seems to kill my gameplay; it always makes it look as if the game is stuttering/lagging. But I'm on an ancient gfx card, so it might have issues with a newer monitor/drivers.

It can be a hardware thing.

I have only owned 2 GPUs with Vsync (ATI AIW 9700 Pro, ATI X800XL) and they work great.
 
kobaj said:
What? The hertz you run your monitor at isn't the fps. The fps is how many frames the game can give. For example, I run my monitor at 75Hz and the fps in Halo is 30. It isn't because of delay or anything, because in the game the fps can go up or down, and does a lot. Also, what's wrong with 75? That's what I run it at and it doesn't hurt my eyes at all... I have a CRT monitor, if that matters.

If you have a refresh rate of 80Hz, that means you have 80 cycles per second, and you can only send 1 frame per cycle, no more, no less. If the current game you are playing averages 40fps, the same frame is sent for two cycles in a row.
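A rough way to picture that mapping (a toy sketch with the 80Hz/40fps numbers from above, not how any real driver works):

```python
# Sketch: which frame the monitor shows on each refresh cycle,
# assuming a steady 40 fps feeding an 80 Hz monitor.
REFRESH_HZ = 80
FPS = 40

def frame_shown(cycle):
    """Frame index displayed on a given refresh cycle (0-based)."""
    # fps/Hz frames are produced per cycle; with 40/80 that is 0.5,
    # so every frame is repeated for two consecutive cycles.
    return int(cycle * FPS / REFRESH_HZ)

shown = [frame_shown(c) for c in range(6)]
print(shown)  # each frame appears twice: [0, 0, 1, 1, 2, 2]
```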
 
Vsync

Ok, say you have an average of 100fps but your refresh rate is 75Hz. This means you are producing more frames than can be sent, and this will give you a visual-glitch-like problem.

100fps/75Hz ≈ 1.33 frames per cycle

Cycle 1, frames 1.00-2.33
Cycle 2, frames 2.33-3.67
Cycle 3, frames 3.67-5.00

Frames
1 red
2 green
3 blue
4 orange
5 black

[attached image: vsync7ek.jpg]
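The 100fps/75Hz breakdown above can be computed directly (idealized, perfectly steady timing assumed):

```python
# Sketch: at 100 fps into a 75 Hz refresh, each cycle spans
# 100/75 ≈ 1.33 frames, so most cycles straddle a frame boundary.
FPS, HZ = 100, 75
per_cycle = FPS / HZ  # frames produced per refresh cycle

for cycle in range(3):
    start = 1 + cycle * per_cycle        # frame position at cycle start
    end = 1 + (cycle + 1) * per_cycle    # frame position at cycle end
    print(f"Cycle {cycle + 1}: frames {start:.2f}-{end:.2f}")
# Cycle 1: frames 1.00-2.33
# Cycle 2: frames 2.33-3.67
# Cycle 3: frames 3.67-5.00
```

Every cycle that ends on a fractional frame position is a cycle whose scanout mixes two different frames.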


RAMDAC: random-access memory digital-to-analog converter.
Frame: a set number of horizontal lines, each a set number of pixels wide.

Before frames are sent they are drawn into a frame buffer; once the cycle is complete, the frame is sent. There is a problem with this, though: if your video card puts out more fps than the refresh rate, the lines of the new frame overwrite the old ones before anything is sent, and this can produce an odd video glitch.
Vsync simply makes sure the GPU can't make more fps than the selected refresh rate.

In order to see this glitch you need a GPU capable of producing a higher fps than your refresh rate.
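Vsync's effect can be sketched as a frame-rate cap: presentation blocks until the next refresh tick. This is a hypothetical loop (the names `render_frame` and `game_loop` are made up for illustration), not any real graphics API:

```python
import time

REFRESH_HZ = 75
FRAME_TIME = 1 / REFRESH_HZ  # minimum time between presented frames

def render_frame():
    pass  # stand-in for the real rendering work

def game_loop(num_frames=10):
    """With vsync on, presenting waits for the next refresh cycle,
    so the GPU can never output more fps than the refresh rate."""
    frames = 0
    start = time.perf_counter()
    next_vblank = start + FRAME_TIME
    for _ in range(num_frames):
        render_frame()
        # Block until the next vertical blank before presenting.
        sleep_for = next_vblank - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)
        next_vblank += FRAME_TIME
        frames += 1
    elapsed = time.perf_counter() - start
    return frames / elapsed  # effective fps, capped near REFRESH_HZ

print(f"{game_loop():.0f} fps")  # ≈ 75, never meaningfully above it
```

However fast `render_frame` is, the wait pins the presented rate at or below 75fps, which is exactly the capping behavior described above.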
 
Vsync simply makes sure the GPU can't make more fps than the selected refresh rate.
I agree here, but I don't agree with you saying the extra rendered frames aren't sent to the monitor. If this were the case there wouldn't be so many cases of scene tearing, which is a clear case of more frames being sent than the monitor can display.
 
Cromewell said:
I agree here, but I don't agree with you saying the extra rendered frames aren't sent to the monitor. If this were the case there wouldn't be so many cases of scene tearing, which is a clear case of more frames being sent than the monitor can display.

What you say is sort of true, but this tearing happens before the signal is sent to the monitor.
 
The tearing happens because a new frame starts to overlay the first one before it has been sent to the monitor.
 
The tearing happens because a new frame starts to overlay the first one before it has been sent to the monitor.
The first one has been partially drawn from the buffer when the next frame is sent into the same buffer. Since the image is partially drawn and now has new information for the still-undrawn areas, it results in a mismatch: a tear. Once a frame is in the draw buffer it has been sent to the monitor; therefore both frames are sent to the monitor.
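This single-buffer overwrite can be shown with a toy model (hypothetical scanout, not real GPU behavior):

```python
# Toy model of tearing: the monitor reads the draw buffer line by line,
# but a new frame overwrites the buffer partway through the scanout.
LINES = 8

def scanout_with_overwrite(old, new, overwrite_at):
    """Read the buffer top to bottom; the new frame lands mid-read."""
    buffer = [old] * LINES
    shown = []
    for line in range(LINES):
        if line == overwrite_at:
            buffer = [new] * LINES  # next frame replaces the buffer
        shown.append(buffer[line])
    return shown

print(scanout_with_overwrite("red", "green", 3))
# ['red', 'red', 'red', 'green', 'green', 'green', 'green', 'green']
# Parts of both frames reach the monitor, joined at a visible tear line.
```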
 
Cromewell said:
The first one has been partially drawn from the buffer when the next frame is sent into the same buffer. Since the image is partially drawn and now has new information for the still-undrawn areas, it results in a mismatch: a tear. Once a frame is in the draw buffer it has been sent to the monitor; therefore both frames are sent to the monitor.

But combined into 1 frame, so the result would still be sending only 75 frames.
 
When I say what to expect, I mean the real-life, true response time of a monitor, which is always more than what the specs say.
First example:
We have a game, Unreal Tournament 2004, and the desktop can render it at 110fps. Sending the signal analog @ 75Hz means you are only sending 75 frames. The LCD monitor receives those 75 frames, but since it has a rated delay of 25ms (30 true ms), it can only display about 33 of those frames a second.
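The arithmetic behind that example, taking the claimed 30ms "true" response time at face value (the claim itself is disputed just below):

```python
# Sketch: how many distinct frames a slow LCD can resolve per second,
# assuming the 30 ms effective response time claimed above.
frames_sent_per_second = 75   # 75 Hz analog signal
response_time_s = 0.030       # claimed real-world response time

displayable_fps = 1 / response_time_s
print(f"{displayable_fps:.0f}")  # ≈ 33 distinct frames per second
shown = min(frames_sent_per_second, displayable_fps)
print(f"about {shown:.0f} of the 75 frames sent can be told apart")
```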
You miss my point (about "expectation") but to illustrate using your example....

Suppose we have a game Unreal Tournament 2004, the desktop can render this game at 10fps..........the rest should be obvious.

The tearing happens because a new frame starts to overlay the first one before it has been sent to the monitor.
Tearing can be monitor driven... that's why, if you take a setup that is currently experiencing tearing and stuff ... unplug the monitor ... pop a better one in ... and the issues go away....
 
Cromewell said:
Not combined in 1 frame, combined in 1 monitor update. There is a difference.

Isn't that update called a cycle and doesn't that cycle start at the RAMDAC?

BTW I am not trying to make a flame war.
 
It doesn't matter what it is called; the frame has been sent to the draw buffer, overwriting the previous one, even if it is in mid-update of the monitor picture.
therefore both frames are sent to the monitor
That neither frame is complete doesn't matter; they were both sent to be drawn by the monitor.
 
Without reading any of this I'm going to post :P. I play all my games with the monitor's refresh rate set to 75Hz and no, it does not suck. CRT monitor, by the way.
 
It's funny to hear people complain about 75Hz CRTs and such ... and then go watch a 60Hz TV.....
 
Praetor said:
It's funny to hear people complain about 75Hz CRTs and such ... and then go watch a 60Hz TV.....

:), if they were TVs at 85Hz, I bet you a monkey that they would complain about 60Hz
 
It's up to your equipment. I have a 22" CRT (ViewSonic P225f) and my resolution is set at 1600 by 1200 with an 85Hz refresh rate.
 