DVI or VGA

zeppelin03

New Member
I'm going to be looking into monitors down the road and see that monitors come with either VGA or DVI cables. Is there a difference between the two? Does one transfer data faster or something? My video card has an adapter piece so I can use either one.

Also, what are some specs I should look for in a monitor? I'm looking for something at about $250 or less.
 
DVI is a digital system: a signal is either on or off (like the rest of a PC). An analogue signal, however, has many values between the base and the peak, so it is more susceptible to interference.
 
Both VGA and DVI work fine for normal desktop monitors; however, because DVI has a higher theoretical bandwidth, it can handle higher resolutions at higher refresh rates than VGA can.
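For a rough sense of the numbers, here's a quick Python sketch that estimates the pixel clock a display mode needs. It assumes single-link DVI's 165 MHz pixel clock limit and a typical ~25% blanking overhead; both are ballpark figures, and reduced-blanking modes can squeeze under the limit.

# Rough pixel-clock estimate: width x height x refresh, plus blanking overhead.
# Single-link DVI tops out at a 165 MHz pixel clock; dual link doubles the links.

def pixel_clock_mhz(width, height, refresh_hz, blanking=1.25):
    """Approximate pixel clock in MHz; blanking=1.25 assumes ~25% overhead."""
    return width * height * refresh_hz * blanking / 1e6

for mode in [(1280, 1024, 60), (1600, 1200, 60), (1920, 1200, 60)]:
    mhz = pixel_clock_mhz(*mode)
    verdict = "single link OK" if mhz <= 165 else "needs dual link or reduced blanking"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz ~ {mhz:.0f} MHz -> {verdict}")

So common desktop resolutions fit comfortably on single-link DVI, and it's only the really big modes where the link type starts to matter.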

If your monitor and video card both support DVI, use it whenever possible.
 
I have an Acer AL2016W and am currently using DVI with dual link via an EVGA 7600GT and Ubuntu 7.10, and it looks great. There is a noticeable difference, at least in my case, between VGA and DVI.

So, I have to agree with the above, use DVI whenever possible.
 
The human eye can't notice anything under 16 ms, so there's no visible difference between a 2 ms monitor and a 5 ms monitor.
There are so many people on this forum who make claims like that. Some say you can't see a difference between 30 FPS and 80 FPS; others say you can't notice anything under 8 ms. Where did you hear that claim, Zangetsu? From someone else on this forum?

I can say with 100% certainty that I CAN tell a difference between 16 ms and 4 ms. My laptop had an LCD with a 16 ms response time, while my 19" LCD has a 4 ms response time, and on the laptop you could clearly see the ghosting.
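Some quick arithmetic backs this up. At a 60 Hz refresh (an assumed figure, since the thread doesn't say), each frame lasts about 16.7 ms, so a 16 ms panel spends essentially the whole frame mid-transition while a 4 ms panel finishes early in the frame. A minimal sketch using the response times quoted above:

REFRESH_HZ = 60                  # assumed refresh rate
frame_ms = 1000 / REFRESH_HZ     # ~16.7 ms per frame at 60 Hz

for response_ms in (2, 4, 5, 16):
    fraction = response_ms / frame_ms
    print(f"{response_ms:>2} ms response ~ {fraction:.2f} of a frame spent transitioning")

A pixel that is still changing when the next frame arrives is exactly what shows up as ghosting.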
 
I have a theory that may support the people who say they can tell the difference in response times.

I've noticed that if I look directly at something flashing faster than the eye's ability to perceive flicker, I can't see it flashing. But when I move my eyes while the light is in my field of view, I can see a drag, or a trail of light, which stays in my vision memory for a few milliseconds, long enough to notice. This also seems to happen if you look at your finger and move it back and forth (as rapidly as you can follow) in front of the flashing light (a CRT works very well).

The LEDs on my Xbox 360 controller do this: when I look across the light (it helps when it's dark), I can see a trail of lights indicating that the LED is in fact flashing rapidly.

I tried the finger thing on my CRT, and I can see a trail of my finger. Knowing that CRTs redraw the screen (like a flashing light), this shows that you can actually detect fast flashing lights with the naked eye, beyond the normal human ability to perceive flicker.

Also, since an LCD doesn't redraw the image that way, the pixels aren't flashing (when the picture is still). I tested my finger-viewing method, and my finger seemed smooth, as it should, showing that the LCD pixels are in fact not flashing.

Another test is keeping your eyes still, staring at a CRT, and waving your hand in front of you: you'll notice your fingers leave flashing trails. But do this in front of an LCD (with a still image) and your fingers look smooth (they still trail due to vision memory, but smoothly, not the "flashy" kind of trailing).

What does this all mean? Well, in heavy gaming your LCD is effectively flashing with colors due to the image changing, and since your eye is never still, some people might notice the trail effect I was describing. Consequently, people claim to see a difference in response times because of this effect.

Give it a try; you might find other LEDs or light sources you never knew were actually flashing! My alarm clock is one; the first finger test proves it clearly. The neon light in my computer doesn't flash, as that's a continuous source of light (cold cathode lighting). It helps a lot if you try the tests while looking at the light source in the dark. Make sure you don't move your finger or your eyes too slowly, because then you won't notice the "flashing trail".
 