How the heck do I use my monitor DVI input?

massahwahl

VIP Member
I have an Acer 22" monitor that has both DVI and standard VGA inputs, but I cannot figure out how to use just the DVI input. When I go into the monitor menu and switch it to digital, it flashes for a second and then comes back, but it's still in standard. Is it a setting I need to change on my video card, or what? My card is a GeForce 6700 GS.
 
DVI does not make your monitor look better... It basically... well, here:

In regards to DVI (Digital Visual Interface) vs VGA (analog) differences on various monitors, it's going to depend on the monitor and what you use it for. If you get an expensive LCD monitor that can fully utilize the digital input, then it will probably be a better viewing experience with a digital signal. But other than that, I don't know that I've noticed the difference much. Here's a link to a good discussion about the subject. You'll notice a link posted by one of the respondents in the discussion to Tom's Hardware; their opinion is that there really isn't much difference between the two:

http://forums.cnet.com/5208-6121_102-0.html?forumID=45&threadID=1693&messageID=19774
 
Well, the extra VGA is why I wanted it to begin with... I still don't know how to switch to DVI, though...

How do you not know how? You just use a DVI cable instead of a VGA cable and connect it to the monitor and video card, just like you do now with the VGA cable.
 
I did that, but it's not getting a signal. I tried switching the monitor to digital mode, but it switches right back.
 
If I'm not mistaken, you must use true digital DVI to use widescreen. Otherwise there's probably no real difference. I know I can't tell any difference with my 17" SD monitor. And my 19" HD LCD TV has VGA input but is limited to 1280x1024, or something like that (not widescreen...). Sometime I'll try an HDMI-to-DVI converter and see if I can get the TV's full resolution.
 
I'm using a 32-inch HDTV LCD monitor, and I can't tell the difference between VGA and DVI. I'm using VGA right now because I have two PCs hooked up to this monitor and use the DVI for my other PC, which I rarely use (a server). I use the VGA for the everyday PC because I haven't really noticed any difference at all. Some of these new technologies don't show drastic improvements; they're just implemented to get the consumer to buy new cables and accessories.
 
OK, so DVI sucks, I get the point, but I need to use it so I can have a free VGA port; that's why I'm asking. I just need to know why my monitor does not recognize that I have it plugged in using DVI. Is it a setting on my monitor or video card I need to change?
 
DVI and VGA display the same; DVI is just a bit clearer, better, etc.
 
DVI becomes worthwhile in gaming under certain situations. The higher-end Samsung SyncMaster TFT LCD monitors, for example, are perfectly capable of a 2ms gray-to-gray response time using DVI. Over VGA, they only manage something like 5ms. That doesn't seem like a lot, but in a game or movie, if the screen is mostly dark and there are some lights moving because of your changing perspective, you'll notice marked "trails" behind the lights at 5ms or slower gray-to-gray, and none at 2ms with DVI. I know; I own a SyncMaster 206BW and tested it using Half-Life 2: Episode One and a copy of Miami Vice. There are other small image-quality differences too, but on a monitor smaller than 30", the average user won't notice them.

Whether or not to go with DVI is a personal choice. If all you want to do is surf the internet and do some word processing, VGA is a fine, cost-effective option. If you're a gamer, or you're building a media-center PC, DVI is well worth the additional cost. Check the specs of the monitor you plan to buy: if they don't list the GRAY-TO-GRAY response time, don't bother if you're in that gaming/movie market. Then check whether they quote different response times for DVI and VGA. I can highly recommend the Samsung 206BW. For about $275 these days, you can get a 20" widescreen monitor that kicks the crap out of $500-$750 uber-screens. Interesting note: the 20" 206BW has the same pixel count as the 22" version. Mine shipped with one dead pixel that I only saw when I ran a pixel-checker program, and I haven't noticed it since. The box also included a high-quality DVI cable, *and* a VGA cable.
 
Ugh... I give up... Somewhere along the way, people stopped trying to answer my actual question. Thanks for all the DVI info, don't get me wrong, but I never really cared whether it was better or not, just why it wasn't working.
 
Is there no display at all when you use the DVI cable?
Or is the display quality the same as when you use VGA?

People are answering that DVI and VGA are the same, and posting all the DVI info, because I believe they have the impression that you compared the two displays and weren't satisfied with the result.
 
That could be it, Bert. Let me clarify, then. When I plug my computer in by DVI alone, I don't get a signal. That's the problem, not which one is better.
 
If that is the case, then there is a possibility that it's a bad cable.

Have you tried plugging it in the other way around?? :D :D (I know it sounds silly :P)
But yeah... as I said, it could be a bad cable. Do you have a spare one you could test with?
 