Monitor Blackout: DVI Cable or Monitor?

Necopotence

New Member
I just bought a new ViewSonic 19" widescreen monitor from newegg.com. It totally rocks, the best monitor I've ever had. Of course, as is always the case, there is something wrong with it that I can't seem to pinpoint. I know it has to be either the monitor itself or the DVI cable.

When I turn on the monitor or begin to play any kind of game, the screen will black out. It keeps power, and sometimes it stays out long enough to get the "No Signal" warning. I've already hooked up my old 17" CRT monitor while the screen was blacked out to make sure it wasn't the video card.

I was going to simply go buy a new DVI cable to see if that was the problem, but they cost $70; that's a lot of money for something I should be able to get for free, don't you think? If I RMA it through the postal service (not UPS, etc.) it could take over a month for this to be resolved.

Does anyone have any idea if I could fix this problem myself, or is an RMA in order? Here is a link to the monitor on Newegg:

http://www.newegg.com/Product/Product.asp?Item=N82E16824116373

Thanks for any advice you can give!
 
I doubt it's the DVI cable. I believe if something was wrong with the cable itself, you'd have problems all around (missing colors, no video at all, etc.).

I'd first check the monitor by using VGA if at all possible. I assume that monitor has a VGA port on it, and most video cards come with DVI-to-VGA converters. If not... no biggie, I guess. I'm just wondering if it's monitor related, as my first LCD actually quit displaying certain resolutions. One was the resolution the Windows boot screen runs at. Eventually no resolution worked...

However, as Ku-sama mentioned, it's probably something with the video settings.
 
Or it could be that you have the game resolution set too high and the LCD can't pick it up.

I remembered that I have a DVI-to-VGA converter, and my monitor supports VGA and came with said cable. I ran it on analog just fine, and it's been going a few hours now with no problems. So either the cable is bad or the DVI port is bad. The resolution is always set to 1400x900 (widescreen) for all the games and even the desktop.
 

I used the VGA port with the DVI converter and I've yet to have any problems. I notice only a subtle difference, but is analog not as clear as DVI in picture quality?
 
Very odd... I wouldn't think the cable would cause that... Do you have any way to test that monitor on another machine using the same game and video settings?
 

My computer is about the only one around that can run that game at that resolution (1400x900). Like I said, might it be a bad DVI port and not the cable?
 
And what resolution is the game set at? (If it's more than ...x900, the monitor will black out, since it doesn't support the resolution.)
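
If you want to see exactly which modes the driver is offering over the DVI connection, here's a rough Win32 sketch that dumps them (just an illustration, untested; it assumes a Windows box with a C compiler handy, and the 1400x900 figure is simply what's been mentioned in this thread):

/* List every display mode the primary adapter currently reports. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* NULL device name = the primary display adapter */
    while (EnumDisplaySettings(NULL, i, &dm)) {
        printf("%4lu x %-4lu  %2lu-bit  %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmBitsPerPel, dm.dmDisplayFrequency);
        i++;
    }
    return 0;
}

If 1400x900 at 75 Hz doesn't show up in that list while the DVI cable is plugged in, the game is probably requesting a mode the monitor won't sync to.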
 

It is set to 1400x900, the same as my desktop resolution and the standard widescreen resolution for this monitor. And like I mentioned before, the game works just fine when using the VGA port. I just don't know the pros and cons of using DVI versus analog.
 
And you use the same monitor then?

Edit: Because you can have a different resolution in game than on your desktop, you know... try setting the in-game resolution to the lowest setting, and then try the LCD monitor again, perhaps?
 
I thought he said it was the same setting as his desktop...?

Are there any other settings that might cause a problem? Perhaps the refresh rate (Hz)? I know a few games allow you to set that. Otherwise, I can't imagine why it would be different if you are using the same resolution as your desktop...
 
I actually waited for the problem to occur, and while it was happening I hooked up my CRT and it worked fine. I have both the game and the desktop set to 1400x900, and in game I have the refresh rate set to 75 Hz.
 
 
Well... have you tried putting it to 60 Hz? Since it doesn't matter for an LCD anyway?
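
If the game won't let you change it directly, a rough sketch like this would drop the desktop mode itself to 60 Hz so you can rule the refresh rate out (again just illustrative and untested; it assumes Windows with a C compiler, and 1400x900 at 60 Hz is only my guess at a safe mode for that panel):

/* Try to switch the primary display to 1400x900 @ 60 Hz. */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    LONG rc;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize             = sizeof(dm);
    dm.dmPelsWidth        = 1400;  /* resolution quoted in this thread -- adjust as needed */
    dm.dmPelsHeight       = 900;
    dm.dmDisplayFrequency = 60;    /* drop from 75 Hz to 60 Hz */
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* Ask the driver whether it will accept the mode at all */
    rc = ChangeDisplaySettings(&dm, CDS_TEST);
    if (rc != DISP_CHANGE_SUCCESSFUL) {
        printf("Driver rejected 1400x900 @ 60 Hz (code %ld)\n", rc);
        return 1;
    }

    /* Flag 0 = apply dynamically for this session only (not saved to the registry) */
    rc = ChangeDisplaySettings(&dm, 0);
    printf("Mode change %s\n", rc == DISP_CHANGE_SUCCESSFUL ? "applied" : "failed");
    return 0;
}

CDS_TEST only asks the driver whether it would accept the mode; the second call with flag 0 applies it for the current session only, so nothing is written to the registry and a reboot puts everything back.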

LOL, you posted that just as I posted the other comment, so I had no chance to answer the question. And yes, I've already switched the refresh rates around and it still does the same thing.
 
OK... just a question... have you tried running the game with all video settings set to the lowest or default values? (Set those up while using your VGA connection.) Also, your card should have come with a DVI-to-VGA adapter... could you try plugging your VGA monitor into the DVI port and running the game? (If that works, it would rule out the graphics card's DVI port being defective. :) )
 