Nvidia GeForce 8800 GT / Rosewill 500W PSU

StrangleHold

Moderator
Staff member
16A + 17A = 33A. You can add the +12V rails. It's fine. I believe the card needs around 26A. Just don't go SLI, but your motherboard can't anyway.

http://www.computerforum.com/90110-power-recommendations-video-card.html

This must be what scOut was mentioning:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817371016

44A on the rails, and Antec has a great reputation...

You don't just add the 12V rails' amps together to get a total. All that number tells you is the most any one rail can pull. To get a close total when the maker doesn't list the combined amps, take the total wattage on the 12V rails and divide it by 12.
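To make the watts-divided-by-12 estimate concrete, here is a minimal sketch in Python. The 360W figure is only an assumed example; the thread doesn't say what combined +12V wattage is printed on the Rosewill 500W label, so plug in the number from your own PSU sticker.

# Rough estimate of combined +12V amperage from the combined +12V wattage,
# as described above. The 360W value is a hypothetical example, not the
# actual rating of the Rosewill 500W in this thread -- check the PSU label.
def combined_12v_amps(total_12v_watts: float) -> float:
    """Approximate total +12V current: wattage on the 12V rails divided by 12V."""
    return total_12v_watts / 12.0

if __name__ == "__main__":
    example_watts = 360.0  # hypothetical combined +12V rating from a PSU label
    amps = combined_12v_amps(example_watts)
    print(f"{example_watts}W across the 12V rails is roughly {amps:.1f}A total")
    # 360W / 12V = 30A, which would sit above the ~26A figure mentioned above

So a label that reads, say, 360W for the 12V rails works out to roughly 30A combined, even though no single rail can deliver that much on its own.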
 

CPTMuller

New Member
DVI is a much higher-bandwidth connection than VGA, and on high-resolution monitors the image quality suffers somewhat if you have to use an adapter. That said, it's by no means bad; it's still pretty good.
 

EndofAll

New Member
Well, my monitor has both VGA and DVI inputs. So would it be wise to just use the adapter until I can save up enough money for a DVI-D to DVI-I cable?
 

CPTMuller

New Member
I'm not sure. I know there is a more basic DVI cable you can plug into DVI-D; I'm just not sure whether that is DVI-I or not.
 

EndofAll

New Member
I really need to know whether I can plug a DVI-D male into a DVI-I port on the graphics card and have it work optimally, without VGA slowing me down.
 

Itronix

New Member
You don't just add the 12V rails' amps together to get a total. All that number tells you is the most any one rail can pull. To get a close total when the maker doesn't list the combined amps, take the total wattage on the 12V rails and divide it by 12.

Oh yeah, I was told that you could add them for a rough estimate or something, but I think I was also told about the watts-divided-by-12 method. Thanks, it's good to know for the future. I told you I am no expert :D!
 