50W Below Video Card requirement a big deal?

DarklyDivine

New Member
Would really appreciate any insight. I have one of those cookie cutter HP computers that come with a 250W power supply, and I have had trouble getting replacement PSUs to work in it. However, my friend told me his power supply was something like 50 or 100W below the requirement for his ATI video card (256MB) and it worked fine for years. I want to get an nVidia card, nothing crazy, maybe just a 512MB model, but most of the cheapest ones on store shelves require at least 300W. Is 50W a big deal? Can my computer run fine regardless?
 
If you upgrade the GPU, you will need to upgrade the PSU as well.

I am not a PSU guru by any means, but from what I know, OEM PSUs will not put out nearly what they are rated for. Also, if the PSU does spike to the 250W it is rated at, it will not last long at that wattage. It would only be a matter of time before it blows, and it would most likely take other components out with it.
 
The wattage listed is the peak draw if you run everything at once. I assume your friend stuck to small tasks, not playing multiple Blu-rays while loading a high-end game on that computer. If you max the PSU out, it will damage itself and other components. Also, if you max it out and the video card tries to pull more power than the PSU can supply, you will damage the card by overheating it. I've had this happen, and luckily ATI replaced the card for me for free.
 
You will want to upgrade your PSU for sure, especially since it is the stock HP unit. My experience with HP PSUs is that they are junk.
 
1. You will need a new power supply.

2. There is no point whatsoever in getting a card unless you go for at least a mid-range one. With that little power, you will see virtually no difference over onboard graphics; the card will only really be fit for playing videos and not a lot else, certainly not gaming.

What is the point in spending money just to have the power supply blow and then have to fork out on that as well? When a power supply blows, and even more so when it is a low-quality unit like yours, other components often get damaged too. So for the sake of saving $30-40 for zero performance gain, you could end up out of pocket for an entire new system.

Assuming your computer is standard ATX/mATX form factor and uses ATX connectors:

http://www.newegg.com/Product/ComboDealDetails.aspx?ItemList=Combo.784994

That combo is a hell of a deal, especially with the £30 rebate. You will get a more solid power supply as well as a much, much more powerful video card.

If that is outside your budget, then save up, because anything much less really isn't worth the money.
 
First thing about video cards: the amount of memory means little except for avoiding performance loss at high resolutions (above 1920x1080). At that resolution and below, 1GB will cover around 85% of games. The most intensive ones need 1.5GB, and preferably 2GB, of VRAM to avoid stuttering, but you only get that much on the most high-end cards. Most cards come with 1GB as of now, though 1280MB and 1536MB cards are also common.

As for power requirements, the listed figure is generally a minimum for the card plus the rest of the system. So if a card says "500W minimum", it means that card plus everything else in your machine.
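As a rough illustration (made-up numbers, not for any specific card): if a card lists a 450W minimum and the card itself draws around 150W, the manufacturer is budgeting roughly 300W for the CPU, drives, fans and everything else, plus some margin for cheap PSUs that can't sustain their rated output.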

To find the real power draw, you need to find the TDP of the video card. It's quite easy to find in reviews, and it will be a number in watts. Compare it to the wattage of your PSU to judge whether it can handle the card. You can also divide the wattage by 12 to get a rough amperage figure for the 12V rail.
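Worked example with illustrative numbers: a card with a 150W TDP would pull roughly 150 / 12 ≈ 12.5A from the 12V rail, so you would want a PSU whose 12V rail(s) are rated comfortably above that once the CPU and the rest of the system are accounted for.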
 