Enough watts?

Guru3D's power consumption test showed a sub-600W draw on a system with an overclocked i7 and twin GTX 470s.

How does that work? :confused:

i7 950 = 130W TDP
x58 = 30W TDP
GTX 470 = 215W (x2 = 430W)
RAM = up to 20w a stick (60W)
HDD = ~10W
And we'll say 40W for the fans and any other gizmos that happen to be functioning.

total = 700W, before overclocking
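The tally above can be checked with a quick sketch, using the per-component figures exactly as quoted (they're worst-case TDP-style numbers, not measured draw):

```python
# Rough peak-draw estimate summed from the figures listed above.
components = {
    "i7 950 (TDP)": 130,
    "X58 chipset (TDP)": 30,
    "GTX 470 x2": 2 * 215,
    "RAM (3 sticks x 20W, as claimed)": 3 * 20,
    "HDD": 10,
    "fans and misc": 40,
}

total = sum(components.values())
print(total)  # 700
```

The arithmetic checks out; the question is whether those per-component figures are realistic, which the reply below disputes.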
 
Seems like that'd be the case in a high-stress game, no? Not exactly 100%, obviously, but getting up there.
 
I'm sorry, but this is ridiculous.

For starters, no HDD will ever consume anywhere near 10W. Also, a standard RAM DIMM will consume under 2W per module, depending on voltage. Since DDR3 runs at low voltage, we're looking at roughly 1.5W per module.

It's amazing how much overkill people recommend here for power supplies. Having a power supply that is hundreds of watts more than what the user needs isn't good: it drops efficiency dramatically, not to mention wastes money. And no, games don't stress the system as much as people seem to think. Only the graphics card is really put to the test. He won't be anywhere close to 100% load in any game.
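Redoing the earlier tally with the corrected figures gives a sense of how much the original estimate overshot. The ~1.5W/DIMM figure is from this post; the ~6W HDD number is my own placeholder (any figure under the original 10W makes the point):

```python
# Same tally as before, with RAM corrected to DDR3 reality
# and the HDD figure reduced (~6W is an assumed active draw).
revised = {
    "i7 950 (TDP)": 130,
    "X58 chipset (TDP)": 30,
    "GTX 470 x2": 430,
    "RAM (3 x 1.5W DDR3)": 3 * 1.5,
    "HDD (~6W, assumed)": 6,
    "fans and misc": 40,
}

print(sum(revised.values()))  # 640.5
```

And that still assumes the CPU and both GPUs hit full TDP simultaneously, which gaming loads don't do, so the measured sub-600W figure is entirely plausible.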
 