Real power consumption of video cards

nj473

Member
Hi,

I am thinking of upgrading my current Radeon HD 6850, perhaps to the R9 270X, to keep up a little with today's (and future) games. However, I am a little concerned about how much power it will draw, as in the cost of running it, NOT whether my PSU is capable of running it.

I saw on "hwcompare" that my current card draws around 130W, but this new one will draw about 170W. My question is this:

Does that figure represent how much power the card draws all the time, or just at idle/full load? Does the amount of power a video card uses vary at all, or is it a constant amount drawn from the moment the PC is turned on?

It could influence my decision on buying a new card, because I don't really want such high power consumption when using the PC for less strenuous tasks than gaming.

Thanks in advance
 

Geoff

VIP Member
The amount of power varies depending on the load. When it's idle the card still uses power, but far less than it would at full load while gaming.

I don't know the price of electricity in your area, but I couldn't imagine a GPU that consumes up to 40W more at full load would make a noticeable difference in your energy bill unless you game a LOT each day.
 

nj473

Member
Geoff said:
The amount of power varies depending on the load. When it's idle the card still uses power, but far less than it would at full load while gaming.

I don't know the price of electricity in your area, but I couldn't imagine a GPU that consumes up to 40W more at full load would make a noticeable difference in your energy bill unless you game a LOT each day.

Thanks for the response. I was worried that there would be a constant high power draw no matter what, in which case I think it would make more of a difference. But you're right, if it only uses this much whilst gaming/at full load, then 40W is fine.

I don't game so much anymore, which is why it would be pretty annoying if it were always using a lot! :)
 

Geoff

VIP Member
Cards do use different amounts of power when idle as well. I highly doubt it would be a 40W difference at idle, but chances are there is a slight difference. You're better off using your computer 15 minutes less per day haha
 

beers

Moderator
Staff member
I think the newer ones even have a lower idle state from ZeroCore Power, which was part of the GCN architecture (whereas the 6850 is VLIW), so if anything you'd see less draw at idle.

You can figure out the cost difference by converting to kWh and multiplying by the rate on your bill. The difference isn't huge, although even a constant 50W running 24/7 for an entire year at $0.15/kWh ends up being around (50/1000)*24*365*0.15 ≈ $65.70/yr.
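If you want to play with the numbers yourself, here's a quick sketch of that same formula in Python. The wattage, hours per day, and rate values below are just example assumptions; plug in your own figures from your bill and the card's reviews.

def annual_cost(watts, hours_per_day, rate_per_kwh):
    # Convert a constant draw to kWh per year, then multiply by the rate.
    kwh_per_year = (watts / 1000) * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

# Worst case above: an extra 50W running 24/7 at $0.15/kWh
print(annual_cost(50, 24, 0.15))  # ~65.7

# More realistic: a ~40W load difference for 2 hours of gaming a day
print(annual_cost(40, 2, 0.15))   # ~4.4

So for typical use, the load-power difference between the two cards works out to a few dollars a year.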
 

nj473

Member
Thanks for the replies guys, I'm pretty sure I'll just get one now knowing this. :) (when my wallet permits, anyway)

Just out of curiosity though, does anyone have any idea roughly how much power cards draw when idle? Not important, I just have no sense of the numbers involved.
 