watt TDP

Oats

New Member
Is it correct to say that if I am at 100% load without overclocking, the watt consumption of the CPU and GPU will never go above their watt TDP? If so, how close do you think actual watt consumption would get to their listed watt TDP at 100% load without overclocking? (rough estimation)

How easy would it be for the CPU and GPU to consume more watts than their watt TDP? Would it "usually" require something like light, average, or heavy overclocking?
 
TDP is thermal design power, i.e. the heat the cooler has to dissipate, not electrical load.

However, they're good indicators.

Also, power consumption is not a linear function of overclocking or heat. If I remember correctly, dynamic power scales roughly with clock frequency and with the square of voltage.
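To put rough numbers on that, here is a minimal Python sketch of the commonly cited dynamic-power relation (P ≈ C·V²·f); the clock and voltage figures are made up purely for illustration, not measurements of any real chip:

# Sketch of the commonly cited dynamic power relation: P ~ C * V^2 * f.
# The numbers below are made-up examples, not measurements of any real chip.

def relative_power(v_old, f_old, v_new, f_new):
    """Ratio of new dynamic power to old, assuming P scales with V^2 * f."""
    return (v_new ** 2 * f_new) / (v_old ** 2 * f_old)

# Example: a ~10% clock bump that also needs a ~5% voltage bump.
ratio = relative_power(v_old=1.20, f_old=3.80, v_new=1.26, f_new=4.18)
print(f"Power rises by roughly {(ratio - 1) * 100:.0f}%")  # ~21%, i.e. clearly non-linear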
 
The wattage a CPU/GPU actually consumes is not the same number as its TDP. TDP is nothing more than the amount of heat thrown off by the CPU/GPU that the heatsink needs to be able to dissipate; in effect, it's a power dissipation rating in watts.
 
TDP is thermal design power, i.e. the heat the cooler has to dissipate, not electrical load.

However, they're good indicators.

This is what I have always been getting from other people's comments and benchmarks, but my problem is that I never really had anyone straight up confirm it.

"TDP, representing the thermal design
power, is a rudimentary indicator of consumption, and may leave
customers overestimating their data center infrastructure needs."

^
This is exactly what I want: something close to actual consumption at load, but I also want to overestimate for safety reasons.

Basically, it appears there is a pattern between watt TDP and actual consumption. They aren't going to give a GPU a TDP of 170 W when actual consumption at load is 50% higher or lower than that; they give it a TDP of 170 W because actual consumption (at load, without overclocking) is always going to be lower than 170 W, perhaps by 20-40% or so.
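Putting that claimed pattern into numbers with a quick Python sketch (the 170 W TDP and the 20-40% gap are just my own assumptions here, not measured figures for any particular card):

# Restating the assumed pattern as arithmetic; 170 W and the 20-40% gap are
# illustrative guesses, not specs or measurements of a specific GPU.
tdp_w = 170
expected_load_low = tdp_w * (1 - 0.40)    # 102 W if draw sits 40% below TDP
expected_load_high = tdp_w * (1 - 0.20)   # 136 W if draw sits 20% below TDP
print(f"Expected stock-speed load: roughly {expected_load_low:.0f}-{expected_load_high:.0f} W")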


For example, this is useful when a beginner wants to upgrade only their GPU but doesn't know whether their PSU has a high enough watt limit (or more specifically, enough amperage on its 12V rail) to handle a much better graphics card. To answer this, we can use the watt TDP of a GPU as a rough estimate of the absolute maximum watts it could consume at stock speeds, add it to our estimated consumption for the rest of the system at load, and then compare that number to the watt limit of the PSU's 12V rail. Ultimately, we want to recommend one of the best graphics cards their PSU can handle (if possible), but we also want to overestimate max watt consumption at stock speeds to leave room for things such as capacitor aging.
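Here is a minimal Python sketch of that check; all the wattage, amperage, and margin numbers are hypothetical examples for illustration, not specs of any particular parts:

# Quick sanity check for a GPU upgrade: treat the GPU's TDP as a rough ceiling for
# its stock-speed draw, add an estimate for the rest of the system at load, pad it
# for safety (capacitor aging, transients, hot days), and compare against the PSU's
# 12V rail rating. All numbers are hypothetical examples.

GPU_TDP_W = 170          # published TDP of the candidate GPU
REST_OF_SYSTEM_W = 150   # rough estimate for CPU, drives, fans, etc. at load
SAFETY_MARGIN = 1.2      # ~20% headroom for aging and transients

PSU_12V_RAIL_A = 34      # amps the existing PSU claims on its 12V rail
PSU_12V_RAIL_W = PSU_12V_RAIL_A * 12

estimated_load_w = (GPU_TDP_W + REST_OF_SYSTEM_W) * SAFETY_MARGIN
print(f"Estimated worst-case 12V load: {estimated_load_w:.0f} W")
print(f"PSU 12V rail capacity:         {PSU_12V_RAIL_W:.0f} W")
print("Looks OK" if estimated_load_w <= PSU_12V_RAIL_W else "Recommend a PSU upgrade too")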

so... what do you think?


In reply to bigfellla:
Increasing stock speeds increases watt consumption, does it not? That's basically what all the benchmarks and such indicate.
 
Yes, increasing clocks will increase consumption. You are overthinking this, IMHO. All you need to do is ensure that you have adequate 12V rail amperage from a quality PSU. If you have more than required there is no safety concern, and given the relative cost between, say, a 500W and a 600W PSU, it's not worth losing hair over it.

You can also review systems that have been monitored with wattmeters at the wall. This gives an indication of overall consumption which, with most of it occurring on the 12V rail, makes for a conservative estimate. That is, if the system drags 450W from the wall through an 80% efficient PSU, it is actually using about 360W. You need to account for those losses, so I would suggest a 550W unit to be safe (allowing for many other factors such as summer temperatures, capacitor derating, etc.). In the above example, that means I would choose a quality PSU with 45A on the 12V rail. Sounds excessive, but it's really good sense.
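Working through those numbers in a short Python sketch (the 450W wall reading, 80% efficiency, and 45A rail are the figures from the post above; treating the whole DC load as 12V is a simplifying assumption):

# Wall-meter reading + PSU efficiency gives the actual DC load, which can then be
# checked against the 12V rail rating of the suggested PSU. Figures are the ones
# quoted above; treating all of the load as 12V is a conservative simplification.

WALL_DRAW_W = 450        # measured at the wall with a wattmeter
PSU_EFFICIENCY = 0.80    # roughly 80% efficient PSU at that load

dc_load_w = WALL_DRAW_W * PSU_EFFICIENCY   # ~360 W actually delivered to the system

SUGGESTED_RAIL_A = 45                      # 12V rail amperage of the suggested PSU
rail_capacity_w = SUGGESTED_RAIL_A * 12    # 540 W available on the 12V rail

print(f"Estimated DC load:         {dc_load_w:.0f} W")
print(f"12V rail capacity at 45 A: {rail_capacity_w} W")
print(f"Headroom:                  {rail_capacity_w - dc_load_w:.0f} W")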
 