TDP is thermal power dissipation, not electrical draw at load.
However, TDP figures are good indicators.
This is what I've always gathered from other people's comments and benchmarks, but my problem is that I never really had anyone straight up confirm it.
"TDP, representing the thermal design
power, is a rudimentary indicator of consumption, and may leave
customers overestimating their data center infrastructure needs."
^
this is exactly what I want. I want something close to actual consumption at load, but I also want to overestimate for safety reasons.
Basically, there appears to be a pattern between watt TDP and actual consumption. They aren't going to give a GPU a TDP of 170W when actual consumption at load is 50% higher or lower than 170W; they give it a TDP of 170W because actual consumption (at load, without overclocking) is ALWAYS going to come in below 170W, by perhaps 20-40% or so.
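To put rough numbers on that, here is a tiny sketch using the 20-40% headroom figure above (my own ballpark, not any official spec):

[code]
# Rough sketch: estimate actual load draw from TDP, assuming draw
# sits roughly 20-40% below TDP at stock speeds (ballpark, not a spec).
tdp_watts = 170
low_estimate = tdp_watts * 0.60    # if draw is ~40% under TDP
high_estimate = tdp_watts * 0.80   # if draw is ~20% under TDP
print(f"Expected load draw: ~{low_estimate:.0f}-{high_estimate:.0f} W (TDP {tdp_watts} W)")
# -> Expected load draw: ~102-136 W (TDP 170 W)
[/code]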
For example, this is useful when a beginner only wants to upgrade their GPU but doesn't know whether their PSU has a high enough watt limit (or more specifically, enough amperage on its 12V rail) to handle a much better graphics card. To answer this, we can use the GPU's watt TDP as a rough estimate of the absolute maximum it "could" consume at stock speeds, add it to our estimated consumption for the rest of the system at load, and compare that number to the watt limit of the PSU's 12V rail (see the sketch below). Ultimately, we want to recommend one of the best graphics cards the PSU can handle (if possible), but we also want to overestimate max watt consumption at stock speeds to leave room for things such as capacitor aging.
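Here's a minimal sketch of that check in Python. The numbers and the 20% safety margin are placeholders, not recommendations: plug in the actual GPU TDP, your own estimate for the rest of the system at load, and the 12V rating printed on the PSU label.

[code]
# Minimal PSU sizing sketch (assumptions: GPU TDP as worst-case stock draw,
# a flat estimate for the rest of the system, and a 20% margin for things
# like capacitor aging -- all placeholder numbers, adjust to the actual build).

def psu_can_handle(gpu_tdp_w, rest_of_system_w, rail_12v_limit_w, margin=0.20):
    """Return (ok, required) where ok means the 12V rail covers the estimate."""
    estimated_peak = gpu_tdp_w + rest_of_system_w   # worst-case draw at stock speeds
    required = estimated_peak * (1 + margin)        # deliberately overestimate for safety
    return required <= rail_12v_limit_w, required

# Example: 170 W TDP GPU, ~200 W for the rest of the system,
# and a 12V rail rated 38 A * 12 V = 456 W on the PSU label.
ok, required = psu_can_handle(gpu_tdp_w=170, rest_of_system_w=200, rail_12v_limit_w=456)
print(f"Need ~{required:.0f} W on the 12V rail -> {'OK' if ok else 'too tight'}")
# -> Need ~444 W on the 12V rail -> OK
[/code]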
so... what do you think?
bigfellla;1847371 also said:
Increasing stock speeds increases watt consumption, does it not? That's basically what all the benchmarks and such indicate.