Has anyone seen any technical difference between the 3700X and 3800X?
A difference of 40W TDP surely can't be down to binning and a few hundred MHz. Granted, TDP is a rather useless measurement, but the 12-core is also 105W with a higher boost and almost the same base.
This CPU has a TDP of 105W, which for AMD processors is usually a good measure of all-core power consumption.
You'll notice that the 6-core 3600X has a higher TDP rating (95W) than the 8-core 3700X (65W). As Ian mentioned in his Anandtech article:
I think the variations in TDP are absolutely related to binning.
TDP has been rather useless since 2006, but exactly as you also point out, a six-core at 3.8/4.4 GHz supposedly has pretty much the same TDP as a 12-core at 3.8/4.6 GHz.
I hate TDP with a passion, and wish we could make something like their ACP an industry standard and required. Power consumption based on a specific set of benchmarks.
TDP is Thermal Design Power, which I understood to mean how much heat the chip will put off under full load, measured in watts, and not an actual power consumption figure. I've used HWiNFO to measure actual power draw at the CPU socket (though I'm unsure how accurate it is). At stock clocks my actual consumption is usually under 65 watts, but at 4.0 GHz I regularly see it north of 65, and over 110ish at full load. I remember my 8320 would sometimes pull over 200 watts when I had that bad boy cranked all the way up.
Again not sure how accurate those measures are but just shows that TDP is more like a general guideline than an actual rule/measurement.
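For what it's worth, tools like HWiNFO read cumulative energy counters exposed by the CPU (on Linux these show up as RAPL `energy_uj` files under `/sys/class/powercap/`) and difference two samples to get watts. A minimal sketch of that conversion; the 32-bit wrap width is an assumption and varies by platform:

```python
def watts_from_energy(e0_uj: int, e1_uj: int, dt_s: float,
                      max_uj: int = 2**32) -> float:
    """Average power (watts) between two cumulative energy readings
    (microjoules), handling counter wraparound between samples."""
    delta = e1_uj - e0_uj
    if delta < 0:                # counter wrapped around
        delta += max_uj
    return delta / 1e6 / dt_s    # uJ -> J, then J/s = W

# Counter advanced 65,000,000 uJ over 1 second -> 65 W average:
print(watts_from_energy(100_000_000, 165_000_000, 1.0))  # 65.0
```

Note the averaging: a one-second sample smooths over the millisecond-scale boost spikes, which is one reason different monitoring tools disagree.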
I'm hoping we see more laptops this year too.
Ian Cutress - Anandtech said: But TDP, in its strictest sense, relates to the ability of the cooler to dissipate heat. TDP is the minimum capacity of the CPU cooler required to get that guaranteed level of performance. Some energy dissipation also occurs through the socket and motherboard, which means that technically the cooler rating can be lower than the TDP, but in most circles TDP and power consumption are used to mean the same thing: how much power a CPU draws under load.
The value of TDP, or thermal design power, is not a measure of power consumption. It is technically a measure of cooler performance, and a cooler needs to be rated at the TDP level in order to perform regular functions. Actual power consumption should technically be higher – thermal losses from the processor into the socket and from the socket into the motherboard also contribute to cooling, but are not involved in the TDP number. However, for most use cases, TDP and power consumption are used interchangeably, as their differences are minor.
Over the last decade, while the use of the term TDP has not changed much, the way that its processors use a power budget has. The recent advent of six-core and eight-core consumer processors going north of 4.0 GHz means that we are seeing processors, with a heavy workload, go beyond that TDP value. In the past, we would see quad-core processors have a rating of 95W but only use 50W, even at full load with turbo applied. As we add on the cores, without changing the TDP on the box, something has to give.
For the last however many years, this is the definition of TDP that Intel has used. For any given processor, Intel will guarantee both a rated frequency to run at (known as the base frequency) for a given power, which is the rated TDP. This means that a processor like the 65W Core i7-8700, which has a base frequency of 3.2 GHz and a turbo of 4.7 GHz, is only guaranteed to be at or below 65W when the processor is running at 3.2 GHz. Intel does not guarantee any level of performance above this 3.2 GHz / 65W value.
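Intel's scheme described above is effectively a time-limited power budget: the chip may draw a short-term limit (often called PL2) until a running average of recent power reaches the sustained limit (PL1, which equals the rated TDP), after which it clamps down. A toy simulation of that behavior; the 65 W / 95 W limits and 28-second time constant are illustrative assumptions, and real firmware uses a more elaborate weighted window:

```python
def turbo_budget(draw_w, pl1=65.0, pl2=95.0, tau=28.0, dt=1.0):
    """Simulate Intel-style power limiting: allow up to PL2 while an
    exponentially weighted average of delivered power is under PL1,
    then clamp to PL1 (the sustained TDP)."""
    avg = 0.0
    alpha = dt / tau             # EWMA weight per time step
    out = []
    for want in draw_w:
        p = min(want, pl2) if avg < pl1 else min(want, pl1)
        avg += alpha * (p - avg)
        out.append(p)
    return out

# A sustained all-core load: full 95 W at first, 65 W once budget is spent.
trace = turbo_budget([95.0] * 200)
```

This is why a "65W" chip can legitimately pull well over 65 W for tens of seconds, which matches the spikes people see in monitoring tools.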
TDP is Thermal Design Power which I understood to mean how much heat it will put off under full load measured in watts, and not an actual power consumption figure.
You guys should probably read up on how TDP is established... before making statements.
So you can't, for example, have a CPU using 65 watts of power while generating 100 watts of heat; essentially all the electrical power drawn ends up as heat.
Ian - Anandtech said: In the past, we would see quad-core processors have a rating of 95W but only use 50W, even at full load with turbo applied.
You mean the articles I've already directly quoted from?
Wow ha. The pot calling the kettle black on that one. Correcting misinformation is one thing, but just copying and pasting links and then behaving hypocritically by stating "before making statements" does nothing but cause bitter dissension.
Yes, but
You guys should probably read up on how TDP is established... before making statements
Stop getting your panties in a twist. My post wasn't directed at you.
It's fine if they want to overestimate a TDP and build a cooling solution for that rating, which would be good practice since no two CPUs are the same and it allows a margin of error. But it becomes a problem if a TDP is underestimated and then not enough cooling is provided.
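The margin point can be made concrete with the basic steady-state relation T = T_ambient + P × R_theta, where R_theta is the cooler's thermal resistance in °C per watt. The numbers below are illustrative assumptions, not specs for any real cooler:

```python
def cpu_temp(power_w: float, r_theta_c_per_w: float,
             t_ambient_c: float = 25.0) -> float:
    """Steady-state CPU temperature: ambient plus the temperature rise
    across the cooler (power times thermal resistance)."""
    return t_ambient_c + power_w * r_theta_c_per_w

# A cooler sized for the 105 W label, assuming R_theta = 0.5 C/W:
print(cpu_temp(105, 0.5))  # 77.5 C at the rated TDP
print(cpu_temp(140, 0.5))  # 95.0 C if the chip actually draws 140 W
```

So a cooler sized exactly to the box TDP loses its headroom the moment the chip boosts past it, which is why an underestimated TDP is the dangerous direction.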
Beat me to it.
Been saying it for years. Still haven't; at this point I almost just don't want to out of principle, knowing how much I could have made a couple years ago pre-Zen.

Yeah I think you're right and I think it's time to buy some shares ha. The price has jumped substantially since last year.