brycematheson712
New Member
This might be hard to explain, and if you all don't quite get it, please just bear with me. This is something I've always wondered but never really asked.
Okay, so as an example: let's say I've got a 32-bit single-core CPU clocked at 3.2 GHz, but then I notice a 64-bit dual-core CPU clocked at 2.6 GHz. How does that work? Shouldn't the dual-core be clocked higher than the single 32-bit core? This is the only explanation I can think of:
If the dual-core is clocked at 2.6 GHz, since it has two cores, you would double that speed, correct? So that's 2.6 GHz x 2 = 5.2 GHz. Now, since it's 64-bit versus 32-bit, you would also double that, because it has twice the bandwidth of the 32-bit chip, correct? So now we're up to 10.4 GHz. So a 2.6 GHz 64-bit dual-core is essentially 10.4 GHz when compared to a 32-bit single-core CPU clocked at 3.2 GHz?
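Just so it's clear what I'm picturing, here's a rough back-of-envelope sketch of that math. I honestly don't know if this is how it really works, so treat the doubling factors as my assumption, not fact:

```python
# My (possibly wrong) assumption: cores and bit-width simply multiply clock speed.
single_core_32bit_ghz = 3.2   # the 32-bit single-core CPU

dual_core_64bit_ghz = 2.6     # the 64-bit dual-core CPU
core_factor = 2               # assumption: two cores = double the speed
bit_width_factor = 2          # assumption: 64-bit = double the 32-bit "bandwidth"

effective_ghz = dual_core_64bit_ghz * core_factor * bit_width_factor
print(effective_ghz)          # 10.4 -- but is this comparison even meaningful?
```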
Does that all make sense? Maybe I'm just confusing myself. Can somebody explain it to me? Thanks!