Help needed: CPU speed vs. SATA and USB 3.0 speed

foxclab01

New Member
Hello to everybody on the forum.

Here is a question I have that I cannot answer. It is a theoretical rather than a practical question, but it still bothers me:

Why is it possible for (any) SATA and for USB 3.0 to function at all? By my reasoning, these protocols should simply not be able to work!

Let me explain myself:

Let’s take a really fast CPU, one clocked at 3.6 GHz. The CPU needs from 1 up to 4 clock cycles to execute a machine-language instruction, right? So, in the worst-case scenario, this CPU can execute 3.6 / 4 = 0.9 billion instructions per second or, if you prefer, 900 million instructions per second.

So far, so good. But now let’s look at the USB 3.0 protocol. This new protocol can transfer 5 Gbit per second. If one CPU instruction is needed for every bit transferred, the CPU would have to handle 5 billion instructions per second in order to make USB 3.0 work. This is just not possible. In fact, nothing that needs more than 900 million instructions per second should work.
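Here is the same arithmetic written out as a tiny C program, so you can see exactly what I am assuming (the 4 cycles per instruction and the one instruction per bit are my assumptions, not measured hardware facts):

#include <stdio.h>

int main(void)
{
    double clock_hz        = 3.6e9;  /* 3.6 GHz CPU clock                        */
    double cycles_per_insn = 4.0;    /* worst case assumed in my post            */
    double insns_per_bit   = 1.0;    /* my assumption: one instruction per bit   */
    double usb3_bits_per_s = 5.0e9;  /* USB 3.0 raw signalling rate              */

    double cpu_insns_per_s    = clock_hz / cycles_per_insn;       /* ~0.9 billion */
    double needed_insns_per_s = usb3_bits_per_s * insns_per_bit;  /*  5.0 billion */

    printf("CPU can execute   : %.2f billion instructions/s\n", cpu_insns_per_s / 1e9);
    printf("USB 3.0 would need: %.2f billion instructions/s\n", needed_insns_per_s / 1e9);
    printf("Shortfall factor  : %.1fx\n", needed_insns_per_s / cpu_insns_per_s);
    return 0;
}

Under these assumptions the CPU falls short by a factor of roughly 5.6, which is exactly what puzzles me.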

I understand that the PCI Express protocol is another story, because there the bus is handled by hundreds of small processors inside the graphics card, and these processors work in parallel, handling the PCI Express lanes all together.

But what about SATA and USB 3.0? Why on earth are they able to work? I must have made a mistake in my calculations, but I do not know where.

Can anybody help?
 
You have to understand that the data no longer has to go through the processor.

The clock/data speeds you listed for SATA and USB 3.0 are the maximum data transfer rates: burst mode, going directly from, say, the hard drive's buffer to RAM.

In reality, your computer will rarely, if ever, attain data transfer speeds that high.

So, basically, the processor issues a command to a device to transfer X amount of data starting at location A, to be received by another device. The processor then goes on its merry way while the two devices move the data between themselves (this is direct memory access, DMA) at a rate faster than the processor could handle, because it doesn't have to handle it at all. When the processor returns its attention to those devices, they will either still be busy moving data, or they will be done and awaiting the next order.
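To make that handoff concrete, here is a bare-bones sketch in C of what a driver does with a DMA-capable controller. The register names and addresses are invented for illustration (every real controller has its own layout, and this won't run on a PC as-is), but the shape of the sequence is the point: program the transfer, start it, go do something else, check a status bit later.

#include <stdint.h>

/* Hypothetical memory-mapped registers of a DMA-capable controller. */
#define DMA_SRC_ADDR  (*(volatile uint32_t *)0x40001000u)  /* where to read from     */
#define DMA_DST_ADDR  (*(volatile uint32_t *)0x40001004u)  /* where to write to      */
#define DMA_LENGTH    (*(volatile uint32_t *)0x40001008u)  /* bytes to move          */
#define DMA_CONTROL   (*(volatile uint32_t *)0x4000100Cu)  /* write 1 to start       */
#define DMA_STATUS    (*(volatile uint32_t *)0x40001010u)  /* bit 0 set while busy   */

void start_transfer(uint32_t src, uint32_t dst, uint32_t nbytes)
{
    /* The CPU spends only a handful of instructions here... */
    DMA_SRC_ADDR = src;
    DMA_DST_ADDR = dst;
    DMA_LENGTH   = nbytes;
    DMA_CONTROL  = 1u;   /* ...and the controller moves the data on its own. */
}

int transfer_done(void)
{
    /* Later, the CPU just checks a status bit (or gets an interrupt). */
    return (DMA_STATUS & 1u) == 0u;
}

Notice that the CPU's cost is a few register writes per transfer, whether the transfer is 4 KB or 4 GB; the per-bit work your calculation assumed never lands on the CPU at all.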

Does this help you?
 
Thank you for your very helpful answer. I think I get the picture now.

Searching the web, I found that the real-life transfer rate for USB 3.0 is about 100 megabytes/s, or 800 megabits/s if you prefer (far lower than the theoretical maximum of 4.8 gigabits/s, of course). But those real-life tests involved data transfers to and from a hard disk, so there were also the limitations of the disk's RPM, etc.

So, does anybody know, or has anybody tested, how fast the real transfer rate of USB 3.0 is when hard disks are not involved? (For example, accessing the USB port for serial data transfer through a software program.) I do not expect 4.8 Gbit/s, of course, but not as slow as 800 Mbit/s either. Is my assumption correct, or, in the case of the software program, is there a limit from the CPU clock, since now the CPU DOES get involved?
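If I ever get around to testing it myself, I imagine the measurement would look roughly like this with libusb-1.0: time a loop of bulk reads and report MB/s. The vendor/product IDs, interface number and endpoint address are placeholders for whatever device is being tested; the idea is just that no hard disk is in the path, only RAM buffers on both sides.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>
#include <libusb-1.0/libusb.h>

/* Placeholder identifiers -- substitute the values for your own device. */
#define VENDOR_ID   0x1234
#define PRODUCT_ID  0x5678
#define ENDPOINT_IN 0x81        /* bulk IN endpoint        */
#define CHUNK_SIZE  (1 << 20)   /* 1 MiB per bulk transfer */
#define ITERATIONS  256         /* 256 MiB total           */

int main(void)
{
    libusb_context *ctx = NULL;
    if (libusb_init(&ctx) != 0) { fprintf(stderr, "libusb_init failed\n"); return 1; }

    libusb_device_handle *h =
        libusb_open_device_with_vid_pid(ctx, VENDOR_ID, PRODUCT_ID);
    if (!h) { fprintf(stderr, "device not found\n"); libusb_exit(ctx); return 1; }
    if (libusb_claim_interface(h, 0) != 0) { fprintf(stderr, "claim failed\n"); return 1; }

    unsigned char *buf = malloc(CHUNK_SIZE);
    if (!buf) return 1;

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);

    long long total = 0;
    for (int i = 0; i < ITERATIONS; i++) {
        int got = 0;
        /* One bulk read; the host controller moves the data, not this loop. */
        if (libusb_bulk_transfer(h, ENDPOINT_IN, buf, CHUNK_SIZE, &got, 5000) != 0)
            break;
        total += got;
    }

    clock_gettime(CLOCK_MONOTONIC, &t1);
    double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
    printf("%lld bytes in %.2f s = %.1f MB/s\n", total, secs, total / secs / 1e6);

    free(buf);
    libusb_release_interface(h, 0);
    libusb_close(h);
    libusb_exit(ctx);
    return 0;
}

On Linux this should build with gcc usbspeed.c $(pkg-config --cflags --libs libusb-1.0), assuming libusb-1.0 is installed. Whether the device at the other end can actually source data that fast is another matter, of course.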

Thank you all again.
 