Bits and Bytes 101

lostsoul62

Member
Everyone knows that there are 8 bits to a Byte. It's been 30 years since I went to school for this, so here goes. On a dial-up modem there are 10 bits on the wire for every Byte sent, and I guess it works that way sending data over the Internet nowadays. On a hard drive you have to have a bit between every Byte, so I'm thinking there is a total of 10 bits for every Byte, or did they change something in the last 30 years?
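The 10-bits-per-byte figure the poster remembers comes from classic asynchronous serial framing: each byte is wrapped in a start bit and a stop bit before transmission. A minimal sketch of that arithmetic, assuming the common 8-N-1 configuration (8 data bits, no parity, 1 stop bit):

```python
# Async serial framing (8-N-1): 1 start bit + 8 data bits + 1 stop bit
# means 10 bits travel on the wire for every byte of payload.
START_BITS = 1
DATA_BITS = 8
STOP_BITS = 1

BITS_PER_BYTE_ON_WIRE = START_BITS + DATA_BITS + STOP_BITS  # 10

def effective_bytes_per_sec(line_rate_bps):
    """Payload bytes per second at a given line rate in bits/sec."""
    return line_rate_bps / BITS_PER_BYTE_ON_WIRE

print(BITS_PER_BYTE_ON_WIRE)            # 10
print(effective_bytes_per_sec(56000))   # 5600.0 bytes/s for a 56k modem
```

This is why dividing a modem's bit rate by 10 (rather than 8) gave a realistic throughput estimate.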
 
Some people just don't get it, and correct me if I'm wrong, but isn't it 8 bits = 1 byte, not 10? :o
 
All I know is don't be fooled by cable internet: if they say 1Mb service, they mean about 126KB/sec downloading, and 4Mb service around 504KB/sec. Always get the fastest you can afford.
 
All I know is don't be fooled by cable internet: if they say 1gb service, they mean 126kb/sec downloading, and 4gb service around 504kb/sec. Always get the fastest you can afford.

1Gb = 128MB/s. I think you meant 1Mb service.

No, they don't try to fool you; they tell the absolute truth. I pay for 50Mb/s, and I get 50Mb/s. The important bit (lol) is that b. A MB is different to a Mb: MB = Megabyte, Mb = Megabit.

To convert from a Gigabit to a Gigabyte, divide by 8 (8 bits in a byte), so my max download speed is actually 6.25MB/s.
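The divide-by-8 conversion above can be sketched in a couple of lines, using the 50 Mb/s plan mentioned in the post:

```python
# Converting an advertised line speed (megabits/sec) into download
# speed (megabytes/sec): 8 bits per byte, so divide by 8.
def mbit_to_mbyte(mbit_per_sec):
    return mbit_per_sec / 8

print(mbit_to_mbyte(50))  # 6.25 MB/s, matching the figure in the post
```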

To OP: there are 8 bits to a byte, not 10; you can't change that, it is fundamental. Computers don't, and can't, work in 10s. Because they work in binary (on and off, power or no power), they have to work in powers of 2.

1
2
4
8
16
32
64
128
256
512
1024
2048

etc

You can combine them, which is why you can get video cards with 768MB of memory (512 + 256), and why hard drive makers count 1000MB as a GB rather than 1024MB, which is the true GB.
 
Techie

I know this is a techie question and only about one in ten might know it. As I tried to say, 8 bits = 1 Byte and everyone knows it. Everyone knows that 1 kilobit = 1024 bits. When you send data anywhere you will be sending 9+ bits for every Byte, and I guarantee that. Are we on the same page? Re-read my original post.
 
I see what you're trying to say. In networking, it is more realistic to go 1/10 instead of 1/8 to estimate true bandwidth, due to overheads, loss, etc. In terms of storage it won't work like that, because it's storage, not networking.
 
I pay for 1Mb (actually megabits, but seeing "mb" one could be fooled into thinking it was megabytes) and I get 126KB/sec. I wish it was 1MB/sec.
 
I know this is a techie question and only about one in ten might know it. As I tried to say, 8 bits = 1 Byte and everyone knows it. Everyone knows that 1 kilobit = 1024 bits. When you send data anywhere you will be sending 9+ bits for every Byte, and I guarantee that. Are we on the same page? Re-read my original post.

It would depend on the transfer method. Some will still use 8; with parity, for instance, the network or hardware (more likely hardware) knows the scheme. Say you are using odd parity: 10011010 has an even number of 1 bits, therefore it must have been sent incorrectly, and the receiver will send a request for the packet to be resent.
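The odd-parity check described above can be sketched in a few lines. Under odd parity, the transmitted bits (data plus parity bit) must contain an odd number of 1s, so the byte from the post fails the check:

```python
# Odd-parity check: the received bits must contain an odd number of 1s.
# 10011010 has four 1 bits (even), so under odd parity the receiver
# flags it as corrupted and asks for a retransmission.
def odd_parity_ok(bits: str) -> bool:
    return bits.count("1") % 2 == 1

print(odd_parity_ok("10011010"))  # False -> error, request a resend
print(odd_parity_ok("10011011"))  # True  -> passes the check
```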

@cabinfever Not to go all nostalgic, but I remember upgrading to 64 bit and (this was insane at the time) 128 bit. I mean, I was so damn excited at getting 16kb/s speeds. In the space of what, 10 years or so, now I'm on 500 times that speed; it is just wow. Only 3 years ago or so I was on the 1Mb/s you are on about (actually just over, it was 10mb/s), and that was the biggest shock to the system, to finally see MB on the downloads.
 