Stormblood said:
...@Stormblood: Small mistake. Let me fix it, especially since I'm sort of a Grammar Nazi myself and I'm punishing myself for the slip...
So to fix my mistake: 1 billion bits = 1 gigabit.
1 gigabit, or 1 billion bits (1,000,000,000), is the advertised speed of Giganet, IF you even get near it. A gibibit, on the other hand, is 2^30 = 1,073,741,824 bits. The remaining 73,741,824 bits (about 74 megabits, or roughly 70 mebibits) are
usually eaten up by bandwidth overhead. In the end you're lucky to ever hit the full bandwidth, since that figure usually comes from a synthetic benchmark. Not that you DON'T get good speeds, it's just that the infrastructure sometimes can't deliver them. Some people have Giganet at home and reach 700, 800, 900, maybe even slightly above 1 Gig if the bandwidth overhead isn't bad.
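If you want to sanity-check the gigabit vs. gibibit math, here's a minimal Python sketch (the constant names are just mine for illustration, not from any library):

```python
# Gigabit (SI, base 10) vs. gibibit (IEC, base 2)
GIGABIT_BITS = 10**9   # 1,000,000,000 bits
GIBIBIT_BITS = 2**30   # 1,073,741,824 bits

diff = GIBIBIT_BITS - GIGABIT_BITS
print(f"difference: {diff:,} bits")                # 73,741,824 bits
print(f"          ~ {diff / 10**6:.1f} megabits")  # ~73.7 Mb
print(f"          ~ {diff / 2**20:.1f} mebibits")  # ~70.3 Mib
```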
Anyways...
1,000,000,000 divided by 8 is 125,000,000 bytes, or 125 megabytes. Using the full gibibit range instead, 1,073,741,824 / 8 = 134,217,728 bytes, or about 134 megabytes (128 mebibytes). The remaining 9,217,728 bytes, roughly 9 megabytes, are
usually used for bandwidth overhead.
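And a similar sketch for the bits-to-bytes conversion, again with made-up constant names:

```python
BITS_PER_BYTE = 8
GIGABIT_BITS = 10**9
GIBIBIT_BITS = 2**30

si_bytes = GIGABIT_BITS // BITS_PER_BYTE    # 125,000,000 bytes (125 MB)
bin_bytes = GIBIBIT_BITS // BITS_PER_BYTE   # 134,217,728 bytes (128 MiB, ~134 MB)
print(f"{si_bytes:,} vs {bin_bytes:,} bytes")
print(f"difference: {bin_bytes - si_bytes:,} bytes")  # 9,217,728 (~9.2 MB)
```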
So there, thank you Stormblood for clearing up the mistake. I didn't see it at first, but I noticed it after re-reading the post two or three times.
But again the question becomes: "Why isn't 4G capable of such things? Surely we haven't maxed out the tech?"