If you pick up any computer book or dictionary and look up the term gigabyte, you will find it defined as “1 gigabyte is equal to 1024 megabytes”. Similarly, 1 megabyte is defined as being equal to 1024 kilobytes, and 1 kilobyte as equal to 1024 bytes. That’s what we have been taught and that’s what we believe to be true. But that definition of gigabyte, megabyte and kilobyte changed nearly 8 years ago.
Traditionally, one gigabyte has been defined as 1024³ bytes, that is, 1,073,741,824 or 2³⁰ bytes. This is the definition commonly used for computer memory and file sizes. Then in December 1998 the International Electrotechnical Commission (IEC), the leading international organization for worldwide standardization in electrotechnology, introduced new symbols and prefixes for binary multiples and revised the meaning of the earlier ones. According to the new definitions, one gigabyte no longer equals 1024³ bytes but 1000³ bytes, and 1024³ bytes is now represented by a new term, the gibibyte. The new prefixes for measurement of bytes are shown in the table below.
That explains one common anomaly in size measurement that we notice every day, namely in the measurement of hard disk size. You go and buy a 160GB hard disk, but when you plug it into your computer and turn it on, you find your operating system reports only 149GB. That’s because hard disk manufacturers no longer use the old convention for measuring sizes but the new one. When a hard disk is labeled 160GB it has a capacity of 160 × 1000³ bytes and not 160 × 1024³ bytes because, as explained above, 1GB no longer equals 1024³ bytes. So a 160GB hard disk has a capacity of 160 × 1000³ / 1024³ ≈ 149 GiB (gibibytes).
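A quick Python sketch (my own illustration, not from the article) makes the 160GB arithmetic concrete:

```python
# Convert a marketed (decimal) hard disk capacity into the binary
# units that operating systems of the era reported.

def marketed_gb_to_gib(gb: float) -> float:
    """Marketed gigabytes (1000^3 bytes) expressed in gibibytes (1024^3 bytes)."""
    return gb * 1000**3 / 1024**3

print(round(marketed_gb_to_gib(160), 2))  # 149.01 -- the "missing" ~11 GB
```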
The IEC binary naming convention, however, is not widespread, and most publications, computer manufacturers and software companies prefer to use the traditional units. For instance, memory (RAM) manufacturers continue to use the old naming convention, so when you buy 1GB of memory you get 1024³ bytes of RAM. There is, however, a good explanation for this discrepancy. Computer memory is addressed in base 2, due to its design, so memory size is always a power of two. It is thus convenient to work in binary units for RAM. Other computer measurements, like storage hardware size, data transfer rates, clock speeds, operations per second, etc., do not have an inherent base, and are usually presented in decimal units.
To add to the confusion, different software as well as hardware manufacturers use different units of measurement. Examples of software that use IEC standard prefixes (along with standard SI prefixes) include the Linux kernel, GNU Core Utilities, Launchpad, GParted, ifconfig, Deluge (BitTorrent client), and BitTornado. Other programs like fdisk and apt-get use SI prefixes with their decimal meaning.
Screenshots of Gnome Partition Editor and Windows Disk Management show how different software utilities use different measurements for hard disk capacities.
Floppy disks, confusingly, use a mixed convention. A “1.44MB” floppy disk actually holds 1440 × 1024 = 1,474,560 bytes, which is neither 1.44 million bytes nor 1.44 MiB; it is about 1.41 MiB, and after filesystem overhead the operating system reports a formatted capacity of roughly 1.38MB.
CD capacities are always given in binary units. A “700 MB” (or “80 minute”) CD has a nominal capacity of about 700 MiB (approx 734MB). But DVD capacities are given in decimal units. A “4.7 GB” DVD has a nominal capacity of about 4.38 GiB.
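These mixed conventions are easy to verify with a few lines of Python (an illustrative sketch of my own):

```python
MiB = 1024**2  # mebibyte
GiB = 1024**3  # gibibyte

# "1.44 MB" floppy: 1440 binary kilobytes, a decimal/binary hybrid
floppy_bytes = 1440 * 1024
print(floppy_bytes)                  # 1474560 bytes
print(round(floppy_bytes / MiB, 2))  # 1.41 (MiB)

# "700 MB" CD: actually 700 binary megabytes
print(700 * MiB)                     # 734003200 bytes, i.e. ~734 decimal MB

# "4.7 GB" DVD: decimal gigabytes
print(round(4.7e9 / GiB, 2))         # 4.38 (GiB)
```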
Network speeds, in contrast, are quoted in decimal units. A 1Mbps internet connection has a throughput of 1,000,000 bits per second, which is 125 kB (approx 122 KiB) per second, assuming an 8-bit byte and no overhead.
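The line-speed arithmetic, sketched the same way (again my own illustration):

```python
# A "1 Mbps" link is specified in decimal units: 10^6 bits per second.
bits_per_second = 1_000_000
bytes_per_second = bits_per_second // 8   # assuming 8-bit bytes, no overhead
print(bytes_per_second)                   # 125000 B/s = 125 kB/s
print(round(bytes_per_second / 1024, 1))  # 122.1 KiB/s
```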
All this ambiguity in measurement has led to consumer confusion, and there have actually been two significant class action lawsuits against digital storage manufacturers by consumers. One case involved flash memory and the other involved hard disk drives. Both were settled with the manufacturers agreeing to clarify the storage capacity of their products on the consumer packaging. But most hard disk manufacturers still continue to use decimal prefixes to identify capacities, with no mention of capacities in terms of gibibytes.
Prefixes for binary multiples

Name        Symbol    Value in bytes                Decimal counterpart
kibibyte    KiB       2¹⁰ = 1,024                   kilobyte (kB) = 10³
mebibyte    MiB       2²⁰ = 1,048,576               megabyte (MB) = 10⁶
gibibyte    GiB       2³⁰ = 1,073,741,824           gigabyte (GB) = 10⁹
tebibyte    TiB       2⁴⁰ = 1,099,511,627,776       terabyte (TB) = 10¹²
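The gap between the binary and decimal readings of each prefix widens at every step, which is why the discrepancy is barely noticeable for kilobytes but glaring for terabyte-class disks. A small Python sketch (my own illustration):

```python
# How much larger each binary (IEC) prefix is than its decimal (SI) namesake.
names = ["kilo/kibi", "mega/mebi", "giga/gibi", "tera/tebi"]
for power, name in enumerate(names, start=1):
    gap_percent = (1024**power / 1000**power - 1) * 100
    print(f"{name}: binary is {gap_percent:.1f}% larger")
# kilo ~2.4%, mega ~4.9%, giga ~7.4%, tera ~10%
```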
Technically there are still 1024 MB in a GB. The binary base system has not changed. The real reason why you see that 1000 MB is equal to 1 GB is that ISPs and telephony companies have been using it for years to simplify their marketing to end users. It also means that they can cap data usage and save themselves a little money.
But, realistically, given that the IEC can’t actually change the structure of binary computing (which has remained the same for decades now), it’s just a way to simplify things for people and cut out the lawsuits.
It has led to more confusion actually, because there is no uniformity in measurement. For example, my ISP measures 1GB as 1024MB, whereas technically in data communication 1GB should be 1000MB.
I agree with Steve…. Considering binary computing, 1GB = 1024MB….
Huh… Will have to dig some more and find out for real.
I really can’t understand how these people have changed something completely tied to the hardware. I couldn’t imagine some people in the cloth business changing the way we measure a meter or an inch… This is all about confusing the customer and making them believe that they bought a 500 GB hard drive when it’s only 466…
It’s easy to understand. Calculating a hard drive’s capacity in base 10 makes it larger than if you calculated it in base 2.
Which would you expect Joe Sixpack to buy when they go into a Best Buy or Wal-Mart looking for another hard drive: the 500GB or the 466GB for the same price?
kaushik, I think you have that backwards. ISPs (in general) use the SI prefixes like the HD makers, as powers of ten. In a non-ISP setting, 1 kilobit will almost always mean 1024 bits.
RAM makers sell their chips (the actual chip form, not what you put in your RAM slot) by the Mb or Gb, using powers of two. When you get a RAM stick, it will be in MB or GB, using powers of two.
While HD makers sell in sizes of powers of 10, what do they think is more correct? Well, Western Digital recently settled a class action lawsuit about that issue, essentially admitting that what they claim is “500GB” is not really 500GB.
According to SI, IEC is full of it and “MB” means millions of bels, not bytes. Just because an international standards organization says so, doesn’t make it so… (de jure standard vs. de facto standard)
The problem is that the “solution” proposed by that group (and advocated on wikipedia, which is NOT a reliable source) creates (as mentioned before) MORE confusion. The definition has been simple since the early days of computing. 1 KB = 1,024 Bytes (etc.)
There are real and necessary reasons why this is so. The IEC clearly do not understand why this is true, and are doing everyone a disservice by attempting to REDEFINE an existing, standardized, set of abbreviations. A better solution would have been for the IEC (who only want to keep the abbreviations K, M, G … consistent) to use the new abbreviations they made up (KiB etc) for the new definitions they are advocating (where GB means 1,024 MB, just like it has for 30 years, and where GiB means 1,000,000,000 Bytes). Please note also that “SI prefixes” and “Binary prefixes” are terms made up by web authors and are not from the IEC proposal.
You should avoid linking directly to W*k*ped*a because “they” use results from Google to argue against the new binary prefixes claiming that few results mean low adoption in “The Real World (TM)”. The Google queries are defined to exclude any results with a mention of the word w*k*ped*a in them. Of course everyone loves to link to it, it’s almost as “natural” as using Google despite other options, but that gives them an easy way to argue that nobody knew these prefixes.
What’s also funny is that nobody knows what the majority of laymen actually think a kilobyte or megabyte is. The non-experts I know, even when aware of the unusual definition, still calculated with the decimal figures! So to a layman a megabyte is still roughly 1,000,000 bytes, which means the other definition is not as widely known or used as assembler coders like to claim.
“But that definition of gigabyte, megabyte and kilobyte has changed nearly 8 years ago.”
No it didn’t. It was defined in BOTH ways previously. Hard drives have always been measured in 1000s, and memory has always been measured in 1024s. Other things like floppy drives were measured in every way you can imagine (like the “1.44 MB” floppy).
What happened 8 years ago was SI said “enough of this nonsense! It means 1000!” and all the other standards agreed. So now we have SI for 1000 and IEC for 1024, and the sooner people accept this, the better.
“Well, Western Digital recently settled a class action lawsuit about that issue, essentially admitting that what they claim is “500GB” is not really 500GB.”
That’s a load of crap. Western Digital placed the blame right where it belongs: Microsoft.
In describing its HDDs, Western Digital uses the term properly. Western Digital cannot be expected to reform the software industry. Apparently, Plaintiff believes that he could sue an egg company for fraud for labeling a carton of 12 eggs a “dozen,” because some bakers would view a “dozen” as including 13 items.
1. I always thought a “1.44MB” floppy disk had 1440 x 1024 bytes = 1,474,560 bytes or 1.41MiB.
2. In communications, kilo and mega mean exactly 1,000 and 1,000,000 when combined with “per second” – the basic digital signalling rate is 64 kbps, which is derived from an 8kHz sample rate and 8-bit coding – 8,000 × 8 = 64,000.
(Confusingly though, an E1 channel combines 32 of these to give 2,048,000 bits per second and is colloquially known as a “2 meg”)…
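The arithmetic in the comment above, written out as a tiny sketch (my own illustration, not the commenter’s):

```python
# Basic digital voice channel (DS0): 8 kHz sampling x 8-bit samples
ds0_bps = 8_000 * 8
print(ds0_bps)   # 64000 bits per second, the "64 kbps" rate

# An E1 line aggregates 32 such channels
e1_bps = 32 * ds0_bps
print(e1_bps)    # 2048000 bits per second, the colloquial "2 meg"
```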
As long as i have lived (i’m 15) there have been 1024 bytes in a KB, 1024 KB in a MB and 1024 MB in a GB. Always. Full stop.
I think that computers stick mainly to conformity, and only use GiB and MiB, calling them MB and GB as far as the average Joe is concerned. The only time this issue has ever really affected me is when buying a hard drive, and it still doesn’t make a lot of difference, because what’s 12 gigs when you’ve got 500?
I think that more confusion comes (for the average user) when talking about bit rates and network line speeds, etc. People need to be educated that 1MB is not 1Mb, and that there are 8 bits in a byte.
My point proven, when I wrote 1Mb above, it came up as a spelling error, whereas 1MB did not.
It’s all Microsoft’s fault.
1GB IS EQUAL TO 1024MB
Troy 🙂 (i love u matt, hapi bday :))
Isn’t this thing just causing confusion?
Jesus, why make such a change in the first place? But failing to make it more public seems an even greater folly.
Just sell us 466GiB hard drives for the same price. Market them as "High Density^H^H^H^H^Hfinition" or something. They'll go great with our ""true"" HD monitors. Only then can we put the IEC's mistakes behind us.
It doesn't matter what you call it. 2 to the power of 10 is still 1024.