
Megabyte


The megabyte is a multiple of the unit byte for digital information. Its recommended unit symbol is MB, but MByte is also sometimes used. The unit prefix mega is a multiplier of 1,000,000 (10⁶) in the International System of Units (SI). Therefore, one megabyte is one million bytes of information. This definition has been incorporated into the International System of Quantities.

However, in the computer and information technology fields, several other definitions have arisen for historical reasons of convenience. A common usage designates one megabyte as 1,048,576 bytes (2²⁰ B), a measurement that conveniently expresses the binary multiples inherent in digital computer memory architectures. Most standards bodies have deprecated this usage in favor of a set of binary prefixes, under which this quantity is designated by the unit mebibyte (MiB). A less common convention uses megabyte to mean 1000×1024 (1,024,000) bytes.
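A minimal Python sketch (illustrative only; the variable names are my own and not part of any standard) makes the three conventions concrete:

# The three historical meanings of "megabyte", in bytes.
MEGABYTE_SI    = 10**6        # SI definition: 1 MB = 1,000,000 bytes
MEBIBYTE       = 2**20        # binary multiple: 1 MiB = 1,048,576 bytes
MEGABYTE_MIXED = 1000 * 1024  # mixed convention: 1,024,000 bytes

for name, value in [("SI megabyte", MEGABYTE_SI),
                    ("mebibyte", MEBIBYTE),
                    ("mixed convention", MEGABYTE_MIXED)]:
    print(f"{name}: {value:,} bytes")
# SI megabyte: 1,000,000 bytes
# mebibyte: 1,048,576 bytes
# mixed convention: 1,024,000 bytes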

The megabyte is commonly used to measure either 1000² bytes or 1024² bytes. The base-1024 interpretation originated as technical jargon for byte multiples that are naturally expressed in powers of 2 but lacked a convenient name. As 1024 (2¹⁰) approximates 1000 (10³), roughly corresponding to the SI prefix kilo-, it became a convenient term for the binary multiple. In 1998 the International Electrotechnical Commission (IEC) proposed standards for binary prefixes, requiring megabyte to denote strictly 1000² bytes and mebibyte to denote 1024² bytes. By the end of 2009, the IEC standard had been adopted by the IEEE, the EU, ISO and NIST. Nevertheless, the term megabyte continues to be widely used with different meanings (see the sketch below):


...
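The practical consequence of the two readings can be sketched in Python (a minimal, illustrative example; the function name report is my own):

def report(n_bytes):
    """Express one byte count under both common readings of 'megabyte'."""
    mb  = n_bytes / 10**6   # decimal interpretation: megabytes (MB)
    mib = n_bytes / 2**20   # binary interpretation: mebibytes (MiB)
    print(f"{n_bytes:,} bytes = {mb:.2f} MB = {mib:.2f} MiB")

report(25_000_000)  # 25,000,000 bytes = 25.00 MB = 23.84 MiB

# At the mega scale the two readings differ by 2**20 / 10**6 - 1,
# i.e. about 4.86%.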