Byte

Revision as of 15:43, 13 April 2007

The Hexer hex editor displaying the Linux kernel, version 2.6.20.6; the image shows the bytes that compose a program as they appear in hexadecimal notation

In computer science, a byte is a unit of data consisting of eight bits. Grouped together, bytes can hold the information that forms a document, a photograph, or a book. All data represented on a computer are composed of bytes, from e-mails and pictures to programs and files stored on a hard drive. Although the byte may appear to be a simple concept at first glance, its actual definition is more complex.

Technical definition

In electronics, information is represented by one of two states, usually referred to as 'on' and 'off'. To represent these states, computer scientists use the values 0 (off) and 1 (on); this value is called a bit.

Each byte is made of eight bits, which together can represent any number from 0 to 255. This count of possible values, 256 when zero is included, is obtained by raising the number of possible values of a bit (two) to the power of the length of a byte in bits (eight); thus, 2^8 = 256 possible values in a byte.
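This arithmetic can be checked directly; the following short Python snippet (illustrative, not part of the original article) confirms the count:

```python
# A byte has 8 bits; each bit has 2 possible states (0 or 1).
bits_per_byte = 8
possible_values = 2 ** bits_per_byte

print(possible_values)       # 256
print(0, possible_values - 1)  # a byte can hold any number from 0 to 255
```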

Bytes can be used to represent a countless array of data types, from characters in a string of text, to the assembled and linked machine code of a binary executable file. Every file, sector of system memory, and network stream is composed of bytes.

Perhaps the oldest formation of bytes was plain text: alphanumeric characters with no punctuation. To make up for the absence of basic punctuation, telegrams would often use the word "STOP" to represent a period. The numeric value assigned to each character has varied over the years. Today, however, the American Standard Code for Information Interchange (ASCII) allows data to remain readable when transmitted between different mediums, such as from one operating system to another. For instance, a user who typed a plain text document on Linux will be able to read the file correctly on a Macintosh computer. One example of ASCII's assignments is the capital letters of the English language, which range from 65 for "A" to 90 for "Z".

Endianness

For more information, see: Endianness.

Of course, since data almost always consist of more than one byte, these sequences of bytes must be arranged in a certain order for a device to read them correctly. In computer science, this ordering is called endianness. Just as some human languages are written from left to right, such as English, while others are written from right to left, such as Hebrew, bytes are not always arranged in the same order.

Suppose we are writing a program that uses the number 1024, which occupies two bytes: 0x04 and 0x00 in hexadecimal. The byte 0x04 is the most significant byte. If this byte is written first, that is, at the lowest memory address, then we are using 'big endian' order. If this byte is written last, at the highest memory address, then we are using 'little endian' order. This is typically not a problem when dealing with local system memory, since the endianness is determined by the processor's architecture. However, it can pose a problem in some instances, such as network streams. For this reason, a networking device must specify which byte order it is using before it sends any data. This ensures that the information is read correctly at the receiving end.
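The two byte orders can be demonstrated concretely; this Python sketch (an illustration added here, not part of the original article) shows 1024 laid out both ways:

```python
import sys

# 1024 in hexadecimal is 0x0400: two bytes, 0x04 (most significant)
# and 0x00 (least significant).
value = 1024

# Big endian writes the most significant byte first.
print(value.to_bytes(2, byteorder='big').hex())     # '0400'

# Little endian writes the least significant byte first.
print(value.to_bytes(2, byteorder='little').hex())  # '0004'

# The byte order of the local processor's architecture:
print(sys.byteorder)  # 'little' on x86 machines, for example
```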

Word origin and ambiguity

Although the origin of the word 'byte' is not documented with certainty, it is believed to have been coined by Dr. Werner Buchholz of IBM in 1956. It is a play on the word 'bit', and originally referred to the number of bits used to represent a character.[1] This number is usually eight, but in some cases, especially in times past, it has ranged from as few as 2 to as many as 128 bits. The word 'byte' is therefore ambiguous; for this reason, an eight-bit byte is sometimes referred to as an 'octet'.[2]

Sub-units

For more information, see: SI prefix.

Though fundamental, the byte is not the most commonly used unit of data. Because files are normally many thousands or even billions of times larger than a byte, terms designating larger quantities of bytes are used for readability. Metric prefixes are added to the word byte: kilo for one thousand bytes (kilobyte), mega for one million (megabyte), giga for one billion (gigabyte), and tera for one trillion (terabyte). One thousand gigabytes compose a terabyte, and even the largest consumer hard drives today hold only three-fourths of a terabyte (750 'gigs', or gigabytes). The rapid pace of technological advancement may make the terabyte commonplace in the future, however.
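The prefix ladder above can be sketched in code; the helper function below (`si_size`, a hypothetical name chosen for this illustration) expresses a byte count using decimal prefixes:

```python
# A minimal sketch: format a byte count with decimal (SI) prefixes,
# dividing by 1000 at each step up the ladder.
def si_size(n_bytes: float) -> str:
    for prefix in ('', 'kilo', 'mega', 'giga', 'tera'):
        if n_bytes < 1000:
            return f"{n_bytes:g} {prefix}byte(s)"
        n_bytes /= 1000
    return f"{n_bytes:g} petabyte(s)"

# A 750 'gig' consumer drive, expressed in bytes:
print(si_size(750_000_000_000))  # 750 gigabyte(s)
```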

Conflicting definitions

For more information, see: Binary prefix.

Traditionally, the computer world has used a value of 1024 instead of 1000 when referring to a kilobyte. This was done because programmers needed a power of 2, and 1024 is equal to 2^10. This practice, however, is now non-standard; the 1024-byte unit has been given the separate name 'kibibyte', abbreviated KiB. This naming scheme is known as the 'binary prefix' standard.

While the difference between 1000 and 1024 may seem trivial, the margin of error grows as the size of a disk increases. The difference between 1 TB and 1 TiB, for instance, is approximately 10%. As hard drives become larger, the need for a distinction between these two sets of prefixes will grow. This has been a problem for hard disk drive manufacturers in particular. For example, one well-known disk manufacturer, Western Digital, was recently taken to court over its use of base 10 when labeling the capacity of its drives. This is a problem because a drive labeled in base 10 holds fewer bytes than the same figure interpreted in base 2, so a consumer who assumes the binary meaning receives less capacity than expected.[3]
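The growth of this discrepancy follows directly from the definitions: each step up the ladder multiplies the ratio by another factor of 1024/1000. A short Python calculation (added here for illustration) reproduces the percentages in the table below:

```python
# The relative difference between the binary and decimal prefixes at
# step n is (1024^n - 1000^n) / 1000^n, which grows with each step.
prefixes = [('KB', 'KiB'), ('MB', 'MiB'), ('GB', 'GiB'), ('TB', 'TiB')]

for n, (si, binary) in enumerate(prefixes, start=1):
    diff = (1024 ** n - 1000 ** n) / 1000 ** n
    print(f"1 {binary} exceeds 1 {si} by {diff:.1%}")
```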

Table of prefixes

SI prefix (abbreviation)   Value    Binary prefix (abbreviation)   Value    Difference*   Difference in bytes
kilobyte (KB)              10^3     kibibyte (KiB)                 2^10     2.4%          24
megabyte (MB)              10^6     mebibyte (MiB)                 2^20     4.9%          48,576
gigabyte (GB)              10^9     gibibyte (GiB)                 2^30     7.4%          73,741,824
terabyte (TB)              10^12    tebibyte (TiB)                 2^40     10.0%         99,511,627,776
petabyte (PB)              10^15    pebibyte (PiB)                 2^50     12.6%         125,899,906,842,624
exabyte (EB)               10^18    exbibyte (EiB)                 2^60     15.3%         152,921,504,606,846,976
zettabyte (ZB)             10^21    zebibyte (ZiB)                 2^70     18.1%         180,591,620,717,411,303,424
yottabyte (YB)             10^24    yobibyte (YiB)                 2^80     20.9%         208,925,819,614,629,174,706,176

*Increase, rounded to the nearest tenth

Related topics

References

  1. Dave Wilton (2006-04-08). Wordorigins.org: bit/byte.
  2. Bob Bemer (accessed 2007-04-12). Origins of the Term "BYTE".
  3. Nate Mook (2006-06-28). Western Digital Settles Capacity Suit.