bit (EN)

GC: n

CT: A bit (short for binary digit) is the smallest unit of data in a computer. A bit has a single binary value, either 0 or 1. Although computers usually provide instructions that can test and manipulate bits, they are generally designed to store data and execute instructions in bit multiples called bytes. In most computer systems, there are eight bits in a byte. The value of a bit is usually stored as an electrical charge either above or below a designated level in a single capacitor within a memory device.
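The bit-test and bit-manipulation instructions mentioned above can be illustrated with bitwise operators; a minimal Python sketch (function names are illustrative, not standard):

```python
def get_bit(value, n):
    """Return the value (0 or 1) of bit n in an integer."""
    return (value >> n) & 1

def set_bit(value, n):
    """Return a copy of value with bit n set to 1."""
    return value | (1 << n)

def clear_bit(value, n):
    """Return a copy of value with bit n cleared to 0."""
    return value & ~(1 << n)

flags = 0b00000101                 # one byte: eight bits
print(get_bit(flags, 2))           # 1
print(bin(set_bit(flags, 1)))      # 0b111
print(bin(clear_bit(flags, 0)))    # 0b100
```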
Half a byte (four bits) is called a nibble. In some systems, the term octet is used for an eight-bit unit instead of byte. In many systems, four eight-bit bytes or octets form a 32-bit word. In such systems, instruction lengths are sometimes expressed as full-word (32 bits in length) or half-word (16 bits in length).
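The nibble, octet, full-word and half-word divisions above can be sketched with shifts and masks; a small Python illustration (the values are arbitrary examples):

```python
byte = 0xB7                       # one eight-bit byte (an octet)
high_nibble = byte >> 4           # upper four bits: 0xB
low_nibble = byte & 0x0F          # lower four bits: 0x7

word = 0x12345678                 # a 32-bit full word: four octets
upper_half = word >> 16           # 16-bit half-word: 0x1234
lower_half = word & 0xFFFF        # 16-bit half-word: 0x5678
print(hex(high_nibble), hex(low_nibble), hex(upper_half), hex(lower_half))
```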
In telecommunication, the bit rate is the number of bits that are transmitted in a given time period, usually a second.
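The bit-rate definition above is a simple ratio of bits to time; a hedged Python sketch with made-up transfer figures:

```python
bits_transmitted = 4_000_000      # hypothetical transfer: four million bits
elapsed_seconds = 2.0

bit_rate = bits_transmitted / elapsed_seconds   # bits per second
megabits_per_second = bit_rate / 1_000_000
print(bit_rate, megabits_per_second)            # 2000000.0 2.0
```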

S: http://whatis.techtarget.com/definition/bit-binary-digit (last access: 30 April 2016)

N: 1. Computerese word, a 1948 abbreviation of binary digit coined by U.S. computer pioneer John W. Tukey (1915-2000), probably chosen for its identity with bit ("small piece," c. 1200). The related Old English bite "act of biting" and bita "piece bitten off" are probably the source of the modern senses "boring piece of a drill" (1590s), "mouthpiece of a horse's bridle" (mid-14c.), and "a piece bitten off, morsel" (c. 1000).
2. Bit, in communication and information theory, a unit of information equivalent to the result of a choice between only two possible alternatives, as between 1 and 0 in the binary number system generally used in digital computers. The term is shortened from the words “binary digit.” It is also applied to a unit of computer memory corresponding to the ability to store the result of a choice between two alternatives.
3. Connection speeds and data sizes are measured differently, but people tend to refer to them with the same names. People often say "megs" and forget that the word "meg" can refer to two very different values. Do they mean megabits or megabytes? Aren't they the same?
Actually, no: there is a big difference between a bit and a byte. A byte is much bigger: eight times bigger, in fact, with eight bits in every byte. By extension, there are eight megabits in every megabyte, and 1 gigabyte is 8 times bigger than 1 gigabit.
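The eight-to-one conversion described above can be checked with a one-line helper; a minimal Python sketch (the function name and the "100 meg" figure are illustrative):

```python
BITS_PER_BYTE = 8

def megabits_to_megabytes(megabits):
    """Convert a size or speed in megabits to megabytes."""
    return megabits / BITS_PER_BYTE

# A "100 meg" broadband line is 100 megabits per second,
# which downloads only 12.5 megabytes of data each second.
print(megabits_to_megabytes(100))   # 12.5
```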
4. Cultural Interrelation: Have you ever wondered why a United States quarter-dollar is called “two-bits”? Or, a half-dollar “four-bits”? Do you know why we call our basic monetary unit “dollar” instead of something else?
Two-bits, four-bits, six-bits and eight-bits refer to the eight-reales silver coin of New Spain and Mexico. Also called the piece of eight, it circulated in the English colonies and freely in the USA following the Revolutionary War. As a matter of fact, the eight-reales coin was legal tender in the United States until 1857 and was at one time the world's most used coin. It is the renowned piece of eight that became part of the pirate lore of the Spanish Main.

S: 1. OED - http://www.etymonline.com/index.php?term=bit (last access: 30 April 2016). 2. EncBrit - http://global.britannica.com/technology/bit-communications (last access: 30 April 2016). 3. http://www.uswitch.com/broadband/guides/bits_and_bytes_explained/ (last access: 30 April 2016). 4. http://www.coinlink.com/News/world-coins/history-of-coins-two-bits-four-bits-six-bits-eight/ (last access: 30 April 2016).


CR: byte (EN), computer science.
