A 'bit' is a contraction of 'binary digit'. It is the smallest possible unit of information in digital computing. Computers do not use decimal numbers to store data; all data is stored as binary numbers, because computer hardware is built on binary digital logic. Every bit can take only one of two values: 0 or 1. In digital telecommunication too, voltage levels are converted into binary data, or bits.
The origin of the term 'binary digit', or 'bit', is attributed to John Tukey, a scientist at Bell Laboratories, who first used it in 1947. Since then, the term has been in use throughout the world of computing. A byte is a string of 8 bits put together, and is therefore a bigger unit of information than a bit.
The term 'byte' was coined by Dr. Werner Buchholz, a computer scientist working at IBM, in 1956.
Just as the decimal number system is based on ten digits (0 through 9), the binary number system is based on just two: 0 and 1. All the data that a computer processes is in the form of 0s and 1s. In digital communication, these bits are represented by two distinct voltage levels.
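To make this concrete, here is a minimal Python sketch (not part of the original post) showing the same value written in decimal and in binary:

    number = 42                  # a decimal number
    binary = bin(number)         # '0b101010' -- the same value in binary
    back = int(binary, 2)        # converting the binary string back to decimal
    print(number, binary, back)  # prints: 42 0b101010 42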
A computer converts all data into bits and bytes through alphanumeric-to-binary and decimal-to-binary conversion. So for the computer, letters and numbers are all represented in bits. That is, a bit is a letter of the computer language, while a byte is a word made up of 8 letters! Speaking the machine language, or digital language, is speaking in bits and bytes. Interestingly, a four-bit binary word is called a 'nibble', because it is half a byte!
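As a rough illustration (a minimal Python sketch, with 'A' chosen as an arbitrary example letter), here is one letter expressed as a byte of 8 bits, split into its two nibbles:

    letter = 'A'
    code = ord(letter)              # 65 -- the letter's numeric (ASCII) code
    byte = format(code, '08b')      # '01000001' -- the same value as 8 bits
    high, low = byte[:4], byte[4:]  # two nibbles, half a byte each
    print(letter, code, byte)       # prints: A 65 01000001
    print(high, low)                # prints: 0100 0001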
There are various instances where the terms 'bit' and 'byte' are used. You must have come across them when checking the capacity of data storage devices or the bandwidth of your Internet connection. The capacity of a computer hard disk is usually given in gigabytes (abbreviated GB). A gigabyte is a billion bytes, or eight billion bits. Data transfer rates, on the other hand, are usually quoted in bits. The Internet is an ocean of bits and bytes.
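The arithmetic behind those figures is simple enough to check (a Python sketch using the decimal definition of a gigabyte, 10**9 bytes; some contexts use the binary 2**30 instead):

    bytes_per_gb = 10**9            # one billion bytes
    bits_per_gb = bytes_per_gb * 8  # eight billion bits
    print(f"{bytes_per_gb:,} bytes = {bits_per_gb:,} bits")
    # prints: 1,000,000,000 bytes = 8,000,000,000 bits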
Computer processors come in two common varieties: 32-bit and 64-bit. The number denotes the amount of data the chip can process, or read, at a time. Internet bandwidth, meanwhile, is measured in kilobits per second (kbps) or megabits per second (Mbps), that is, in bits rather than bytes.
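The bits-versus-bytes distinction matters when estimating download times. Here is a small Python sketch; the connection speed and file size below are made-up example values:

    speed_mbps = 100             # advertised speed: 100 megabits per second
    speed_mBps = speed_mbps / 8  # 12.5 megabytes per second
    file_size_mb = 500           # a 500-megabyte download
    seconds = file_size_mb / speed_mBps
    print(f"{speed_mbps} Mbps = {speed_mBps} MB/s")
    print(f"A {file_size_mb} MB file takes about {seconds:.0f} seconds")
    # prints: 100 Mbps = 12.5 MB/s
    # prints: A 500 MB file takes about 40 seconds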