Human Thought Process
Before progressing to how computers process information, an example of the way humans process numbers is in order. No, I don't claim to know the intricacies of the human brain, and I will bypass the fields of psychology, music, and art. Computers are mathematical creatures; therefore, I must reference how humans think of and view basic math, as well as the universal mathematical system we are accustomed to.
The Decimal System
Millennia before the invention of the wheel or the first spark ignited the tip of a tree branch, cavemen and cavewomen needed a means to count their possessions. The Neanderthals required a way to inventory their boar skins and elephant tusks. Behold, the answer was right at their fingertips and pointed to a way of expanding their future. The cave-dwellers discovered that one could correlate between the number of artifacts they owned and their fingers. (I still do this when faced with a difficult calculation, such as adding two apples.) Unfortunately, a problem arose. The Paleolithic era must have had a capitalist system of government, and the cavemen (and women; I am not a sexist, I am a scientist) accumulated more inventory than they had fingers.
Once our ancestors ran out of fingers, they must have discovered that toes were too difficult to continue their counting. Therefore, our distant relatives had to repeat the process of counting beyond the limitations their hands provided them. The single digits from zero to nine had to be depicted beyond ten fingers. (If gloves had been created before numbers, the mathematics world as we know it would be very different.) As an advanced society of evolved beings, we carry the system of ten onward. We count to 9 and repeat the process by increasing the number of digits: one-zero, one-one, one-two, and so on. After nine-nine we move on to three-digit numbers: one-zero-zero, one-zero-one, yada yada yada, ad nauseam. This is what we know today as the decimal system, or what mathematicians refer to as the base 10 system.
Written in Stone
Of course, math is not done on fingers anymore (except by a small group of people including myself). Writing media such as chisel and stone, feather and paper, and finally computers made their way into the human race. When we perform math we tacitly perform our calculations from right to left, yet we verbalize numbers from left to right. We will later learn that computers are no exception. One million, one hundred thousand and one would not be said as "one, one hundred thousand and million." That would probably sound like verbal poison. Yet when we add to our fortune (hopefully), we place the extra amount at the rightmost digit: "one million, one hundred thousand and two."
The rightmost digit is the 1's placeholder. It goes from 0 to 9, and then we must carry over to the next placeholder, the 10's column. Subsequently the 100's column is filled until it reaches its limit, the number nine, and then, as my teacher always said, we "carry the one." Every placeholder has its name, as we see in the chart below.
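The carry-over process above can be sketched in a few lines of Python (a toy illustration of my own, not how any real hardware counts; the function name `placeholders` is just a name I chose):

```python
# Break a number into its decimal placeholders, rightmost first.
def placeholders(n):
    digits = []
    while n > 0:
        digits.append(n % 10)  # the digit in the current 1's/10's/100's... column
        n //= 10               # everything else carries to the next column
    return digits

print(placeholders(1_100_002))  # [2, 0, 0, 0, 0, 1, 1]
```

Reading the list left to right gives the columns from the 1's place outward: a 2 in the ones column, then zeros, then a 1 in the hundred-thousands and a 1 in the millions column.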
An alternate view of this concept is powers (or exponents). What exponents does the decimal system use? Incremental powers of the base number ten.
|10^6||10^5||10^4||10^3||10^2||10^1||10^0 = 1|
The rightmost digit in the base 10 system is the 1's placeholder, since ten raised to the zero power is one. The proof for this is beyond the scope of this article, so, just as with my firm belief that Santa Claus exists, I will ask you to take it on faith.
First digit from the right: The one's place 10^0 = 1 (the '^' symbol, also known as the caret or hat, is often used to depict exponents)
Second digit from the right: The ten’s place 10^1 = 10
Third digit from the right: The hundred’s place 10^2 = 100
Fourth digit from the right: The thousand’s place 10^3 = 1,000
Fifth digit from the right: The ten-thousands place 10^4 = 10,000
Sixth digit from the right: The hundred-thousand’s place 10^5 = 100,000
Seventh digit from the right: The millions place 10^6 = 1,000,000
This pattern goes on to infinity. When we raise the number ten to any power, the exponent and the number of zeros correspond, e.g. 10^9 = 1,000,000,000.
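The place-value idea can be verified directly: rebuilding a number from its digits, with each digit multiplied by ten raised to its position from the right, gives the number back. A minimal Python check of my own:

```python
# Rebuild a number from its decimal digits using powers of ten:
# each digit is multiplied by 10 raised to its position from the right.
number = 1_100_002
total = sum(int(d) * 10 ** i for i, d in enumerate(reversed(str(number))))
print(total)           # 1100002
print(total == number)  # True
```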
How Computers Think
Now that we know how we humans count, we can more easily comprehend how computers understand numbers.
We can think of a computer as a creature with only two fingers. As a reminder, counting in the scientific world always begins with zero. A computer's digits, as in any number system, are evaluated starting with the rightmost bit (placeholder). Let's look at the number one million again, but this time in binary.
1111 0100 0010 0100 0000
Notice how we now used twenty digits to represent the number one million. One can see the problem this would create if one were to count money: they may think they are Bill Gates instead of just a humble millionaire. Therefore, one has to notate the base with a subscript, in this case '2', to let the world know that you can only afford a house instead of the Taj Mahal: 1111 0100 0010 0100 0000₂ = 1,000,000₁₀
Notice that in the binary number for one million I separated the digits into units of four. The reason for this is twofold. First, it makes the number more human-readable. Second, computers see information in bits. Each zero or one is a bit. Eight bits are a byte; therefore, we divide into half-bytes (known as nibbles) because it lends itself to another base system known as hexadecimal.
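Both the binary form and the nibble grouping can be reproduced in a few lines of Python, a sketch of my own rather than anything from the article:

```python
# Produce the binary form of one million and group it into nibbles
# (4-bit units) for readability.
n = 1_000_000
bits = format(n, 'b')                        # '11110100001001000000', 20 bits
bits = bits.zfill((len(bits) + 3) // 4 * 4)  # pad to a whole number of nibbles
nibbles = [bits[i:i + 4] for i in range(0, len(bits), 4)]
print(' '.join(nibbles))  # 1111 0100 0010 0100 0000
```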
This is, of course, quite cumbersome, and when computer scientists want a shorthand, what do they do? Select a different base, naturally.
How a Scientist Thinks
Hexadecimal, or base 16, can commonly be seen if one looks closely at a computer screen when a memory error pops up, or when one is choosing a hex color in Photoshop or CSS.
The hexadecimal digits are as follows: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, A (10), B (11), C (12), D (13), E (14), F (15).
If you happen to someday meet an alien from another planet, shake her hand and, upon closer inspection, discover eight fingers on each hand, her world is probably on the hex system.
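The CSS hex colors mentioned above are a good everyday example of base 16: a color is just three two-digit hexadecimal numbers packed side by side. A quick sketch (the color value here is an arbitrary example of my own, not one from the article):

```python
# A CSS hex color is three base-16 numbers packed side by side:
# two hex digits each for red, green, and blue.
color = "F4A261"  # arbitrary example value
r, g, b = (int(color[i:i + 2], 16) for i in range(0, 6, 2))
print(r, g, b)  # 244 162 97
```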
Let’s look one last time at our number one million to illustrate why hexadecimal has become such a useful shorthand notation.
1111 0100 0010 0100 0000₂
Notice how each nibble in the binary number corresponds beautifully to a single hexadecimal digit: one million in hex is F4240₁₆.
|All 4 bits on (F)||4's bit on (4)||2's bit on (2)||4's bit on (4)||All bits off (0)|
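This nibble-to-digit correspondence can be checked in Python (my own verification sketch):

```python
# Each 4-bit nibble maps to exactly one hexadecimal digit.
for nibble in ['1111', '0100', '0010', '0100', '0000']:
    print(nibble, '->', format(int(nibble, 2), 'X'))
# 1111 -> F, 0100 -> 4, 0010 -> 2, 0100 -> 4, 0000 -> 0

print(format(1_000_000, 'X'))  # F4240
```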
This is a very simple view of how computers do basic math. It can get far more complicated than this, but this is how it all begins. A computer is built of zeros and ones, or better yet, higher voltage representing the 1's and lower voltage representing the 0's. At the very core of your computer, tablet or cellphone there is only high and low voltage, with no middle ground. This is simpler than keeping track of ten different voltage levels. The binary system makes the present architecture much simpler and more efficient.
In future articles I will delve deeper into the very core of computer architecture: logic gates, which determine when a bit is considered on or off. These are circuits built from silicon, and they use Boolean algebra to determine numerical outcomes at the core level.
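As a tiny preview of that future article, the Boolean operations that gates compute can be mimicked with Python's bitwise operators. This is my own sketch, not circuitry; the half adder shown is the classic minimal example of gates doing arithmetic:

```python
# Boolean algebra on single bits, as logic gates compute it.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

# A half adder: adds two bits, producing a sum bit and a carry bit.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): one plus one is binary 10
```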
Of course, the next generation of computing will be quantum computing, in which, in gross oversimplification, a bit can be zero, one, or a blend of both at once, but that too will be for a future article.