At the heart of every personal computing device used today is a microprocessor. The microprocessor, also known as the CPU of a device, performs a device's computations on a single tiny chip. It does this with digital logic, operating on numbers and symbols represented in the binary system. Microprocessors use electrical signals to carry information: pulses of electricity represent everything you see on your computer. They do this by using logic gates, which take a number of inputs and output a 0 or 1 based on a predetermined rule. Although one gate can't do much on its own, millions of gates together can achieve quite a lot, and today's microprocessors contain hundreds of millions of logic gates and can perform millions of operations simultaneously. The design of the microprocessor is also what allows it to pack this vast amount of processing power into such a small space. Each microprocessor contains an arithmetic logic unit (ALU), a control unit, bus systems, and a clock.
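To make the logic-gate idea concrete, here is a minimal sketch in Python of a few gates wired into a half adder, the circuit that adds two single bits; the function names are purely illustrative.

    # A minimal sketch of how logic gates combine into useful circuits.
    # Each gate maps input bits (0 or 1) to a single output bit; wiring a
    # few of them together forms a half adder, which adds two bits.

    def AND(a, b): return a & b
    def OR(a, b):  return a | b
    def NOT(a):    return 1 - a

    def XOR(a, b):
        # XOR built only from the gates above
        return AND(OR(a, b), NOT(AND(a, b)))

    def half_adder(a, b):
        """Add two 1-bit numbers, returning (sum, carry)."""
        return XOR(a, b), AND(a, b)

    for a in (0, 1):
        for b in (0, 1):
            print(a, "+", b, "=", half_adder(a, b))

Real processors chain enormous numbers of such adders and related circuits together in silicon rather than software, but the principle of combining simple gates into useful operations is the same.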
However, before the first microprocessors were invented in the early 1970s, other early technologies were used to implement logic functions, many of them driven by the need for faster, more capable computing during World War II. Compared with today's processors, though, these early technologies were costly, slow, and prone to breakdowns and other technical problems.
The origins of the microprocessor go back to 1958-59, when Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor independently produced the first integrated circuits. Two years later both Texas Instruments and Fairchild were selling integrated circuits on the open market. In 1968 the first commercial complementary metal-oxide-semiconductor (CMOS) circuits were released, and after that there was no doubt that the technology would evolve rapidly. This can be seen in the fact that the first integrated circuits of the 1960s contained only a few transistors per chip, but by the beginning of the 1970s a single chip could contain thousands of transistors.
While there has been some uncertainty in the past over who first invented the microprocessor, today Intel, and more specifically Federico Faggin, Stanley Mazor, and Ted Hoff, are credited with the invention. Their product, the Intel 4004, was first released in November of 1971. The 4004 was the world's first general-purpose microprocessor, meaning that it could be used in any type of device. Bit width was, and still is, the way the power of a microprocessor is measured. The 4004 was a 4-bit processor, which seems like nothing compared with today's 64-bit processors. A 4-bit processor operates on 4 bits at a time, which can represent 16 different values. The 4004 also took about eight clock cycles per instruction, which means it could execute around 92,600 instructions per second. That equates to performing addition on two eight-digit numbers in roughly 850 microseconds, or about 1,200 such additions per second.
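As a rough check of those figures, the arithmetic works out as follows, assuming the 4004's roughly 740 kHz clock (a number not stated above):

    # Back-of-the-envelope check of the 4004 figures.
    clock_hz = 740_000            # ~740 kHz clock (assumed, not from the text)
    cycles_per_instruction = 8    # about eight clock cycles per instruction

    instructions_per_second = clock_hz / cycles_per_instruction
    print(instructions_per_second)    # ~92,500, close to the 92,600 quoted above

    # A 4-bit value can represent 2**4 distinct patterns.
    print(2 ** 4)                     # 16

    # Adding two eight-digit numbers in ~850 microseconds works out to
    # roughly 1 / 0.00085 such additions per second.
    print(round(1 / 850e-6))          # ~1,176, i.e. on the order of 1,200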
This speed was incredibly fast for the time, and the 4004 was first used by a Japanese company, Busicom, in its calculators. Interestingly enough, Busicom was the company that had originally asked Intel to create the 4004. Busicom even owned the rights to the design but failed to take advantage of it and sold them back to Intel for $60,000. However, 4-bit microprocessors were just the beginning for the tech world and for Intel.
Throughout the early years of microprocessor technology, Intel was one of the frontrunners, and only one year after the release of the 4004 it released the Intel 8008. The 8008 was the world's first 8-bit microprocessor, and although it was sometimes slower than the 4004, it beat its predecessor in many other categories, such as the size of the internal CPU stack, the amount of memory supported, and the efficiency of its instruction set. Around this time other companies became increasingly interested in microprocessors after seeing the success of Intel's first two models. However, it was not until the release of the 8080 in 1974 that the processor revolution truly started.
When 16-bit processors were released, it was not just Intel producing them, but a multitude of competing tech companies. Digital Equipment Corporation, Fairchild, Texas Instruments, and National Semiconductor all released versions of a 16-bit microprocessor; in fact National was the first, releasing the National Semiconductor IMP-16 in early 1973. However, when Intel released its 16-bit Intel 8086, it began taking control of the market. This was because the 8086 was considerably more cost effective and because software written for the 8080 could be ported to it. The 8086 is also known as the first member of Intel's x86 family, which still provides the processors of most modern PCs.
16-bit processors spent only a short period dominating the market before 32-bit processors were introduced. The most popular 32-bit processor was, surprisingly, made by Motorola rather than Intel. The MC68000, more commonly known as the 68k, was released by Motorola in 1979. Oddly, Motorola described it as a 16-bit processor even though its architecture was essentially 32-bit, with 32-bit registers (the 16-bit label referred to its internal data paths and external data bus). Its popularity stemmed from its large address space, low cost, and high performance. The 68k was used in Apple's original Macintosh designs, Commodore's Amiga computers, and the Atari ST.
While Motorola and Intel both produced effective microprocessors, their architectures were very different. Intel used a separate I/O map, while Motorola used memory-mapped I/O. In layman's terms, a separate I/O map means that special instructions are needed to perform input and output, because hardware devices live in a different address "map" than memory. With Motorola's memory-mapped I/O, device registers and memory share the same map, so ordinary load and store instructions work and no special I/O instructions are needed (a toy sketch of the difference appears below). By the early 1980s numerous 32-bit processors were being created, AT&T Bell Labs' BELLMAC-32A, the HP FOCUS, and the NS32032 to name a few. However, once again, soon after the release of 32-bit processors innovation took place and 64-bit processors were released.
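Here is the toy sketch mentioned above contrasting the two I/O schemes; the addresses, port numbers, and function names are all made up for illustration and do not model any real chip.

    # Toy model of the two I/O schemes described above.
    memory = bytearray(256)     # ordinary RAM
    io_ports = bytearray(16)    # separate I/O space (Intel-style port I/O)

    # Port-mapped (separate) I/O: dedicated "IN"/"OUT"-style operations
    # that address a different map than ordinary loads and stores.
    def port_out(port, value):
        io_ports[port] = value

    def port_in(port):
        return io_ports[port]

    # Memory-mapped I/O (Motorola-style): a device register simply lives
    # at an ordinary address, so the normal load/store path reaches it.
    DEVICE_REGISTER_ADDR = 0xF0   # hypothetical address where a device sits

    def store(addr, value):
        memory[addr] = value      # same instruction works for RAM or device

    def load(addr):
        return memory[addr]

    port_out(0x3, 0x7F)                    # needs the special port path
    store(DEVICE_REGISTER_ADDR, 0x7F)      # just a normal memory write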
64-bit chips have been available since 1992, but it took some time before they became part of mainstream computer use, even though they appeared in gaming systems of the 1990s such as the Nintendo 64. While the 1970s and 1980s saw widespread competition between many tech companies, once the 64-bit processor was introduced the market came to be dominated by Intel and AMD. One reason it took longer to incorporate 64-bit chips into everyday computers was that they needed to be backwards compatible, meaning that a 64-bit chip could still run 32-bit applications. Such a backwards-compatible 64-bit extension of x86 finally arrived in 2003, with AMD's AMD64 followed shortly by Intel's compatible Intel 64, and ever since then 64-bit chips have become more and more prevalent in the computers of today.
While microprocessors seem to follow a pattern in their creation, 4-bit, 8-bit, 16-bit, and so on, there are a few other types of processors that were created as adaptations or offshoots of whatever the current top-of-the-line processor was. One example is RISC, or reduced instruction set computer. RISC processors execute each instruction in fewer clock cycles than conventional processors by sticking to simple, fixed-format instructions (a toy illustration follows below). Although early RISC processors outperformed their CISC (complex instruction set computer) counterparts, today the difference is almost non-existent. That said, RISC designs can run at lower energy costs, and so RISC processors are the predominant processors found in mobile phones and tablets.
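The sketch below illustrates the load/store idea in miniature: a single CISC-style instruction that works directly on memory versus a sequence of simple RISC-style steps. It is toy pseudocode in Python, not a real instruction set.

    # Toy contrast between a memory-to-memory "complex" instruction and
    # simple register-only RISC-style steps with explicit loads and stores.
    memory = {"x": 5, "y": 7, "z": 0}
    registers = {}

    # CISC-style: one instruction that reads two memory operands and writes one.
    def add_mem(dst, src1, src2):
        memory[dst] = memory[src1] + memory[src2]   # three memory accesses

    # RISC-style: only loads and stores touch memory; arithmetic uses registers.
    def load(reg, addr):   registers[reg] = memory[addr]
    def store(addr, reg):  memory[addr] = registers[reg]
    def add(dst, a, b):    registers[dst] = registers[a] + registers[b]

    add_mem("z", "x", "y")             # one "big" instruction
    load("r1", "x"); load("r2", "y")   # four simple instructions
    add("r3", "r1", "r2")
    store("z", "r3")
    print(memory["z"])                 # 12 either way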
Another offshoot of the typical microprocessor is the multicore processor. A multicore processor is just what it sounds like: a processor with two or more cores. While the cores are all contained on the same chip, each one executes instructions as if it were a separate processor. This allows a computer to run multiple tasks at once and can make it run faster. The computer can also use less power when running a multicore processor, since it can shut down cores that aren't being used. Although one might imagine that having two cores would make a computer run twice as fast, with two cores a computer will only run about 60-80% faster, because some portion of the work cannot be split between cores (see the sketch below). Multicore processors are also more difficult to manage and more expensive to produce, and not all operating systems support them. Just as with 64-bit chips, the two main competitors in the multicore market are Intel and AMD. IBM introduced the first multicore processor in 2001 with its POWER4, but it wasn't until the releases of Intel's Pentium D and AMD's Athlon 64 X2 that personal computers began to use multicore processors.
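The 60-80% figure falls out of Amdahl's law: if part of the work must stay serial, two cores can never quite double the speed. The serial fractions below are illustrative assumptions, not measurements.

    # Amdahl's law: best-case speedup when only part of the work can be
    # split across cores.
    def amdahl_speedup(serial_fraction, cores):
        return 1 / (serial_fraction + (1 - serial_fraction) / cores)

    for serial in (0.10, 0.20, 0.25):
        s = amdahl_speedup(serial, cores=2)
        print(f"{serial:.0%} serial work -> {s:.2f}x, about {s - 1:.0%} faster")
    # 10-25% serial work lands in the 60-80% range quoted above.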
The constant release of these ever-improving microprocessors was actually predicted back in 1965 by Gordon Moore, a founder of Intel. He originally predicted that the number of components on an integrated circuit would double every year; in 1975 he revised this to a doubling every two years. While Moore's law has held true for the past few decades, the advancement of microprocessors has started to slow, and in 2015 Moore himself stated that his law would become invalid in about a decade or so. One of the key limiting factors is the design of the logic gates contained in microprocessors. As device dimensions become smaller and smaller, controlling the flow of electricity in such a small space becomes more difficult. Eventually the transistors from which logic gates are built will reach their limits of miniaturization; the size of atoms will become a fundamental barrier that is impossible to pass using current technology. However, one proposed way around this problem is quantum computing.
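A quick worked example shows what a two-year doubling implies, starting from the Intel 4004's roughly 2,300 transistors (a figure not given in the text above):

    # Moore's 1975 prediction: component counts doubling every two years.
    def moores_law(start_count, start_year, year, doubling_period=2):
        return start_count * 2 ** ((year - start_year) / doubling_period)

    for year in (1971, 1981, 1991, 2001, 2011):
        print(year, round(moores_law(2300, 1971, year)))
    # 2,300 transistors in 1971 grows to roughly 2.4 million by 1991 and
    # a couple of billion by 2011 under a two-year doubling.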
Quantum processors can be exponentially faster than regular processors because a quantum bit can store not just a 1 or a 0, like a normal bit, but any superposition of the two. This means that quantum computers have the potential to be millions of times faster than current processors on certain problems. As of this writing, the largest quantum computers operate on only around 16 quantum bits, or qubits, and much of the field is still purely theoretical. The main problem with quantum computing is quantum decoherence: a quantum system needs to be almost completely isolated from its environment, because any interaction with the outside world causes the system to lose information and corrupts the result of whatever computation is being attempted. That said, the future of microprocessors looks promising, and new experiments and research are being conducted around the clock to make microprocessors even faster and to help us solve problems that we previously lacked the computing power to solve.
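To put the "exponentially faster" claim in perspective, describing the joint state of n qubits takes 2^n amplitudes, so capability grows very quickly with qubit count:

    # State-space size grows exponentially with the number of qubits.
    for n in (4, 16, 64):
        print(f"{n} qubits -> 2**{n} = {2 ** n:,} amplitudes")
    # 16 qubits (the size mentioned above) already involves 65,536 amplitudes;
    # 64 qubits would involve more than 10**19.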