Try to imagine life without the microprocessor. There would be no personal computers, no multimedia displays, and no internet. Our cellphones wouldn't exist, nor would our cars operate as they do.
Luckily for us, the microprocessor was invented, and we have Dr. Marcian "Ted" Hoff, among others, to thank for that. Hoff was born in Rochester, New York, in 1937, and he received a B.S. in electrical engineering from the Rensselaer Polytechnic Institute, followed by an M.S. and Ph.D. from Stanford University.
As a 17-year-old, Hoff was a finalist in the Westinghouse Science Talent Search (now the Regeneron Science Talent Search), and while still an undergraduate at Rensselaer, he applied for his first two patents while working summers at the General Railway Signal Corporation in Rochester.
In 1968, Hoff joined the Intel Corporation as employee number 12. At the time, Intel had a contract with the Japanese company Busicom to develop a set of custom integrated circuits for an electronic calculator. Hoff came up with the idea of creating a universal processor on a single microchip, rather than a set of custom-designed circuits, and the microprocessor was born.
What is a microprocessor?
A microprocessor is a central processing unit (CPU) on a single integrated circuit chip, built from tiny components, chiefly transistors, along with resistors and diodes; modern chips contain billions of transistors. A CPU is multipurpose, clock-driven, and register-based. It operates on numbers and symbols represented in the binary number system: it accepts binary data as input, processes that data according to instructions stored in memory, and outputs its results as binary data.
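The register-based, clock-driven cycle described above can be sketched in a few lines of code. This is a toy illustration, not the instruction set of any real chip; the three-instruction machine below (LOAD, ADD, STORE) is hypothetical:

```python
# A toy register-based machine illustrating the fetch-decode-execute
# cycle: instructions live in memory, and an accumulator register
# holds intermediate results. The instruction set is made up for
# illustration and does not correspond to any real CPU.

def run(program, data):
    """Step through a tiny program; registers and memory hold integers."""
    acc = 0                            # accumulator register
    pc = 0                             # program counter
    while pc < len(program):
        op, arg = program[pc]          # fetch the next instruction
        if op == "LOAD":               # decode and execute it
            acc = data[arg]
        elif op == "ADD":
            acc += data[arg]
        elif op == "STORE":
            data[arg] = acc
        pc += 1                        # advance, as a clock tick would
    return data

# Add the values in cells 0 and 1, storing the sum in cell 2.
memory = [2, 3, 0]
prog = [("LOAD", 0), ("ADD", 1), ("STORE", 2)]
print(run(prog, memory))   # → [2, 3, 5]
```

A real microprocessor does the same fetch-decode-execute loop in hardware, on binary-encoded instructions, millions or billions of times per second.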
The precursor to the microprocessor was the metal-oxide-semiconductor field-effect transistor, or MOSFET. This was developed in 1960 at Bell Labs, and it quickly enabled higher transistor density and lower manufacturing costs than earlier transistor technologies. By the late 1960s, there were hundreds of transistors on a single MOS chip.
This increase in complexity was anticipated by the famous Moore's Law, an observation first published in a 1965 paper, which in its familiar form states that the number of transistors that fit on a dense integrated circuit doubles about every two years, a compound annual growth rate (CAGR) of roughly 41%.
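That growth rate follows directly from the doubling period: if counts double over two years, one year's growth factor is the square root of two. A quick check (the 1,000-transistor starting count below is illustrative, not a figure from the article):

```python
# Moore's Law as stated above: transistor counts double roughly every
# two years, so one year's growth factor is 2 ** (1/2).
annual_factor = 2 ** (1 / 2)
cagr = annual_factor - 1
print(f"CAGR = {cagr:.1%}")    # → CAGR = 41.4%

# Projecting from a hypothetical late-1960s chip of 1,000 transistors:
for years in (10, 20):
    count = round(1_000 * annual_factor ** years)
    print(f"after {years} years: about {count:,} transistors")
```

Ten years of doubling every two years multiplies the count 32-fold, and twenty years multiplies it about a thousandfold, which is roughly the path the industry actually followed.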
Moore's Law was proposed by Gordon Moore, a co-founder of Fairchild Semiconductor who went on to co-found Intel and serve as its CEO.
The first commercial general-purpose microprocessor
As the electrical engineer responsible for the Busicom design, Ted Hoff thought that it could be achieved with a 4-bit central processing unit (CPU) on a single microchip. Working with Stanley Mazor, a software engineer, and with Busicom engineer Masatoshi Shima, the three made progress during 1969, but it was in 1970, when Intel hired Italian engineer Federico Faggin, that things really heated up.
Back in 1903, Nikola Tesla had patented electrical logic "gates", or "switches". Faggin had developed silicon gate technology (SGT) while working at Fairchild Semiconductor. In 1971, building on the work of Hoff, Mazor, and Shima, Faggin used SGT to create the first microprocessor — a single-chip CPU with the speed, power dissipation, and cost to make it practical. This was the 4-bit Intel 4004.
Intel had also contracted with Computer Terminal Corporation (CTC) of San Antonio, Texas, to develop a chip for its Datapoint 2200 terminal. Intel came up with an 8-bit microprocessor, the 8008, released in 1972, but it was slow to deliver the chip, and CTC went with its own design.
To avoid paying Intel the $50,000 it owed for the chip, CTC instead turned the chip's ownership over to Intel, which proved a costly mistake. The 8008 was the predecessor of the ultra-successful Intel 8080, released in 1974, and of the Zilog Z80, released in 1976 by the company Faggin went on to co-found.
The home computer revolution and today
It was these microprocessors that powered the home computer revolution of the early-to-mid 1980s. Inexpensive home computers came on the market, such as the Z80-based Sinclair ZX81, which sold for $99, and later the Commodore 128, which used an 8502 chip.
Another microprocessor, the WDC 65C02 designed by the Western Design Center, Inc., was released in 1982. This was the CPU used in the Apple IIe and IIc computers, and it was also incorporated into implantable cardiac pacemakers and defibrillators, as well as automotive, industrial, and consumer devices.
In 1980, Marcian Hoff was named the first Intel Fellow, and he remained at Intel until 1983, when he became Vice President of Technology at the video game company Atari. In 1997, Hoff, Faggin, Mazor, and Shima were awarded the Kyoto Prize, and in 2009, Hoff, Faggin, and Mazor received the U.S. National Medal of Technology and Innovation.
Today, microprocessors are used in televisions, DVD players, microwaves, toasters, stoves, clothes washers and dryers, dishwashers, refrigerators, security systems, stereo systems, home computers, hand-held game devices, thermostats, video game systems, alarm clocks, home lighting systems, electronic toys, and computer peripherals such as printers, to name just a few.
Microprocessors are used in cars, boats, planes, trucks, heavy machinery, gasoline pumps, credit card processing units, traffic lights, elevators, computer servers, most high-tech medical devices, digital kiosks, and doors with automatic entry. They're even used in those greeting cards that play music when you open them.
So, the next time you wake up to a perfectly brewed cup of coffee, you can thank the microprocessor in your coffee maker, and Marcian Hoff for inventing it.