"If cells are the building blocks of life, transistors are the building blocks of the digital revolution. Without transistors, the technological wonders you use every day -- cell phones, computers, cars -- would be vastly different, if they existed at all.
Before transistors, product engineers used vacuum tubes and electromechanical switches to complete electrical circuits. Tubes were far from ideal. They had to warm up before they worked (and sometimes overheated when they did), they were unreliable and bulky, and they used too much energy. Everything from televisions to telephone systems to early computers used these components, but in the years after World War II, scientists were looking for alternatives to vacuum tubes. They'd soon find their answer in work done decades earlier.
In 1947, William Shockley was director of transistor research at Bell Telephone Labs. Walter Brattain was an authority on solid-state physics and an expert on the atomic structure of solids, and John Bardeen was an electrical engineer and physicist. Within a year, Bardeen and Brattain used the element germanium to create an amplifying circuit, also called a point-contact transistor. Soon afterward, Shockley improved on their idea by developing a junction transistor.
The next year, Bell Labs announced to the world that it had invented working transistors. The original patent for the first transistor carried this description: Semiconductor amplifier; three-electrode circuit element utilizing semiconductive materials. It was an innocuous-sounding phrase. But this invention netted the Bell team the 1956 Nobel Prize in Physics, and it allowed scientists and product engineers far greater control over the flow of electricity." - HowStuffWorks
Microchip
A diagram of an integrated circuit, or microchip
"A microchip is a small semiconductor used to relay information via specific electrical characteristics. In some cases, the term can be used interchangeably with integrated circuit. The microchip is at the heart of many electronics, including computers, cell phones and even microwave ovens.
The first microchip is credited jointly to Robert Noyce and Jack Kilby in 1958. Though the two were working for different companies and approaching the invention from slightly different angles, the companies recognized that each held part of the overall answer and agreed to cross-license their inventions to come up with one unified piece of technology. After being demonstrated in 1958, it first became available commercially in 1961.
The technology was basic by modern standards. The first chip held one transistor, three resistors, and one capacitor; modern ones commonly hold millions of transistors in a space smaller than a U.S. penny. The trend toward ever-smaller semiconductor chips has led to numerous other benefits. Beyond being used in electronic gadgets, they can be inserted into biological organisms as well." - wiseGEEK
Microprocessor
A microprocessor
"A high-performance microprocessor is at the heart of every general-purpose computer, from servers, to desktop and laptop PCs, to open cell-phone platforms such as the iPhone. Its job is to execute software programs correctly and as quickly as possible, within challenging cost and power constraints." - North Carolina State University
"A microprocessor is a processor whose elements are miniaturized into one or a few integrated circuits contained in a single silicon microchip. It executes instructions. In a microcomputer, the central processing unit (CPU) is held on a single microprocessor. In order to function as a processor, it requires a system clock, primary storage, and a power supply. Several important lines of microcomputers use some families of microprocessor chips." - Mississippi State University
Intel and AMD are two well-known examples of microprocessor brands.
GUI
A graphical user interface (GUI)
"When the Macintosh was introduced in 1984, it represented something altogether new to the public – an affordable Graphical User Interface (GUI) on a computer with a mouse. Suddenly, while others were typing commands like “del index.com,” Mac users were dragging and dropping the image of a file into the image of a trash can. Users had a computer with an interface that made sense (it was intuitive). But although Apple was the first to successfully mass-produce a GUI, they were not its inventors. The honor for producing the first working GUI goes to Doug Engelbart – at the time an employee of Stanford Research Institute. Engelbart and colleagues created a program called the oNLine System in 1965-'68. This program used the first mouse, a windowing system, and hypertext, and was based on a description of a system called “memex” proposed by Vannevar Bush in 1945." - Interactive Media Research Laboratory, Utah State University
Mouse
A computer mouse
"Each time you click your mouse, you're paying homage to a UC Berkeley College of Engineering alumnus. Douglas Carl Engelbart, who received his Ph.D. in electrical engineering in 1955, not only invented the mouse but also helped define the way in which we interact with personal computers to this day—from multiple windows to hypertext links." - Berkeley Engineering History and Traditions
Douglas Engelbart
The Apple Lisa and the Xerox Star were among the earliest products available to the public that included a mouse.