Evolution of Computers

We will travel through time, from the first hand-cranked computer designs to today's handheld phones, and see how the technology has changed.

This story was created for the Google Expeditions project by Vida Systems, now available on Google Arts & Culture.


You most likely have a smartphone in your pocket right now. The fact that it even fits in your pocket is a true technological marvel when you realize that these tiny devices descend from gigantic, room-sized machines built just a few decades ago: the first modern computers.

Unrealized Ideas

Britain’s Charles Babbage is often credited as the inventor of the first computer, even though his designs never became working machines in his lifetime.

Babbage introduced the concept of a “difference engine”, a machine that could calculate polynomial functions automatically and so avoid the human errors that were common in hand computation at the time. He later designed the “analytical engine”, which had broader applications and is regarded as the first design for a “general purpose” computer.

Difference Engine

In 1823, Babbage received funding for the difference engine, a crank-powered early “computer” intended to calculate complex equations for astronomy and mathematics. A working difference engine was not actually built until 1991.
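
The principle the difference engine mechanized is the method of finite differences: once an initial row of values and their differences is set up, every further entry in a polynomial table can be produced with additions alone, no multiplication required. The sketch below is only an illustration of that idea in Python; the example polynomial and the function names are mine, not anything taken from Babbage's design.

```python
# Tabulating a polynomial by repeated addition, the way a difference
# engine would: after the setup, every new value is a chain of additions.

def initial_differences(values):
    """Return [p(x0), first difference, second difference, ...]."""
    diffs = []
    row = list(values)
    while row:
        diffs.append(row[0])
        row = [b - a for a, b in zip(row, row[1:])]
    return diffs

def tabulate(poly, order, start, count):
    """Produce `count` successive values of `poly` using only additions."""
    seed = [poly(start + i) for i in range(order + 1)]
    diffs = initial_differences(seed)
    results = []
    for _ in range(count):
        results.append(diffs[0])
        for i in range(order):          # propagate the additions downward
            diffs[i] += diffs[i + 1]
    return results

p = lambda x: 2 * x * x + 3 * x + 1     # an arbitrary example polynomial
print(tabulate(p, order=2, start=0, count=6))   # [1, 6, 15, 28, 45, 66]
```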

Charles Babbage

Babbage (1791–1871) was ahead of his time. The designs he created for the difference engine and the analytical engine required extremely precise metalworking that was not available in his era, and so his designs were never realized during his lifetime.

Analytical Engine

Babbage began designing the analytical engine in 1835. It was to be the first computer with a “memory”, and it would be able to perform arithmetic operations along with more complex, programmable sequences of instructions.

Clever Counting

A significant milestone in computing was reached after the 1890 US census. Inventor Herman Hollerith used his Tabulating Machine, an electric device that tallied and recorded data, to process the census information.

His Tabulating Machine processed the data in significantly less time, and with far less manpower, than human clerks would have needed. It became an inspiration for further progress in computing, with applications in inventory, accounting, and other fields for decades afterward.

Tabulating Machine

The Tabulating Machine was inspired by train conductors punching holes at specific locations in paper tickets to record information. The tabulator portion of the machine sensed the positions of holes punched into cards of census data and kept a running tally of them.
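
Conceptually, each card carried one record and each hole position stood for one answer; the tabulator simply advanced a counter for every hole it sensed. The snippet below is a loose, modern analogy of that counting step; the field names and card layout are made up for illustration and do not reflect Hollerith's actual 1890 card format.

```python
from collections import Counter

# Each "card" is one census record, encoded as the set of punched positions.
cards = [
    {"sex:male", "age:20-29", "state:NY"},
    {"sex:female", "age:30-39", "state:NY"},
    {"sex:male", "age:20-29", "state:OH"},
]

tallies = Counter()
for card in cards:
    tallies.update(card)   # one counter advances per hole sensed

print(tallies["sex:male"])   # 2
print(tallies["state:NY"])   # 2
```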

Herman Hollerith

Hollerith (1860–1929) invented the Tabulating Machine to process the data of the 1890 US census. He went on to found the Tabulating Machine Company, which later merged with several other companies into what is now known as IBM.

Pantographic Card Punch

Before the Tabulating Machine could start counting, information from census takers had to be transferred from census schedules to punch cards. The Pantographic Card Punch machine allowed a skilled operator to punch 700 cards a day.

20th Century Revolution

The 20th century heralded a new age in computing, with progress happening faster than ever before. From Alan Turing’s idea of a machine capable of computing anything to the release of the first modern desktops, the century’s pace of technological advancement propelled computers into widespread use.

From a mere concept, to a machine that helped win a world war, to a device that eventually entered households, the computer passed many milestones in the 20th century.

The Turing Machine

The Turing Machine, described in 1936, is a concept by Alan Turing, who later became an accomplished WWII cryptographer. It can emulate any algorithm by reading and writing symbols on a tape according to a stored table of instructions, and later computers are conceptually based on it.
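
The core idea can be captured in a few lines of code: a tape of symbols, a read/write head, and a table of rules that says what to write, which way to move, and which state to enter next. The simulator and the sample machine below are only an illustrative sketch (the rule set appends a 1 to a unary number), not any specific historical machine.

```python
# A minimal Turing machine simulator: tape + head + rule table.

def run(tape, rules, state="start", blank="_", max_steps=1000):
    cells = dict(enumerate(tape))        # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = rules[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Example machine: scan right over the 1s, write one more 1, then halt.
rules = {
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("1", "R", "halt"),
}

print(run("111", rules))   # prints 1111
```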

Bombe

During World War II, the German military used a cipher machine called Enigma to encode its military instructions, with settings that changed every day. Turing helped create the Bombe, which simulated several Enigma machines at once in order to break those codes.

Apple Lisa

Of the early machines, the Apple Lisa (1983) is the one most similar to what we use today. It was one of the first desktop computers with a graphical user interface (GUI), which lets users interact with graphics instead of just text. The Lisa also had sound.

Less Is More

The advent of the 21st century brought developments and improvements to computers that would have been unfathomable to people of the last century. Computers now are faster, hold more memory, cost less, and are much smaller than their predecessors.

The progress made between computers just a few decades apart is truly astounding. This panorama details some major differences between the UNIVAC 1101, one of the early modern computers from the 1950s, and a modern computing platform.

Vacuum tubes (10 centimeters per switching element)

Vacuum tubes were glass tubes with the gas pumped out to create a vacuum. They contained electrodes that controlled the flow of electrons, and early computers used them as switches and amplifiers. They produced a great deal of heat and were slow, expensive, and unreliable.

Transistors (1 centimeter per transistor)

As one of the most important developments bringing about the personal computer revolution, transistors replaced vacuum tubes. Much smaller and consuming much less power, transistors helped make computers faster as well as more reliable and more efficient.

Integrated Circuits (1 micron per transistor)

While the UNIVAC 1101 used bulky vacuum tubes to process data, a modern computer uses far smaller and faster silicon chips packed with transistors. With this change in technology, the weight and size of computers plummeted.

Microprocessors (10 nanometers per transistor)

Also known as a CPU, or central processing unit, the microprocessor is a complete computation engine manufactured on a single silicon chip. It is the ultimate miniaturization: each transistor in a 10-nanometer microprocessor is only about 50 atoms wide.
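
Using only the per-element sizes quoted in the four sections above, a quick back-of-the-envelope calculation (sketched below, treating the figures as rough orders of magnitude) shows a shrink of seven orders of magnitude in linear size from vacuum tube to modern transistor.

```python
# Rough scaling comparison, using only the sizes quoted above (in meters).
sizes_m = {
    "vacuum tube":         10e-2,   # 10 centimeters
    "discrete transistor":  1e-2,   # 1 centimeter
    "integrated circuit":   1e-6,   # 1 micron
    "microprocessor":      10e-9,   # 10 nanometers
}

baseline = sizes_m["vacuum tube"]
for name, size in sizes_m.items():
    print(f"{name:>19}: {baseline / size:>12,.0f}x smaller than a vacuum tube")
# The jump from 10 cm to 10 nm is a factor of 10,000,000 in linear size alone.
```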

True Modern Marvel: The Smartphone

Perhaps the most mind-boggling device of this century, to the eyes of someone from the last century, is the smartphone.

Besides being connected to the internet, something people back then had scarcely even imagined, it is technologically far in advance of even the most expensive and impressive computers of the mid-20th century, such as the Apollo Guidance Computer. All of this can be attributed to the invention of the integrated circuit and to Moore’s Law.
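
Moore’s Law is usually stated as the observation that the number of transistors on a chip roughly doubles about every two years. The sketch below simply compounds that doubling to show why the effect is so dramatic; the starting count (roughly the early-1970s Intel 4004) and the exact doubling period are illustrative assumptions.

```python
# Compounding Moore's Law: counts roughly double about every two years.
start_year, start_count = 1971, 2_300     # ~Intel 4004 era (illustrative)
doubling_period = 2                        # years per doubling (assumed)

for year in range(start_year, 2022, 10):
    doublings = (year - start_year) / doubling_period
    print(f"{year}: ~{start_count * 2 ** doublings:,.0f} transistors")
# Fifty years of doubling every two years is 2**25, i.e. over 30 million times.
```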

Incomparable Power

The iPhone 6, for example, can store up to 128 GB of data and has a processing speed of 1.4 GHz. That gives it 2,000,000 times the memory of the computer used to guide Apollo to the Moon, and it processes instructions 120,000,000 times faster!

Incomparable Size

The iPhone 6 measures about 5 x 2 inches and weighs about 5 ounces. In comparison, early vacuum-tube computers like ENIAC and UNIVAC filled 40-foot by 20-foot rooms and weighed tons, yet had significantly less computing power than the computer that took man to the Moon.

Power and Heat

The ENIAC needed 150,000 watts to run, and that much power produced so much heat that the machine could only operate in rooms with specialized air conditioning systems. In comparison, the iPhone 6 draws only about 10.5 watts while charging!

The Evolution of Play

Finally, a hallmark of the computer’s advancement is its modern use for recreation, in addition to the professional and business purposes that dominated its early years. Computers can now be used to watch movies and videos, as well as to listen to music.

Of particular interest is the video game, a phenomenon born of computing advancement itself. Like the machines that run them, video games have undergone a drastic transformation in just a few decades.

Bertie the Brain (1950)

In the first publicly demonstrated video game, the player competed against an artificial intelligence at tic-tac-toe. While the machine had adjustable difficulty levels and attracted initial interest, it was eventually regarded as a novelty and abandoned.

Pong (1972)

Atari’s Pong was perhaps the first video game to reach mainstream popularity. It was an extremely simple game by modern standards: a rudimentary 2D simulation of table tennis. Its popularity ushered in a new age of video games and popularized home consoles.

The State of Modern Gaming

Video games today would have been unimaginable a few decades ago. Millions of people play a single game together in massive multiplayer online games, graphics and sound quality have become more lifelike than ever, and the content of games has reached unprecedented depth.
