Revolution

The First 2000 Years of Computing

Calculators: First Steps on the Path to Computers

Banks calculating interest. Kids dividing up cookies. Engineers designing bridges. We make calculations every day. And for as long as we've juggled numbers greater than our fingers and toes, we've sought aids to make computations easy and accurate. 

For centuries, calculators were the only machines to help us compute, forming a long lineage of devices stretching from the ancient abacus to today's digital computer.

The Versatile, Venerable Abacus

An American soldier and a Japanese postal worker faced off in Tokyo in 1946. Private Thomas Wood had an electric calculator. Kiyoshi Matsuzaki held a soroban, a Japanese abacus. Each was skilled at operating his device. In four out of five competitive rounds, the abacus won.





Perhaps the oldest continuously used calculating tool aside from fingers, the abacus is a masterpiece of power and simplicity. Abacuses were widely used in Asia and Europe for centuries, and remain common today.


A small suan pan Chinese abacus, ca. 1950


Punched Cards: From Math to Data



People used calculators to manipulate numbers. But how do you make machines that also manipulate words or ideas?





Punched cards, a mainstay of early office automation and computing, helped launch the transition from doing math to processing data. Patterns of holes punched in cards can represent any kind of information. Punched cards can preserve data too: just file them away!
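To see how holes can stand for information, here is a minimal sketch in Python. The encoding is deliberately simplified (one hole per column, punched in the row matching the digit) and is not Hollerith's actual card code; the names and the sample year are invented for illustration.

```python
# A simplified, hypothetical card code: one column = one digit,
# encoded as a hole in the row whose number equals that digit.

DIGIT_ROWS = {d: {d} for d in range(10)}   # digit -> set of punched row positions

def punch_digit(d):
    """Return the hole pattern (set of row numbers) for a single digit."""
    return DIGIT_ROWS[d]

def read_column(holes):
    """Recover the digit from a column's hole pattern."""
    for digit, rows in DIGIT_ROWS.items():
        if rows == holes:
            return digit
    raise ValueError("unreadable column")

# Encode the year 1890 as four columns, then read it back.
card = [punch_digit(int(ch)) for ch in "1890"]
print("".join(str(read_column(col)) for col in card))   # -> 1890
```

The meaning lives entirely in where the holes are, which is also why a card filed away in a drawer preserves its data indefinitely.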

Making Sense of the Census: Hollerith's Punched Card Solution

Nothing stimulates creativity like a good crisis.

The U.S. Constitution requires a census every decade. That was manageable in 1790 with fewer than four million Americans to tally. Not so simple a century later, with 63 million. Estimates warned that the 1890 census wouldn’t be finished before the 1900 census began!

The government’s answer? A contest to devise a solution. Herman Hollerith won. He suggested recording data on punched cards, which would be read by a tabulating machine.


Hollerith Electric Tabulating System, 1890 (replica: 1981) 


Analog and Digital: Different Ways to Measure and Model the World

Our world is a symphony of infinite variations. Long before digital computers existed, engineers built models to simulate those real-world nuances.







Analog computers continued this tradition, using mechanical motion or the flow of electricity to model problems and generate answers quickly. They remained the preferred tool until digital computers, based on electronic switches, became fast enough for software to do the same job.
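As a rough illustration of "software doing the same job," the sketch below integrates a simple differential equation numerically, the kind of task a differential analyzer performed with rotating shafts or electrical circuits. The equation, step size, and function names are illustrative choices, not drawn from any particular machine.

```python
# Digital counterpart of an analog integrator: step through time,
# accumulating small changes (Euler's method).

def integrate(f, y0, t_end, dt=0.01):
    """Numerically integrate dy/dt = f(t, y) from t = 0 to t_end."""
    t, y = 0.0, y0
    while t < t_end:
        y += f(t, y) * dt   # add the change over one small step
        t += dt
    return y

# Example: exponential decay dy/dt = -y, starting at y = 1.
print(integrate(lambda t, y: -y, 1.0, 1.0))   # roughly 1/e, about 0.37
```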

Nordsieck's Differential Analyzer

Using $700 worth of surplus World War II supplies, Arnold Nordsieck assembled an analog computer in 1950. It was modeled on differential analyzers built since the 1930s—but with key differences.

For instance, Nordsieck’s computer used electrical connections instead of mechanical shafts. And he set himself the priorities of “convenience and simplicity…portability, and economy.” His device’s small size and straightforward engineering satisfied the first three requirements. Its $700 price tag satisfied the fourth.


Nordsieck Differential Analyzer, 1956


Birth of the Computer: Computation Becomes Electronic

World War II acted as midwife to the birth of the modern electronic computer. Unprecedented military demands for calculations—and hefty wartime budgets—spurred innovation.





Early electronic computers were one-of-a-kind machines built for specific tasks. But setting them up was cumbersome and time-consuming. The revolutionary innovation of storing programs in memory replaced the switches and wiring with readily changed software.

ENIAC



In 1942, physicist John Mauchly proposed an all-electronic calculating machine. The U.S. Army, meanwhile, needed to calculate complex wartime ballistics tables. Proposal met patron.



The result was ENIAC (Electronic Numerical Integrator And Computer), built between 1943 and 1945—the first large-scale computer to run at electronic speed without being slowed by any mechanical parts. For a decade, until a 1955 lightning strike, ENIAC may have run more calculations than all mankind had done up to that point.

ENIAC programmers, 1946


Early Computer Companies: Computing Becomes a Business

The stored-program electronic computer represented a breakthrough. But if you wanted one, you had to build it yourself. There were no commercial manufacturers.





As interest grew, both startups and existing companies gambled that making general-purpose computers for others might prove a viable business. Yet nobody knew how big the potential market was, whether such a venture was savvy or folly.

UNIVAC

Computing burst into popular culture with UNIVAC (Universal Automatic Computer), arguably the first computer to become a household name.

A versatile, general-purpose machine, UNIVAC was the brainchild of John Mauchly and Presper Eckert, creators of ENIAC. They proposed a statistical tabulator to the U.S. Census Bureau in 1946, and in 1951 UNIVAC I passed Census Bureau tests.

Within six years, 46 of the million-dollar UNIVAC systems had been installed—with the last operating until 1970.


UNIVAC I supervisory control console, 1951


Real-Time Computing: Reacting to the Real World

Taking a census? You can wait while computers crunch the numbers. Braking your car? Guiding a missile? Running an assembly line? Waiting is not recommended. Time matters.





Real-time computing responds to events as they happen, something even early computers were able to do. Demand for real-time computing began with the military but swiftly expanded to industrial, medical, and soon everyday uses.
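The core idea can be shown in a few lines: handle each event as it arrives, within a fixed time budget. The sensor readings, deadline, and response below are invented purely for illustration.

```python
# A toy real-time loop: each incoming event must be answered before a deadline,
# rather than queued up for later batch processing.
import time

DEADLINE = 0.01   # hypothetical time budget per event, in seconds

def handle(reading):
    start = time.monotonic()
    response = f"brake level {reading * 2}"      # stand-in for the real control action
    elapsed = time.monotonic() - start
    assert elapsed < DEADLINE, "missed the deadline"
    return response

for reading in [1, 3, 2]:     # stand-in for a stream of sensor readings
    print(handle(reading))
```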

A SAGE Defense

Fear of nuclear-armed Soviet bombers terrified 1950s America. SAGE, a massive real-time control and communications system developed for the Air Force by MIT’s Lincoln Laboratory, offered a solution.

SAGE (Semi-Automatic Ground Environment) linked 23 sites across the U.S. and Canada, coordinating weapons systems and processing radar, weather reports and other data. By the time it became fully operational in 1963, however, the principal threat had shifted from aircraft to missiles, making SAGE’s value questionable. Nevertheless, some sites remained in service until 1982.


SAGE weapons director console with light gun, 1958


Mainframe Computers: The Backbone of Big Business

With technology, what you can do influences what you want to do—which gradually expands what you can do.





Businesses in the 1950s increasingly recognized computers’ broad potential. They demanded flexible, large-scale machines able to consolidate varied tasks. The workhorse mainframe computers that met these demands in turn reshaped how businesses operate, increasing centralization and nourishing new demand for powerful mainframes.

IBM System/360

IBM dominated computing in 1961, with about two-thirds of the American market. But could IBM hold onto its lead? Its product line was fragmented with incompatible machines, poorly suited to offer companies a single, unified, easily expandable system.

IBM’s System/360, a new family of general-purpose computers, changed everything. Programs for one System/360 computer ran on all, letting customers readily consolidate computing capabilities.

Every subsequent IBM mainframe is a descendant of the first System/360s.


IBM System/360, 1964


Memory & Storage: How Computers Remember

Computers are master jugglers, multitasking as we play music, solve equations, surf the web, and write novels. They also have become vast, searchable libraries of everything from banking records and encyclopedias to grandma’s recipes.





These abilities require two kinds of memory: main memory (fast and comparatively expensive) and storage (big, slower, and cheap). Both types have rapidly and continually improved.

The First Disk Drive: RAMAC 350

Computers hold thousands of data records. Imagine if finding the one you wanted required starting with the first, then going through them in order.

High speed, random access memory—plucking information from storage without plodding through sequentially—is essential to the way we use computers today. IBM’s RAMAC (Random Access Method of Accounting and Control) magnetic disk drive pioneered this ability.

The RAMAC 350 storage unit could hold the equivalent of 62,500 punched cards: 5 million characters.
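A quick sketch checks that comparison (assuming the 80-character punched card of the era) and contrasts sequential search with keyed, random-style access. The account records below are invented for illustration.

```python
# Capacity check: 62,500 cards at 80 characters each.
CARD_CHARS = 80
print(62_500 * CARD_CHARS)     # -> 5,000,000 characters

records = {i: f"account {i}" for i in range(100_000)}   # hypothetical account records

def find_sequential(wanted):
    """Scan from the first record until the one we want turns up."""
    for key in records:          # may touch thousands of records along the way
        if key == wanted:
            return records[key]

def find_random(wanted):
    """Jump straight to the record by its key, RAMAC-style."""
    return records[wanted]       # one lookup, no scanning

print(find_sequential(99_999) == find_random(99_999))   # True, but very different amounts of work
```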


RAMAC actuator and disk stack, 1956


Supercomputers: The Fastest Brains for the Biggest Problems

Super is relative. Every era has supercomputers, but the definition shifts as technology advances. Today’s supercomputers may be tomorrow’s PCs.





Supercomputers tackle the most calculation-intense problems, such as predicting weather, codebreaking, and designing nuclear bombs. Early supercomputers were one-of-a-kind machines for the government or military—the only customers who could afford them.

The Cray-1 Supercomputer

Featuring a central column surrounded by a padded, circular seat, the Cray-1 looked like no other computer. And performed like no other computer. It reigned as the world’s fastest from 1976 to 1982.

Its distinctive design reflected Seymour Cray’s innovative engineering solutions and theatrical flair. The round tower minimized wire lengths, while the distinctive bench concealed power supplies. Densely packed integrated circuits and a novel cooling system reflected Cray’s attention to “packaging and plumbing.”

The Cray-1 was 10 times faster than competing machines. But speed came at a cost. It sold for up to $10M and drew 115 kW of power, enough to run about 10 homes.

Over 60 miles of wire snaked through the Cray-1, with no segment longer than 3’ to minimize signal delays.
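A back-of-the-envelope calculation suggests why wire length mattered, assuming signals travel at roughly two-thirds the speed of light in a wire (an approximation for illustration, not a figure from Cray's documentation).

```python
# Rough propagation delay along a 3-foot wire.
SPEED_OF_LIGHT = 3.0e8              # metres per second
signal_speed = 0.66 * SPEED_OF_LIGHT
wire_length = 3 * 0.3048            # 3 feet in metres

delay_ns = wire_length / signal_speed * 1e9
print(f"{delay_ns:.1f} ns per 3-foot wire")   # a few nanoseconds per run of wire
```

Even a few nanoseconds per run of wire adds up when a machine performs many millions of operations each second, which is one reason Cray packed everything so tightly.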


Cray-1A Supercomputer, 1976


Minicomputers: Less is More - Smaller, Simpler, Cheaper

This new kind of computer, smaller and simpler than mainframes, was designed to interact directly with users and the outside world. A flexible, inexpensive tool, it brought computers within the reach of a larger and more diverse range of customers.





Minis also sparked a new generation of computer companies. Competition accelerated innovation and reduced prices, spurring broad adoption.

DEC’s Blockbuster: The PDP-8

The Canadian Chalk River Nuclear Lab approached Digital Equipment Corporation in 1964. It needed a special device to monitor a reactor.

Instead of designing a custom, hard-wired controller as expected, young DEC engineers C. Gordon Bell and Edson de Castro did something unusual: they developed a small, general purpose computer and programmed it to do the job.

A later version of that machine became the PDP-8, one of the most successful computers of the next decade.


PDP-8, 1965


Digital Logic: How Digital Computers Compute

All digital computers work on the same principle: manipulating on/off signals to implement logic functions.





There have been many ways to generate those on/off signals, from mechanical devices to electromagnetic relays, vacuum tubes, transistors, and integrated circuits (ICs). This evolution brought ever-faster, smaller components, yielding dramatic improvements in capacity and cost that transformed computers from specialty tools to everyday devices.
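A minimal sketch of the principle: the same two-input on/off operation, repeated, is enough to build arithmetic. Here NAND gates combine into a half adder that adds two one-bit numbers; the gate-level construction is a textbook one, not tied to any specific machine in the gallery.

```python
# Every signal is 0 or 1; every gate is the same simple operation.

def NAND(a, b):
    return 0 if (a and b) else 1

def half_adder(a, b):
    """Add two one-bit numbers using only NAND gates."""
    n = NAND(a, b)
    total = NAND(NAND(a, n), NAND(b, n))   # XOR built from NANDs: the sum bit
    carry = NAND(n, n)                     # AND built from NANDs: the carry bit
    return total, carry

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))   # (sum bit, carry bit)
```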

Moore's Law

The number of transistors and other components on integrated circuits will double every year for the next 10 years. So predicted Gordon Moore, Fairchild Semiconductor’s R&D Director, in 1965.

“Moore’s Law” came true. In part, this reflected Moore’s accurate insight. But Moore also set expectations—inspiring a self-fulfilling prophecy.



Doubling chip complexity doubled computing power without significantly increasing cost. The number of transistors per chip rose from a handful in the 1960s to billions by the 2010s.
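The arithmetic behind that growth is easy to check. The starting count and the doubling period below are rough assumptions for illustration, not exact industry figures.

```python
# Repeated doubling: from a handful of transistors in the mid-1960s
# to billions a few decades later.
transistors = 64        # a mid-1960s chip, order of magnitude
year = 1965
while year < 2015:
    transistors *= 2    # one doubling
    year += 2           # roughly every couple of years
print(f"{year}: about {transistors:,} transistors")   # well into the billions
```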


Moore's Law graph


Artificial Intelligence & Robotics: Trying to Make Computers Human

Mechanical servants. Automated employees. We’ve long imagined machines able to replicate human thought and action.





Computers provide the sophistication needed for human-like behavior. But getting machines to actually think like people has proved stubbornly elusive. It’s unclear how far we feasibly can go, but ongoing attempts to create Artificial Intelligence (AI) have yielded a vast array of beneficial products and services.

Shakey

Robots require intelligence to understand sensory input, make plans, and take actions. That makes them ideal for testing many AI concepts.

Shakey, developed at the Stanford Research Institute (SRI) from 1966 to 1972, was the first mobile robot to reason about its actions. Shakey’s playground was a series of rooms with blocks and ramps. Although not a practical tool, it led to advances in AI techniques, including visual analysis, route finding, and object manipulation.
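Route finding of the kind Shakey pioneered can be sketched as a search over a map. The grid, obstacles, and coordinates below are invented for illustration; Shakey's own planners were far more elaborate.

```python
# Breadth-first search over a small grid of rooms, avoiding blocked cells.
from collections import deque

GRID = ["....",
        ".##.",
        "...."]          # '#' marks a block the robot cannot cross

def route(start, goal):
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        r, c = path[-1]
        if (r, c) == goal:
            return path
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < len(GRID) and 0 <= nc < len(GRID[0])
                    and GRID[nr][nc] != "#" and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(path + [(nr, nc)])

print(route((0, 0), (2, 3)))   # shortest list of cells from corner to corner
```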


Shakey, 1969


Input & Output: Human-Computer Interaction

Computers have always been good at calculations and data processing. But to evolve from specialized devices to a universal tool required more efficient ways to “talk” to people.





Early computers communicated primarily with coded text. Gradually they learned to use images. The development of graphical interfaces was key to creating powerful hardware and software systems that anyone could use.

Xerox Alto: Computers for “Regular Folks”

A mouse. Removable data storage. Networking. A visual user interface. Easy-to-use graphics software. “What You See Is What You Get” (WYSIWYG) printing, with printed documents matching what users saw on screen. E-mail. Alto for the first time combined these and other now-familiar elements in one small computer.

Developed by Xerox as a research system, the Alto marked a radical leap in the evolution of how computers interact with people, leading the way to today’s computers.

By making human-computer communications more intuitive and user friendly, Alto and similar systems opened computing to wide use by non-specialists, including children.

People were able to focus on using the computer as a tool to accomplish a task rather than on learning their computer’s technical details.


Alto I CPU, 1973


Computer Graphics, Music & Art: Computers and Creativity

Computers were originally devised to calculate. But they are increasingly used to create.





Computers have grown into a powerful medium for enjoying, sharing, and creating art, music, and film. We are also continually exploring and expanding the computer’s potential to generate new works, redefining the very idea of creativity and testing the boundaries of what it means to be an artist.

The Utah Teapot

Computers manipulate data. So, how do you get them to generate images? By representing images as data.

Martin Newell at the University of Utah used a teapot as a reference model in 1975 to create a dataset of mathematical coordinates. From that he generated a 3D “wire frame” defining the teapot’s shape, adding a surface “skin.”
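The underlying idea—a shape as a list of coordinates plus a list of connections—can be sketched with something much simpler than a teapot. The cube below is an illustrative stand-in for Newell's dataset, not part of it.

```python
# A wire frame is just data: vertex coordinates plus the edges joining them.

vertices = [(x, y, z) for x in (0, 1) for y in (0, 1) for z in (0, 1)]

# Join every pair of vertices that differ in exactly one coordinate: the cube's 12 edges.
edges = [(i, j)
         for i in range(len(vertices)) for j in range(i + 1, len(vertices))
         if sum(a != b for a, b in zip(vertices[i], vertices[j])) == 1]

print(len(vertices), "vertices,", len(edges), "edges")   # 8 vertices, 12 edges
```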

For 20 years, programmers used Newell’s teapot as a starting point, exploring techniques of light, shade, and color to add depth and realism.


The Utah Teapot, ca. 1974


Computer Games: Playing on Computers

To find the earliest computer games, find the earliest computers. Games have always been part of computing. Some were created for tests or demonstrations. Others merely reflect that computer pioneers were human—and humans play.





Games illustrate that “fun” or “entertaining” need not mean “simple.” Their increasingly sophisticated hardware and software mirror the evolution and transformations in the complexity, power, and size of computers.

Pong: “The machine is broken.”

That terse message summoned Al Alcorn to Andy Capp’s bar in Sunnyvale two weeks after Alcorn had installed the Pong arcade game. Pong’s problem? Popularity. Its milk carton coin-catcher was jammed with quarters.



Pong heralded a gaming revolution. Mechanical arcade games like pinball had appeared in the late 1800s. Pong, designed by Alcorn for Atari in 1972, launched the video game craze that transformed and reinvigorated the old arcades and made Atari the first successful video game company.


Pong prototype, 1972


Personal Computers: Computers for Everyone

Computers evolved primarily for military, scientific, government, and corporate users with substantial needs…and substantial budgets. They populated labs, universities, and big companies. Homes? Small businesses? Not so much.





Over time, however, costs dropped. Equally important, computers grew sophisticated enough to hide their complex, technical aspects behind a user-friendly interface. Individuals could now afford and understand computers, which dramatically changed everyday life.

The Apple II

When it debuted in 1977, the Apple II was promoted as an extraordinary computer for ordinary people. The user-friendly design and graphical display made Apple a leader in the first decade of personal computing.

Unlike the earlier Apple I, for which users had to supply essential parts such as a case and power supply, the Apple II was a fully realized consumer product. Design and marketing emphasized simplicity, an everyday tool for home, work, or school.


Apple II, 1977


The IBM PC

Many companies were dubious. Could small personal computers really be serious business tools? The IBM name was a reassuring seal of approval.

IBM introduced its PC in 1981 with a folksy advertising campaign aimed at the general public. Yet the IBM PC had its most profound impact in the corporate world. Companies bought PCs in bulk, revolutionizing the role of computers in the office—and introducing the Microsoft Disk Operating System (MS-DOS) to a vast user community.


IBM Personal Computer, 1981


Taking it with You

Early computers were so heavy that the floor below sometimes needed reinforcing. Today, computers slip into purses or pockets and are misplaced as easily as keys.







Miniaturization and falling costs made it possible to take computers everywhere, and to merge them with devices like phones and cameras. Wireless communication over global networks weaves computing into our lives wherever we go.

The PalmPilot

The PalmPilot was the first wildly popular handheld computer. Its success helped bridge the previously separate worlds of the electronic organizer, the PC, and later, the mobile phone.

The PalmPilot succeeded by redefining the handheld as an accessory to the personal computer, not its replacement. Winning features included seamless one-button synchronization with the PC, handwriting recognition that really worked, easy-to-use organizer functions, fast responses, pocket size, and an affordable price of $299.


PalmPilot handheld computer, 1996


Networking: Connecting Computers

Networking has transformed computers from stand-alone data-crunchers into the foundation of an unprecedented global community. Networking rests on a simple concept: getting computers to communicate with each other.





This requires a physical connection, like wires or radio links, and a common language (protocol) for exchanging data. Once these are in place comes the layer we see: information systems like the Web.
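Here is a minimal sketch of those two ingredients: a local socket pair standing in for the wire, and a toy protocol (a 4-byte length header followed by the message bytes) that both ends agree on. The protocol and message are invented for illustration and are not an Internet standard.

```python
# Two endpoints can exchange data only because they share both a connection
# and an agreed format for what travels over it.
import socket, struct

def send_msg(conn, text):
    data = text.encode("utf-8")
    conn.sendall(struct.pack("!I", len(data)) + data)   # header, then payload

def recv_msg(conn):
    header = conn.recv(4)
    (length,) = struct.unpack("!I", header)
    return conn.recv(length).decode("utf-8")

# A pair of connected local sockets stands in for the wire between two machines.
a, b = socket.socketpair()
send_msg(a, "HELLO from node A")
print(recv_msg(b))   # the receiver understands it because both ends share the protocol
```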

ARPANET: Networking Takes Off

In the late 1960s, the U.S. Department of Defense’s Advanced Research Projects Agency (ARPA) adopted packet switching for its ARPANET computer network. Although Larry Roberts at MIT had run trials for ARPA, this was the first large-scale experiment in general-purpose computer networking. It evolved as part of a fertile collaboration among government, academia, and industry.

But soon ARPANET wasn’t alone. If it weren’t for ARPANET’s massive funding from the Defense Department, Britain’s NPL network or France’s Cyclades network might have overtaken it.


Interface Message Processor (IMP), ca. 1965


Connecting People

Networks connect computers to each other. But how do people use those connections?



Information systems like the Web let us share content such as text, pictures, or music. The Web running over the Internet has become our global commons, absorbing older media—from video to books and telephone calls—and transforming how we work, buy, and stay in touch.

WorldWideWeb browser-editor

The first browser, developed in 1990, was also an editor for creating a personal “web” of linked documents. Selecting text and clicking “Link to Marked” created a linked page. HTML and URLs were there, but hidden.


WorldWideWeb browser-editor, 1993


Credits: Story

About this exhibition—Revolution: The First 2000 Years of Computing is the Computer History Museum's major permanent exhibition. The Museum is dedicated to exploring the history of computing and its impact on society. The Museum would like to thank everyone who has made this institution a reality.
