Laurie Spiegel. Original Source: Wikimedia
The birth of computer music
The idea of using computers to make music was being explored as early as the 1950s. Pictured here is one of its pioneers, Laurie Spiegel (who recently contributed to the Hunger Games soundtrack), working on arguably the first digital synthesizer, the Bell Labs Hal Alles Synth. She also wrote the software Music Mouse, which automatically created an accompaniment based on the notes she played on the synthesizer's keyboards.
Amiga Platform – ProTracker. Original Source: Wikipedia
Computer audio goes mainstream
Despite the early work of Spiegel and others, it would be several decades before computers became a staple of the music-maker's arsenal. Up until the 90s, most personal computers weren't equipped to handle digital audio processing – and those that could were expensive, often bespoke machines.
Introduced in the mid-80s, Commodore's Amiga line of computers was among the first consumer machines to feature sample playback: short sounds could be recorded and then played back by sequencing programs such as ProTracker (pictured).
Amiga Demoszene (1990). Original Source: YouTube
The demoscene
Tracker software, which sequences music in vertical rows and relies on code-like key commands for effects, became very popular in tandem with the demoscene – a competitive computer art movement in which participants created short audio-visual showcases of programming and creative skill.
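To make that "code-like" quality concrete, here is a rough Python sketch of how a tracker pattern might be stored: a grid of rows played top to bottom, each holding a note, a sample number, and a terse effect command. The values and effect codes below are purely illustrative, not actual ProTracker data.

```python
# A minimal sketch of a tracker pattern: rows are played top-to-bottom, and
# each row holds a note, an instrument (sample) number, and an effect command.
pattern = [
    # (note, instrument, effect)   effect codes here are hypothetical examples
    ("C-3", 1, "A08"),   # e.g. "volume slide"
    ("---", 0, "000"),   # empty row: nothing new is triggered
    ("E-3", 1, "000"),
    ("G-3", 2, "C20"),   # e.g. "set volume"
]

for row, (note, instrument, effect) in enumerate(pattern):
    print(f"{row:02d} | {note} {instrument:02d} {effect}")
```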
Native Instruments – Reaktor. Original Source: Native Instruments
Virtual wires
1989 saw the introduction of Audio Max, an offshoot of the popular Max modular programming language. While it ran only on a proprietary digital signal processing (DSP) card in the $10,000 NeXT Cube workstation, Audio Max (which later became MSP) gave programmers the ability to synthesize sounds in real time inside the computer.
A more user-friendly (and accessible) option arrived in the shape of Generator, launched by Native Instruments in 1996. Renamed Reaktor in subsequent releases, it remains both a popular platform for commercial software synthesizers and a modular sandbox for creating and sharing new instruments and effects.
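As a rough illustration of the "virtual wires" idea, the Python sketch below models patching in the most generic way possible: small modules that each produce a stream of samples, connected by feeding one module's output into another's input. It's a toy, not how Max or Reaktor is actually implemented.

```python
import math

SAMPLE_RATE = 44_100

class SineOsc:
    """A signal-generating module: one sine oscillator."""
    def __init__(self, freq):
        self.freq, self.phase = freq, 0.0
    def tick(self):
        self.phase += 2 * math.pi * self.freq / SAMPLE_RATE
        return math.sin(self.phase)

class Gain:
    """A signal-processing module: scales whatever is patched into it."""
    def __init__(self, source, amount):
        self.source, self.amount = source, amount
    def tick(self):
        return self.source.tick() * self.amount

# "Patch" an oscillator into a gain module, then pull samples one by one,
# just as a real-time audio engine would.
patch = Gain(SineOsc(440.0), 0.5)
samples = [patch.tick() for _ in range(SAMPLE_RATE)]  # one second of audio
```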
Reason Studios – Reason. Original Source: Reason Studios
Familiar sounds, familiar looks
With advances in home and professional computing power in the mid-to-late 90s, it became possible for virtual instruments running on computers to accurately emulate desirable hardware electronic instruments. Swedish developer Propellerhead's ReBirth RB-338, for example, sought to model a number of classic Roland instruments from the 1980s. The company's flagship program, Reason (pictured), was designed as a virtual studio. Like most music software of the time, Reason adopted a 'skeuomorphic' approach to recreating hardware workflows on-screen – even going so far as to link instruments and effects together via virtual patch cables at the "back" of its on-screen environment.
Arturia Buchla Easel V. Original Source: Arturia
Reinventing the classics
ReBirth was not the only software that sought to bring the sound of prized (read: expensive or otherwise inaccessible) hardware instruments to the average computer musician. Affordability aside, a key benefit of these digital recreations is that they are not bound by the same constraints as their real-world equivalents. Many classic analog synthesizers, for example, are monophonic (only one note can be played at a time), either by design or simply to keep costs down. Adding polyphony to a digital recreation of such a synth is, relatively speaking, a great deal easier.
Pictured is the Buchla Easel V, a licensed software model of the Buchla Easel synthesizer available as part of Arturia’s V Collection of modeled virtual instruments.
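To see why polyphony comes almost for free in software, consider this minimal Python sketch: a single monophonic "voice" is just a function, and a chord is nothing more than several copies of that function mixed together. Hardware needs duplicated circuitry for each extra note; software only needs a loop.

```python
import math

SAMPLE_RATE = 44_100

def voice(freq, n_samples):
    """One monophonic voice: a plain sine wave at the given frequency."""
    return [math.sin(2 * math.pi * freq * n / SAMPLE_RATE) for n in range(n_samples)]

def poly(freqs, n_samples):
    """A chord: run one voice per note and mix (sum) them together."""
    voices = [voice(f, n_samples) for f in freqs]
    return [sum(frame) / len(freqs) for frame in zip(*voices)]

chord = poly([261.63, 329.63, 392.00], SAMPLE_RATE)  # C major triad, one second
```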
Yamaha DX7 + Dexed. Original Source: Image composed by the author, original image from Wikimedia
Improved interfaces
As well as expanding on the capabilities of classic synthesizers, virtual versions give developers the opportunity to address the design limitations of the original hardware. 1983's Yamaha DX7, now regarded as a classic FM (frequency modulation) synthesizer, was notoriously difficult to program: with a single slider for data input, only one of its 120 different parameters could be adjusted at a time. The free Dexed software synthesizer is inspired by the DX7, but uses the computer screen to lay out its controls in a more intuitive manner – delivering classic FM sounds with none of the hassle.
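For the curious, here is a hedged, two-operator sketch of the FM technique the DX7 is built on: one sine wave (the modulator) wobbles the phase of another (the carrier), producing rich timbres from very simple ingredients. The real DX7 uses six operators and far more parameters; this is only the core idea.

```python
import math

SAMPLE_RATE = 44_100

def fm_tone(carrier_hz, ratio, mod_index, seconds=1.0):
    """Two-operator FM: a modulator sine varies the carrier's phase."""
    out = []
    for n in range(int(seconds * SAMPLE_RATE)):
        t = n / SAMPLE_RATE
        modulator = math.sin(2 * math.pi * carrier_hz * ratio * t)
        out.append(math.sin(2 * math.pi * carrier_hz * t + mod_index * modulator))
    return out

# A brighter, bell-like tone: modulator at twice the carrier, moderate index.
bell = fm_tone(carrier_hz=440.0, ratio=2.0, mod_index=3.0)
```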
Native Instruments – Kontakt. Original Source: Native Instruments
Take a sample
By the early 2000s, software samplers like Native Instruments' Kontakt (pictured) gave musicians practically unlimited scope for manipulating, sequencing, and effecting recorded audio. Such programs make it possible to authentically recreate the sound of real-world instruments or to build entirely new ones.
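At its core, what a sampler does is surprisingly simple: replay a recording at different speeds to reach different pitches. The rough Python sketch below shows only that basic resampling idea – real samplers like Kontakt add interpolation, looping, envelopes, scripting, and much more.

```python
# A bare-bones sketch of sample playback at different pitches by resampling.
def play_at_pitch(sample, semitones):
    """Replay `sample` (a list of floats) shifted by `semitones`."""
    rate = 2 ** (semitones / 12)           # playback-speed ratio
    length = int(len(sample) / rate)
    return [sample[int(i * rate)] for i in range(length)]

recording = [0.0, 0.3, 0.8, 0.5, -0.2, -0.7, -0.3, 0.0]  # stand-in audio data
up_a_fifth = play_at_pitch(recording, 7)   # higher pitch, shorter duration
```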
Fairlight CMI. Original Source: Fairlight US Inc. Photo Archive
Bigger, better, cheaper
While a full licence for Kontakt may cost a few hundred dollars today, it's a far cry from the earliest hardware samplers in terms of both cost and functionality. 1979's Fairlight CMI (Computer Musical Instrument) was limited to a small number of very short sound samples and cost around $32,000 at the time.
Ableton screenshot. Original Source: Ableton
Flat & clean
As virtual instruments began to mature in the 2000s, a new generation of musicians emerged with less (or no) familiarity with hardware electronic music-making workflows. Programs such as Ableton Live (pictured), a groundbreaking digital audio workstation (DAW) for producing and performing music, recognised this and dispensed with skeuomorphism almost altogether. Live's knobs, sliders, and buttons are minimalist and two-dimensional, while other parameters are controlled with text-entry fields or drop-down menus.
AKAI – MPK25. Original Source: Wikimedia
MIDI control
Computers make the process of music production easier by enabling powerful and cost-effective virtual instruments, but a mouse and keyboard aren’t always the ideal setup for controlling things in the studio or playing live. That’s why MIDI (Musical Instrument Digital Interface), the standard protocol for connecting controllers (like keyboards) to electronic instruments and effects, has become integral to computer music-making. Pictured is an Akai MPK25. While laid out like a synthesizer, it doesn’t produce any sound of its own, but simply sends messages about notes and parameter changes to the computer.
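Those messages are short and simple. The Python sketch below decodes the two most common types – note-on and control change – from raw MIDI bytes; the example values are made up for illustration, not captured from an MPK25.

```python
# Decode the two most common MIDI messages from their raw three-byte form.
def decode(msg):
    status, data1, data2 = msg
    channel = status & 0x0F
    if status & 0xF0 == 0x90 and data2 > 0:          # note-on with velocity
        return f"note-on  ch{channel + 1}: note {data1}, velocity {data2}"
    if status & 0xF0 == 0xB0:                        # control change (CC)
        return f"control  ch{channel + 1}: CC {data1} = {data2}"
    return "other message"

print(decode((0x90, 60, 100)))  # middle C played fairly hard
print(decode((0xB0, 74, 32)))   # e.g. a knob mapped to CC 74 (often filter cutoff)
```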
Flow machine jam - 15th August. Original Source: YouTube
Creative controllerism
While MIDI controllers initially came in the familiar form of keyboards, knobs, and faders, the prevalence of virtual instruments has fuelled the rise of controllers dedicated to specific pieces of software. Native Instruments' software/hardware hybrid Maschine is a popular example: it features 16 drum pads, a range of knobs and buttons, and a pair of screens, enabling musicians to navigate the software without a mouse or keyboard.
Here, musician and developer Tim Exile demonstrates a live performance with his custom configuration of MIDI controllers.
Laptop on stage. Original Source: Photo by Vince Fleming on Unsplash
Laptops go live
Computers have become a familiar sight on stages since the first digital sequencers appeared in the 80s. In the 90s and 00s, advances in processing power allowed artists to make laptops the focal point of their live sets. Using dedicated performance software like Ableton Live or Apple MainStage, or building performance environments in languages such as Max, Reaktor, or SuperCollider, artists can now condense whole studios' worth of hardware into a portable live setup.
Output – Portal (Presets). Original Source: YouTube
New originals
Today, virtual instruments and effects provide an opportunity for new and novel takes on creating and processing sound – including features and workflows that would be difficult or even impossible with hardware. Output's Portal is an example of this new breed. It alters sounds using granular synthesis – a method of deconstructing audio into tiny "grains" which is only possible using computers. Control of its various parameters is made simple by the X/Y pad, which also provides real-time visual feedback.
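Granular synthesis sounds exotic, but the core idea fits in a few lines: slice the audio into tiny overlapping grains, shape each one with an envelope, and layer them back together at a different spacing to stretch or smear the sound. The Python sketch below is a bare-bones illustration of that idea, not Portal's actual algorithm.

```python
import math

def granulate(audio, grain_size=256, hop_in=128, hop_out=192):
    """Slice `audio` into windowed grains and overlap-add them at a new spacing."""
    out = [0.0] * (len(audio) * 2)
    # Hann-style envelope so each grain fades in and out smoothly.
    env = [0.5 - 0.5 * math.cos(2 * math.pi * n / grain_size) for n in range(grain_size)]
    pos_in, pos_out = 0, 0
    while pos_in + grain_size <= len(audio):
        grain = audio[pos_in:pos_in + grain_size]
        for n, s in enumerate(grain):        # overlap-add the windowed grain
            out[pos_out + n] += s * env[n]
        pos_in += hop_in                     # read position in the source
        pos_out += hop_out                   # write spacing > read spacing stretches time
    return out[:pos_out + grain_size]
```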
How The Internet's Steve Lacy Makes Hits With His Phone | WIRED. Original Source: YouTube
Music-making goes mobile
We've seen how, in the span of forty years or so, computer-based music-making evolved from an academic pursuit to something within the reach of the average music-maker. In fact, in a sign of just how far technology has come, it's highly likely that the device you're reading this on – be it a smartphone, a desktop computer, or something in between – has more than enough power to produce a hit record.
In this video, Steve Lacy of The Internet demonstrates his production process, which revolves almost entirely around GarageBand – a free app for iPhone. Using this setup, he's produced hits for J. Cole, Kendrick Lamar, Mac Miller, Solange, and many others.