How machines absorb human associations between words

A look at Google PAIR’s project, Waterfall of Meaning.

By Barbican Centre

Proposal image, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

“Google PAIR’s project Waterfall of Meaning is a poetic glimpse into the interior of an AI, showing how a machine absorbs human associations between words.”

Developed for the Barbican's exhibition AI: More than Human, 'Waterfall of Meaning' was created by Google PAIR (People + AI Research), an initiative devoted to advancing the research and design of people-centric AI systems.

Whiteboard sketching, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

The starting point

When the Barbican reached out to PAIR about creating an installation for its More than Human exhibition, the team knew it wanted to show visitors machine learning in a way that would be relatable. Language is something everyone uses, and associating words is something humans do and machine learning models learn.

Early designer's sketch, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

Early ideas for how the words progress down the waterfall.

Early projection test, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

The project in a nutshell

Language is subtle. Words may have multiple meanings, and the same word can be used in different ways. This flexibility speaks to the richness of human language, full of associations and subtext. We may subconsciously consider certain words to be new or old, good or bad, male or female. When machines “read” text (books, articles, letters), they start to learn how language is used and to pick up on the multiple associations that may exist. For example: how are the words “caviar” and “pizza” used? Do they show up in the same context? Are they used in the same way?

Long exposure, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

Waterfall of Meaning uses a technology called ‘word embeddings’ to analyse millions of existing English sentences and map words’ meanings based on their use. When a word crosses one of the axes, its location shows where it falls on the spectrum: more or less male or female, for instance.

The code behind the Waterfall, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

The technology

Waterfall of Meaning is based on a technology in which a machine learning system analyzes millions of sentences to create a geometric "map" of word meanings. Such maps, known as "word embeddings," have become common in modern AI software. However, unlike a conventional map (which exists in two dimensions) or a globe (in three), the word embedding used in this piece exists in a space of hundreds of dimensions. Word embeddings are like dictionaries for computers. Just as a person can look up the meaning of a word in a dictionary, a computer can look up a word in a word embedding to understand its meaning.
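The dictionary analogy can be sketched in a few lines of code. This is a toy illustration only: the three-dimensional vectors below are made up, whereas a real embedding has hundreds of dimensions learned from millions of sentences.

```python
import numpy as np

# Toy "word embedding": a map from words to vectors. The values here
# are invented for illustration, not taken from any real model.
embedding = {
    "pizza":  np.array([0.9, 0.1, 0.3]),
    "caviar": np.array([0.8, 0.2, 0.9]),
    "zombie": np.array([0.1, 0.9, 0.2]),
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, near 0 unrelated."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "Looking up" a word is just indexing into the map; comparing two
# words is measuring the angle between their vectors.
sim_food = cosine(embedding["pizza"], embedding["caviar"])
sim_odd = cosine(embedding["pizza"], embedding["zombie"])
print(sim_food > sim_odd)  # pizza sits closer to caviar than to zombie
```

Words that appear in similar contexts end up with vectors pointing in similar directions, which is how the system can answer "are 'caviar' and 'pizza' used in the same way?"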

Axes brainstorming, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

These dimensions help the machine represent some of the subtleties of language usage: in a sense, it's a way of transforming meaning into math. Certain directions in a word embedding map may reflect contrasts such as female vs. male, or good vs. bad. Understanding how these implicit dimensions form is currently a subject of great interest, both as a scientific question and as a type of transparency, helping us peer inside the black box of this type of AI. But we can also view the embedding as a model of how humans have collectively used words, giving us a way to measure connotations quantitatively. In the end, this piece is not so much a portrait of a machine as a picture of how humans speak.
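One common way to turn such a contrast into a number is to estimate an axis as the difference between paired word vectors and project each word onto it. A minimal sketch, again with invented toy vectors rather than real embedding values:

```python
import numpy as np

# Estimate a "female vs. male" direction from a word pair.
# These 3-D vectors are illustrative stand-ins for real embeddings.
vec = {
    "she":   np.array([ 0.8, 0.3, 0.1]),
    "he":    np.array([-0.7, 0.3, 0.1]),
    "nurse": np.array([ 0.6, 0.5, 0.2]),
    "robot": np.array([ 0.1, 0.2, 0.9]),
}

gender_axis = vec["she"] - vec["he"]
gender_axis = gender_axis / np.linalg.norm(gender_axis)

def axis_score(word):
    """Signed projection: positive leans toward 'she', negative toward 'he'."""
    return float(vec[word] @ gender_axis)

for w in ("nurse", "robot"):
    print(w, round(axis_score(w), 2))
```

In the installation, a word's horizontal position as it crosses an axis corresponds to exactly this kind of signed score along a contrast direction.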

Pre-design waterfall, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

To this end, any biases found in the data will be reflected in the model as well. For this reason, it's paramount to ensure that these models are not used in ways that reinforce existing biases, and to continue developing ways to de-bias them.

Whiteboard ideation, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

The process

PAIR had done a significant amount of work on word embeddings prior to this exhibit, including the Embedding Projector, a tool for visualizing embeddings in high-dimensional space. Members of the team have also worked in this area: Tolga Bolukbasi explored and quantified machine learning bias through word embeddings, and Been Kim created TCAV, a method for quantitatively interpreting a model’s output using embeddings.

Early design idea, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

Exploring the Meaning of Human Word Usage

"We were motivated by the biases that exist within languages, and are culturally propagated through the written and spoken word. Our prior work in word embeddings had given us a way to quantify and visualize biases learned by models through word embeddings, and we wanted to find a way to share this with the world: invite people to discover, explore, play and reflect on them."

Early white and black design, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

"We picked axes that were predominantly opposites across cultures: male and female; good and bad; cheap and expensive, to name a few. Surprisingly, we found that male and female were not consistently opposite across cultures."

Early design idea, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

"We started with some quick explorations to see how best to visualize specific words, answering simple questions like 'is the color red more male or female?' and 'is potato cheap or expensive?'. Conversely, we also explored how many associated words behaved across our axes.

We found several commonly known biases: 'nurse' was almost always more female than male; some unexpected ones: 'zombie' and 'robot' are more female but 'monster' and 'demon' are more male; and even some funny ones: why are 'bagels' weak?"

Having fun with early iterations, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

"What started to quickly catch our attention is how the same word was biased along different axes, and in relation to the words around it. We explored various animation styles to show how the words flowed through the axes along their own currents of meaning."

Person immersed in words, Waterfall of Meaning (2019) by People + AI Research (PAIR), Barbican Centre

"To execute this idea, we precomputed thousands of word embeddings by sending words through a pretrained neural net and extracting the activations. We then visualized these embeddings and projected them on a large area (3m by 6m) to give the viewer the feeling of being immersed in these words and their meanings."
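The precompute-and-reuse pattern the team describes can be sketched as follows. The network weights here are random stand-ins for a real pretrained model (which the source does not specify), but the mechanics are the same: run each word forward, keep a hidden-layer activation as its embedding, and store the results in a table built once up front.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained network: random weights used purely to
# illustrate the mechanics. The installation used a real pretrained
# model; its architecture is not described in the source.
W_hidden = rng.normal(size=(50, 32))  # input -> hidden layer
W_output = rng.normal(size=(32, 10))  # hidden -> output (unused below)

def one_hot(index, size=50):
    v = np.zeros(size)
    v[index] = 1.0
    return v

def embed(word_index):
    """Forward pass that keeps the hidden activation as the embedding."""
    return np.tanh(one_hot(word_index) @ W_hidden)

# Precompute embeddings for the whole vocabulary once, then the
# visualization only has to read from this table at display time.
vocab_size = 50
table = np.stack([embed(i) for i in range(vocab_size)])
print(table.shape)  # one 32-dimensional embedding per word
```

Precomputing keeps the installation responsive: the expensive neural-net forward passes happen offline, and the projection onto the waterfall's axes becomes a cheap lookup.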

Team photo, People + AI Research (PAIR), Barbican Centre

The team behind 'Waterfall of Meaning':

Nikhil Thorat, Martin Wattenberg, Lauren Hannah-Murphy, Emily Reif, Tolga Bolukbasi, Mahima Pushkarna, Fernanda Viegas.

Credits: Story

People + AI Research (PAIR) is devoted to advancing the research and design of people-centric AI systems. We're interested in the full spectrum of human interaction with machine intelligence, from supporting engineers to understanding everyday experiences with AI.

Our goal is to do fundamental research, invent new technology, and create frameworks for design in order to drive a human-centered approach to artificial intelligence. And we want to be as open as possible: we’re building open source tools that everyone can use, hosting public events, and supporting academics in advancing the state of the art.

AI: More Than Human is a major exhibition exploring creative and scientific developments in AI, demonstrating its potential to revolutionise our lives. The exhibition takes place at the Barbican Centre, London, from 16 May to 26 Aug 2019.

Part of Life Rewired, our 2019 season exploring what it means to be human when technology is changing everything.
