Explore the experiments at the crossroads of art and technology, created by artists and creative coders with collections from Google Arts & Culture partners.
Want to try our experiments?
Play with color using Art Palette.
Browse the LIFE Magazine archive with the LIFE Tags experiment.
Explore an interactive archive of past exhibitions, the result of our collaboration with MoMA.
Whether helping physicians identify disease or finding photos of “hugs,” AI is behind a lot of the work that Google does. And here at the Google Arts & Culture Lab in Paris, we’ve been experimenting with how AI can be used for the benefit of culture. We’re sharing our latest experiments—prototypes that build on seven years of work in partnership with 1,500 cultural institutions around the world. Each of these experimental applications runs AI algorithms in the background to let you unearth cultural connections hidden in archives—and even find artworks that match your home decor.
The selection of artworks is from Google Arts & Culture, shared by museums and archives around the world. (Due to performance limitations on some devices, you should open one experiment at a time.)
Discover Art Palette.
From web design to interior design, color schemes play a fundamental role in creating cohesive user experiences, establishing brand identities, and communicating moods or emotions. So we thought: why not investigate color palettes in art?
Art Palette works as a search engine that finds artworks based on your chosen color palette. Using this tool, you can explore how the same five colors from Van Gogh's Irises can be related to a 16th-century Iranian folio or Monet's Water Lilies.
Art Palette can help creative experts in art, design and beyond to make informed choices regarding color palettes, understanding the context and history behind each one.
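To make the idea concrete, here is a minimal sketch of how a palette-based search could rank artworks, assuming each artwork is reduced to five RGB swatches. The artwork names, color values, and distance metric below are illustrative assumptions, not Art Palette's actual data or algorithm.

```python
from math import dist

# Hypothetical five-color palettes (RGB tuples) for two artworks.
# These values are made up for illustration.
artworks = {
    "Irises": [(72, 91, 62), (120, 134, 170), (205, 200, 176),
               (54, 60, 102), (160, 170, 120)],
    "Water Lilies": [(90, 110, 140), (140, 160, 150), (200, 210, 190),
                     (70, 90, 110), (120, 140, 130)],
}

def palette_distance(p1, p2):
    """Sum of per-swatch RGB distances; assumes swatches are aligned."""
    return sum(dist(c1, c2) for c1, c2 in zip(p1, p2))

def closest_artwork(query, catalog):
    """Return the artwork whose palette is nearest to the query palette."""
    return min(catalog, key=lambda name: palette_distance(query, catalog[name]))
```

A real system would also handle swatch ordering and perceptual color spaces; this sketch only shows the nearest-neighbor idea.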
Discover the LIFE Tags experiment.
Beginning in 1936, LIFE Magazine captured some of the most iconic moments of the 20th century. In its 70-year run, millions of photos were shot for the magazine, but only 5 percent of them were published at the time. Some 4 million of those photos are now available for anyone to look through. But with an archive that stretches 6,000 feet (about 1,800 meters) across three warehouses, where would you start exploring?
The experiment LIFE Tags uses Google’s computer vision algorithm to scan, analyze and tag all the photos from the magazine’s archives, from the A-line dress to the zeppelin. Using thousands of automatically created labels, the tool turns this unparalleled record of recent history and culture into an interactive web of visuals everyone can explore.
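Once every photo carries a set of labels, browsing by tag amounts to inverting that mapping so each label points back to its photos. A minimal sketch, with made-up photo IDs and labels standing in for the real model output:

```python
from collections import defaultdict

# Hypothetical label output for a few archive photos; in reality the
# labels come from a computer-vision model run over the full archive.
photo_labels = {
    "photo_001": ["zeppelin", "sky"],
    "photo_002": ["a-line dress", "model"],
    "photo_003": ["zeppelin", "crowd"],
}

def build_index(labels_by_photo):
    """Invert photo -> labels into label -> photos for tag browsing."""
    index = defaultdict(set)
    for photo, labels in labels_by_photo.items():
        for label in labels:
            index[label].add(photo)
    return index

index = build_index(photo_labels)
```

Clicking a tag in the web of visuals is then just a lookup, e.g. `index["zeppelin"]` returns every photo the model labeled with that tag.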
Identifying Artworks Through Machine Learning
Starting with its first exhibition in 1929, The Museum of Modern Art in New York photographed its exhibitions. While the photos documented important chapters of modern art, they lacked information about the works in them.
To identify the art in the photos, one would have had to comb through 30,000 photos—a task that would take months even for the trained eye. The tool built in collaboration with MoMA did the work of automatically identifying artworks—over 20,000 of them—and helped turn this repository of photos into an interactive archive of MoMA's exhibitions.
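One way to picture the identification step is as image matching: compute a compact fingerprint for each known artwork and for each region cropped from an exhibition photo, then find the closest fingerprint. The average-hash approach below is a deliberately simple stand-in; the actual MoMA tool's method is not described here, and likely relies on learned visual features rather than this toy hash.

```python
def average_hash(pixels):
    """Simple perceptual hash: 1 bit per pixel, above/below the mean.
    `pixels` is a small grayscale grid (list of lists of ints 0-255)."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(b1 != b2 for b1, b2 in zip(h1, h2))

def identify(crop_hash, known_hashes, max_distance=1):
    """Match a cropped region against known artwork hashes, or None."""
    best = min(known_hashes, key=lambda name: hamming(crop_hash, known_hashes[name]))
    return best if hamming(crop_hash, known_hashes[best]) <= max_distance else None
```

At archive scale, running this kind of matcher over every photo replaces months of manual inspection with a batch job.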