Explore the experiments at the crossroads of art and technology, created by artists and creative coders with collections from Google Arts & Culture partners
Whether helping physicians identify disease or finding photos of “hugs,” AI is behind a lot of the work that Google does. And here at the Google Arts & Culture Lab in Paris, we’ve been experimenting with how AI can be used for the benefit of culture. We’re sharing our latest experiments – prototypes that build on seven years of work in partnership with 1,500 cultural institutions around the world. Each of these experimental applications runs AI algorithms in the background to let you unearth cultural connections hidden in archives – and even find artworks that match your home decor.
Putting Machine Learning to work for culture #GoogleArts
Art Palette Experiment
Explore the World in Color With Art Palette
From web to interior design, color schemes play a fundamental role in creating cohesive user experiences, establishing brand identities and communicating moods or emotions. So we thought – why not investigate color palettes in art? With Art Palette, you can find artworks that match your chosen color palette.
Feed your creativity with colors on Art Palette #GoogleArts
Irises (May 1890) by Vincent van Gogh, Van Gogh Museum
Art Palette works as a search engine that finds artworks based on your chosen color palette. Using this tool, you can explore how the same five colors from Van Gogh's Irises can be related to a 16th-century Iranian folio or Monet's water lilies.
Art Palette can also help creative experts in art, design and beyond make informed choices about color palettes by revealing the context and history behind each one.
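If you're curious how a color-palette search like this could work under the hood, here is a minimal sketch in Python. It is not the actual Art Palette implementation: it simply assumes a five-color palette extracted with k-means and ranks artworks by how closely their palettes can be paired up color by color. The file names and catalog are hypothetical.

```python
# A rough sketch of palette-based search, NOT the real Art Palette pipeline:
# extract a 5-color palette per artwork with k-means, then rank artworks by
# how closely their palettes match a query palette.
import numpy as np
from PIL import Image
from sklearn.cluster import KMeans
from scipy.optimize import linear_sum_assignment


def extract_palette(path, n_colors=5):
    """Return the n_colors dominant RGB colors of an image (k-means centroids)."""
    pixels = np.asarray(Image.open(path).convert("RGB").resize((100, 100)))
    pixels = pixels.reshape(-1, 3).astype(float)
    return KMeans(n_clusters=n_colors, n_init=4, random_state=0).fit(pixels).cluster_centers_


def palette_distance(a, b):
    """Distance between two palettes: cost of the best one-to-one pairing of their colors."""
    cost = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)  # pairwise color distances
    rows, cols = linear_sum_assignment(cost)                       # optimal color pairing
    return cost[rows, cols].sum()


def search(query_palette, catalog):
    """Rank catalog entries (title -> palette) from most to least similar to the query."""
    return sorted(catalog, key=lambda title: palette_distance(query_palette, catalog[title]))


# Usage with hypothetical file names:
# irises = extract_palette("irises_van_gogh.jpg")
# catalog = {"Water Lilies": extract_palette("monet_water_lilies.jpg")}
# print(search(irises, catalog)[:10])
```

Comparing colors in a perceptual space such as CIELAB, rather than raw RGB, would track human judgments of color similarity more closely.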
LIFE Tags Experiment
LIFE Tags
Browse through the 20th century with the LIFE Tags experiment, organized by machine learning, by Gaël Hugo.
Beginning in 1936, LIFE Magazine captured some of the most iconic moments of the 20th century. In its 70-year run, millions of photos were shot for the magazine, but only 5 percent of them were published at the time. Four million of those photos are now available for anyone to look through. But with an archive that stretches 6,000 feet (about 1,800 meters) across three warehouses, where would you start exploring?
The experiment LIFE Tags uses Google’s computer vision algorithm to scan, analyze and tag all the photos from the magazine’s archives, from the A-line dress to the zeppelin. Using thousands of automatically created labels, the tool turns this unparalleled record of recent history and culture into an interactive web of visuals everyone can explore.
So whether you’re looking for astronauts, an Afghan Hound or babies making funny faces, you can navigate the LIFE Magazine picture archive and find them with the push of a button.
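As a rough illustration of how an archive could be auto-tagged and browsed, here is a short Python sketch. The article doesn't say which model LIFE Tags uses internally, so this assumes the Cloud Vision API's label detection as a stand-in, and the photo paths are hypothetical.

```python
# A minimal sketch of auto-tagging a photo archive and building a tag -> photos
# index; the Cloud Vision API's label detection stands in for whatever internal
# model LIFE Tags actually uses.
from collections import defaultdict

from google.cloud import vision

client = vision.ImageAnnotatorClient()


def label_photo(path, min_score=0.7):
    """Return labels the model is reasonably confident about for one photo."""
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    return [l.description.lower() for l in response.label_annotations if l.score >= min_score]


def build_index(photo_paths):
    """Map each label to the photos it appears in, forming a browsable web of tags."""
    index = defaultdict(list)
    for path in photo_paths:
        for tag in label_photo(path):
            index[tag].append(path)
    return index


# Usage with hypothetical paths:
# index = build_index(["life/astronaut_001.jpg", "life/zeppelin_1937.jpg"])
# print(index.get("zeppelin", []))
```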
MoMA & Machine Learning
Identifying Artworks Through Machine Learning
Starting with its first exhibition in 1929, The Museum of Modern Art in New York took photos of its exhibitions. While the photos documented important chapters of modern art, they lacked information about the works in them.
To identify the art in the photos, one would have had to comb through 30,000 photos – a task that would take months even for a trained eye. The tool built in collaboration with MoMA automatically identified artworks – over 20,000 of them – and helped turn this repository of photos into an interactive archive of MoMA’s exhibitions.
Discover the results of the collaboration between MoMA and Google Arts & Culture.
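For a sense of how artworks in exhibition photos could be matched against a known collection, here is an illustrative Python sketch. The tool's internals are not described here, so this simply compares deep image embeddings from an off-the-shelf network; the model choice, threshold, file names and identifiers are assumptions.

```python
# An illustrative sketch, not the actual MoMA tool: match a cropped artwork from
# an exhibition photo against a catalog of known works by comparing embeddings
# from a pretrained CNN.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# A pretrained network with its classification head removed acts as a feature extractor.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = torch.nn.Identity()
backbone.eval()

preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])


@torch.no_grad()
def embed(path):
    """L2-normalized embedding of one image."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    v = backbone(x).squeeze(0)
    return v / v.norm()


def identify(crop_path, catalog_embeddings, threshold=0.8):
    """Return the catalog work most similar to the cropped artwork, or None if
    nothing is similar enough. The 0.8 cosine-similarity threshold is an assumption."""
    query = embed(crop_path)
    best_id, best_sim = None, -1.0
    for work_id, vec in catalog_embeddings.items():
        sim = float(query @ vec)  # cosine similarity, since both vectors are normalized
        if sim > best_sim:
            best_id, best_sim = work_id, sim
    return best_id if best_sim >= threshold else None


# Usage with hypothetical identifiers and paths:
# catalog = {"moma_79802": embed("collection/79802.jpg")}
# print(identify("exhibition_1929_crop.jpg", catalog))
```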
Want to try our experiments?
Play with color with Art Palette. Browse the LIFE Magazine archive with the LIFE Tags experiment. Explore an interactive archive of MoMA's past exhibitions with the results of our collaboration with MoMA. Discover more experiments from Google Arts & Culture.