When UNESCO approached me about creating a piece for their language preservation initiative, my first reaction was excitement. My second was fear.
How do you make art that carries the weight of a culture’s survival?
Learning to Listen Before Speaking
The project, which became Mother Tongue, involved five endangered languages: Ainu, Sámi, Mapuche, and two others whose communities preferred privacy. Before I touched a line of code, I listened. I sat with elders over video calls, heard poems and folktales read aloud, and let the rhythms and tones sink in.
These weren’t just “data points” — they were lifelines.
Teaching AI Respect
I trained a multimodal AI on these languages’ sound patterns, scripts, and cultural symbols. The visuals were never meant to be literal translations. Instead, they became interpretations — morphing textures, shifting light patterns, shapes that echoed the cadence of the words.
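For readers curious about the mechanics, here is a minimal sketch of the general idea, not the actual system behind Mother Tongue: it assumes the Python library librosa and maps per-frame loudness, brightness, and onset energy from a recording onto made-up visual controls such as texture scale and light hue.

```python
# Toy sketch of audio-reactive visuals: per-frame audio features steer
# abstract visual parameters. The library calls (librosa) and the control
# names (texture_scale, light_hue, pulse) are illustrative assumptions,
# not the pipeline used in Mother Tongue.
import librosa
import numpy as np

def audio_to_visual_params(audio_path: str) -> dict:
    # Load the recording (e.g. a spoken poem) at its native sample rate.
    y, sr = librosa.load(audio_path, sr=None)

    # Per-frame features that loosely track the cadence of speech:
    rms = librosa.feature.rms(y=y)[0]                            # loudness
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)[0]  # brightness
    onset = librosa.onset.onset_strength(y=y, sr=sr)             # rhythmic energy

    # Feature tracks can differ by a frame or two, so trim to a common length.
    n = min(len(rms), len(centroid), len(onset))
    rms, centroid, onset = rms[:n], centroid[:n], onset[:n]

    def normalize(x: np.ndarray) -> np.ndarray:
        return (x - x.min()) / (x.max() - x.min() + 1e-9)

    # Map features to hypothetical visual controls:
    #   louder speech  -> larger, denser textures
    #   brighter tone  -> warmer light hue
    #   onsets         -> pulses in the shapes
    return {
        "texture_scale": normalize(rms),
        "light_hue": normalize(centroid),
        "pulse": normalize(onset),
    }

if __name__ == "__main__":
    # "elder_recording.wav" is a hypothetical file name for illustration.
    params = audio_to_visual_params("elder_recording.wav")
    print({k: round(float(v.mean()), 3) for k, v in params.items()})
```

Even a sketch this small shows the spirit of the approach: the visuals respond to the shape of the speech rather than to its literal meaning.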
One Mapuche elder told me that the piece “felt like the language was breathing under water.” That was the best compliment I could have hoped for.
The First Exhibition
When the piece premiered online, I worried that viewers wouldn’t connect without understanding the languages themselves. But they did. The audio played alongside the visuals, and people responded to the emotion, the flow, the feeling — even if the words were unfamiliar.
In Vienna, at the Digital Humanities Conference, I watched a room full of strangers lean forward in their chairs, pulled in by something they couldn’t quite name. That moment confirmed it: you don’t have to understand every word to feel the heartbeat behind it.
What It Meant to Me
Mother Tongue reminded me that technology is only as meaningful as the stories it carries. AI isn’t here to replace human expression — it’s here to amplify it, to hold it up to the light so more people can see it.
And when it comes to languages on the edge of extinction, every extra pair of eyes, every extra set of ears, matters.