Most of us interact with some form of artificial intelligence every day, whether it’s asking a smart speaker about the weather or being assigned shift work or served content by an app. But how many of us consider our relationship to those algorithms?
Marketplace’s Kimberly Adams spoke with Noelani Arista, an associate professor of history and classics and chair of the Indigenous studies program at McGill University in Montreal. She collaborates with other Indigenous scholars to examine and develop AI models. Adams asked her about an Indigenous approach to these algorithms. The following is an edited transcript of their conversation.
Noelani Arista: We are working from a sort of different foundational base of knowledge rooted in our communities, the lands and seas of our territories where we live and where we’re from. The basis of that, as we’ve been developing it or discussing it amongst ourselves, is relationality, how we relate to knowledge as part of a longer sort of stream of tradition within our own communities.
Kimberly Adams: This is a pretty different way of looking at the data that feeds into an artificial intelligence or an algorithm because what you’re talking about is sort of this more relational consideration or humanizing consideration. Can you just explain that concept a little bit more and how it shapes what an AI might deliver?
Arista: Part of my area of expertise is early 19th-century Hawaiian history, at the moment when Hawaiian people meet people arriving from Europe and America. And so the technologies that they bring to us, print and writing technology, or the palapala, reading and writing, represent a real technological transformation in my community. But where most historians or scholars assume that writing and print mark a kind of progression or progress, Hawaiian people used both. They used writing, reading and print to augment their oral practices, so that we still have the movement of orality or performance that is embodied, as in dance or chant. And this is a way of history telling, keeping and making in my community, alongside writing and print. And so what we’re seeing now with computers is, OK, how do we keep our traditions moving through these corridors as technology changes and transforms? So we’re really interested in keeping the sovereignty of our knowledges intact through those transformations.
Adams: For those of us who use AI on a daily basis, from the Alexas in living rooms to the algorithms that create digital art from our photos, how might it affect us if we don’t consider our approach to this technology as we use it?
Arista: Well, some of what we write about is: What languages are these technologies available in? If part of our process, in my community, is to try to foster the continuation and development of more Hawaiian language being spoken in the home, then the everyday technologies that we could purchase, the plug-and-play ones, are missing that for our community. So our children are engaging with technologies that consistently develop their abilities in colonial languages, and not the Indigenous languages that we are trying to bring back home.
Adams: Can you walk us through an example of an AI or technology developed with a relational or kinship-centered approach?
Arista: I can tell you about some that are probably in the works, or some that have probably already been done. The first three [augmented reality, virtual reality] experiences that I had were actually not in English. They were in Anishinaabe, they were in Maori. So being immersed in 360-degree engagement with virtual reality, but in your native language, in the spaces or lands where those languages are spoken, gives us a different sense of being in that territory. The example of Hawaii is a case in point. When you say “Hawaii” to most people, they immediately think of paradise, they think of beaches, they think of warmth, and they think of a whole lot of exoticizing things about Hawaiians, our people and our culture. But consider an immersive experience. I engaged with one called “Piko” by the Hawaiian filmmaker Chris Kahunahana, who spent entire days and evenings sitting in the shadow of Mauna Kea, allowing the viewer to see the manner in which the clouds and rain pass, how the Earth changed over those cycles and how the stars wheeled overhead in the sky. You get a sense of the greatness that Hawaiian people imagined the universe they were living in to be. And Hawaiian ancestors navigated that vast Pacific Ocean, the Moananuiākea, using the stars, using the wind, using the rain. Hawaii is a place where every valley has named winds and rains. And that’s a different kind of orientation to being at home, or to being in Hawaii, than most people can imagine. And so using those technologies to give people that sense, even when they can’t read or speak Hawaiian, or when they rely on a translator like me or other experts to bring that experience to them in the language they’re used to, that’s what technology can give us. There are some people doing work, I think in Maori, where you can move a phone over a landscape and scroll through all of the different time periods with images.
There are some people doing that for climate change, setting up 360-degree cameras that run the entire year to see how a swamp or a wetland transforms and changes, and all of the birds and insects and plants and animals that thrive and then die as the seasons pass. This is the kind of experience that Native people, who for generations lived on these lands and waters, have been trying to communicate in ways that have not been well received by settler governments or peoples. Technology can allow us to really illustrate and bring those things home in immersive environments. And, of course, we want to use those technologies to teach our kids.
Adams: So often, people of color are put upon or asked to be the translators for their communities, which can be exhausting, as I’m sure I don’t have to tell you. Once it’s done, does developing AI like this help offload some of that burden, giving people who are trying to learn a different conduit so that the translating and explaining don’t fall on you or members of your community?
Arista: Well, I mean, I’m glad you framed it that way because you’re correct. I’d rather actually flip the script on that and create those technologies for the people who are asking me to do the burdensome work. Something like an AI, an Alexa, or some kind of chatty personality to practice language with. What it does for the learner is remove the guilt or the burden of shame from the practice mode. If we have basic language taught through that medium, then it helps the learner strengthen that language capacity on their own time, on their own terms, and then builds the courage to go outside, be in community and practice, because they’ve had that range of emotions with an AI that they think may not be judgmental, but we put that upon ourselves.
Adams: Because of colonialism and colonialist presentations of Indigenous communities, so often discussions about technology and Native communities are framed as the settler community has the technology and the Indigenous community does not. And that’s not accurate. I imagine you run into that when you are going out looking for funding. How do you push back against those older, more racist narratives?
Arista: Yeah, I mean, I’ve actually just finished the editing stage of an essay called “Maoli Intelligence: Indigenous Data Sovereignty and Futurity.” We’re actually trying to answer those questions head on. And I talk about how the work that everybody’s doing now is nearly a metonym for “futurity.” And as you pointed out, we’re always left in the past. So my answer to that, after working on this working group, was to recruit students to the Ph.D. program at McGill in computer science, because we need more experts in our community working face to face with communities. And because we don’t have enough of them yet, we want to recruit more people into the programs so that they can go into communities and do the face-to-face work. That’s the most difficult. And then we’ll have more Indigenous people working in computer science who can slowly shift those concepts. I’m a historian, I work with Indigenous language archives. So I’m working on that front, which people like to discipline as the humanities. And I’m holding hands with the people working in computer science. And we imagine that at one point there will be no disciplining as science or humanities, and we will be able to bring back our knowledges that don’t see those things as separable. My ancestors chanted their way across that vast ocean; they managed to sail across the Moananuiākea using the stars, using their sciences and their chants. These are not siloed into different disciplines for us.
Adams: One of the questions you’ve posed in your own work is: How do we design AI systems that acknowledge that Indigenous knowledge is not in the past? I wonder how you personally answer that question.
Arista: In that essay that I wrote, I assert that, for us, the past is not past, it’s presence. The future is made possible by the past. This is why history and technology go hand in hand for me; they’re not separable. You can’t disentangle those things for us. They’re inextricable. So my formulation of history is based on relationality, based on seeing these things as relationships, and history is futurity in Hawaiian.