
Are smart speakers enforcing gender inequity?

David Brancaccio, Sasa Woodruff, and Rose Conlon Aug 1, 2019
Leon Neal/Getty Images


By 2020, we could be having more conversations with digital voice assistants than with our spouses, according to a U.N. report released in May.

Many of these assistants have female voices — think Alexa, Siri and Cortana. Some experts warn that as these kinds of female voice assistants become more commonplace, they teach us that it’s OK to issue orders to women.

Marketplace recently discussed the issue with Alison Greenberg, co-founder and CEO of conversation design studio aflow.

Many pieces of consumer tech skew male [in usage]. What do we know about smart speakers?

We don’t really know the user base of smart speakers like Alexa, Google Home and the new Amazon Echo Show devices that have screens. Who is using these? Is it predominantly men or predominantly women? There’s conflicting evidence.

But what we know for sure is that these smart speakers are growing in adoption, there are more of them in homes than ever before, and just about all of us who have smartphones have a voice assistant in our pocket.

These are the interfaces through which brands and organizations will continuously communicate with us. And that communication is going to be bi-directional; no longer are brands and organizations talking at us. We can talk with them.

What does the U.N. say about female voice assistants?

A U.N. report released in May states that gender bias, which we all know is alive and well, is unfortunately being replicated in 2019 from the physical world into the digital world by A.I. speakers and assistants that are predominantly female, like Alexa, Siri and Cortana, encouraging us to issue orders to women.

So children are being raised issuing orders to Alexa. And this is obviously dangerous because we are the sum of our behaviors.

She is there to do what you tell her. Sometimes you ask nicely, but you don’t have to. And so there are concerns from communities interested in eliminating gender bias, closing things like the wage gap and addressing all kinds of equity issues. These voice assistants are in some ways training us, and training our children, that it’s OK to treat women as subservient.

The U.N. was issuing a call to us to use conscious practices in designing systems for A.I., and in making sure that our principles of conversation design are inclusive.

For us at aflow, and for me as a female CEO, this is critical. We needed the U.N. to back what we’re already doing, and to say out loud that A.I., as an industry, is around 70% to 80% male.

How do the ways we speak to Alexa or Siri percolate out into the real world?

It shifts our sense of what’s appropriate and what’s not. It’s not appropriate to issue orders to a woman you’ve never met, but that’s what we’re doing with Alexa.

In your work at aflow, you design conversations like the ones we have with voice assistants. How do you incorporate this thinking about gender bias into the way you design a conversation?

It’s easy to incorporate this thinking when you’re a woman. We just aren’t interacting with very many A.I. systems that were built by women. Simply by being a woman, I’m integrating a perspective that isn’t represented proportionately.

But my co-founder, Seth Miller, our head of product, is a man and represents the male perspective. By being 50/50, along with other members of our team who are of different genders and different races, we’re able to be inclusive by being representative. That’s one way to do it.

The second way is our philosophy of conversation design. You’ll see that a lot of conversations with something like Alexa are built in a linear way: they’re mapped out as a decision tree. It’s an easier way to build a conversation; however, it’s not an accurate way to build one.
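
To make the contrast concrete, here is a minimal, hypothetical sketch of that linear style in Python. The node names, prompts and keyword matching are illustrative assumptions, not aflow’s actual tooling; the point is that every turn must follow a branch the designer hard-coded in advance.

```python
# A minimal, hypothetical sketch of linear, decision-tree conversation
# design: every user turn must match one of the branches hard-coded at the
# current node, so the dialogue can only follow paths mapped out in advance.

DECISION_TREE = {
    "start": {
        "prompt": "Do you want the weather or the news?",
        "branches": {"weather": "weather", "news": "news"},
    },
    "weather": {
        "prompt": "Sunny, 72 degrees. Anything else? (yes/no)",
        "branches": {"yes": "start", "no": "end"},
    },
    "news": {
        "prompt": "Here are today's headlines. Anything else? (yes/no)",
        "branches": {"yes": "start", "no": "end"},
    },
    "end": {"prompt": "Goodbye!", "branches": {}},
}

def run_tree(node: str = "start") -> None:
    while True:
        state = DECISION_TREE[node]
        print(state["prompt"])
        if not state["branches"]:
            return
        reply = input("> ").strip().lower()
        # Anything off-script has nowhere to go: the node simply repeats,
        # which is the rigidity the linear model is criticized for above.
        node = state["branches"].get(reply, node)

if __name__ == "__main__":
    run_tree()
```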

At aflow, we call it “circular conversation design,” because we’re designing for the kinds of conversations where we could talk about gender or about weather or about the industry at large. If you think about a traffic circle, you could get on and get off anywhere you like. That’s how a powerful, robust and truly helpful conversation looks.
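
By contrast, here is a hypothetical sketch of the circular idea: every utterance is matched against all topic handlers rather than against a fixed branch, so the user can enter or leave any topic at any turn, like the traffic circle above. The handler names and keyword matching are illustrative assumptions, not aflow’s method; a production system would use intent classification instead.

```python
# A hypothetical sketch of "circular" conversation design: each user turn
# is matched against every topic handler, so the user can enter or exit
# any topic at any point, like a traffic circle.

from typing import Callable, Dict

# Illustrative handlers; a real system would classify intent rather than
# scan for keywords.
HANDLERS: Dict[str, Callable[[str], str]] = {
    "weather": lambda _: "Sunny, 72 degrees.",
    "news": lambda _: "Here are today's headlines.",
    "gender": lambda _: "Happy to talk about bias in voice design.",
}

def route(utterance: str) -> str:
    """Send the utterance to whichever topic handler matches, in any order."""
    for topic, handler in HANDLERS.items():
        if topic in utterance.lower():
            return handler(utterance)
    return "I can talk about weather, news or gender. What would you like?"

if __name__ == "__main__":
    while True:
        text = input("> ").strip()
        if text.lower() in {"quit", "exit"}:
            break
        print(route(text))
```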
