"Alexa, what are you wearing?"
"Alexa, are you horny?"
"Alexa, how much do you weigh?"
Alexa — Amazon's freestanding virtual assistant — doesn't wear or own any clothes, she can't (physically) feel horny, and she doesn't weigh a single pound. Nevertheless, some people find it normal, even funny, to pose questions like these (and trigger other Easter egg responses) from the popular voice-controlled helper. (Alexa started out as the machine-learned voice behind the brand's plain black speaker, the Amazon Echo, and has since expanded to a full range of products.)
Would we be asking a virtual assistant the same questions if the voice were male?
Amazon's Alexa, Apple’s Siri, and Microsoft’s Cortana are some of today’s most well known virtual assistants. All are female, and all elicit an image of an assistant who is not just a woman, but a woman people can boss around, flirt with, and act inappropriately towards. Compound that with portrayals in the media — like this 2015 magazine cover showing female robots sitting at typewriters — and it all starts to feel like a big step backwards rather than one towards the future.
“If we want the computers to behave differently, we have to actually pay attention to how we build them so we don't just create mirrors of what society does,” says Rada Mihalcea, a professor of computer science at the University of Michigan, who signed an open letter protesting that stereotype-enforcing magazine cover.
As the line between our technology and “the real world” blurs, how we create, portray, and treat our virtual assistants is a very real issue that needs to be addressed — now. And if we’re going to make a change, we need to start with how we build these AIs in the first place.
A World Of Fembots
Why do our most frequently used virtual assistants have female voices to begin with? The reasoning ranges from research showing that our brains respond more favorably to female voices to the contention that a precedent was set during World War II, when women's voices were used in cockpits. An Amazon Alexa spokesperson told us that when they tested Alexa, both men and women preferred a female voice. In iOS, you can actually switch Siri from her female default to a male voice (but do you know anyone who’s actually done that?).
Pop culture, per usual, has reinforced the stereotype of the sexy woman behind the screen, through movies such as 2004's I, Robot and 2013's Her.
And even though Siri and Alexa can fend off certain lines of questioning (ask Alexa if she's horny and she'll tell you, "This isn't a conversation I'm capable of having"), early marketing didn't help. In one Apple ad from 2012 — a few months after Siri launched on the iPhone 4s — Samuel L. Jackson goes through a series of commands before telling Siri, at the end of the ad, that she can take the night off. Her response is submissive: "If you say so."
"We're basically training our kids that they can bark commands at a female and she will respond," says Ben Parr, the cofounder of chatbot company Octane AI, of our voice-controlled assistants. "
Even if these commands seem harmless on the surface — Siri and Alexa are not human beings — how we talk to them subconsciously affects how we talk to other people, Parr says.
No Gender, No Problem
So, why don’t we just make our AIs genderless? That’s exactly what the creators of newer virtual assistants, such as KAI, are doing. Despite its male name, KAI, a financial assistant, is one of the only truly genderless bots in existence. It is an "it."
"We wanted to do a genderless bot, not an assistant that was a continuation of what was already out there," says Dror Oren, the co-founder and VP of Product at Kasisto, the financial tech startup that created KAI. "As a company, we had a choice here. We could have KAI respond with flirts and funny jokes, or we could choose to answer with a funny joke, but not a flirt."
KAI is funny in a geeky, know-it-all way. It won't take any bullshit, and hates when people try to waste its time. Insult it, and it will treat you like a parent might treat a petulant child: "I'm picturing white sand and a hammock. Try me again when you're ready."
But KAI has an advantage over some of its gendered AI counterparts: It doesn't have a voice. It converses with you via text only. (Google's personal assistant, which is technically genderless, still has a decidedly female voice.)
Still, KAI is being regarded as one of the first truly feminist forms of artificial intelligence. It isn't alone in its genderless distinction — of the more than 11,000 chatbots Facebook Messenger supports, many, including Assist and the Kayak bot, lack "he" or "she" attribution. But many of those skip having a personality altogether. In this respect, KAI stands out — and that's largely due to who's working behind the scenes.
"I noticed that many AIs had female names and [did things like] apologise for their presence, behave demurely and passively, and sometimes agree and apologise when insulted by users," says Jacqueline Feldman (below), the woman who is writing KAI. "AI designers have been using women as a shorthand for service and helpfulness. I'm a woman and I can be very helpful but that's not all I am."
Writing A Woman? You Need A Woman
As research has pointed out, people prefer the tonality of a female voice. That’s fine: We can still have female assistants; we just need to ensure gender bias doesn’t creep in. And to do that, you need women writing those bots.
Melinda Gates has been one of the most outspoken proponents of the need for larger numbers of women working in artificial intelligence.
"AI is already transforming healthcare, transportation, education, and the way we live our everyday lives," Gates told us. "That’s why it’s so important that we have diverse talent building it. If the people behind it all look the same and have similar experiences and perspectives, the technologies they are creating will be built to serve only a small group of people rather than our complex and diverse world. For technology and AI to benefit us all, we all need to have a role in creating it."
"For technology and AI to benefit us all, we all need to have a role in creating it."
It's well known that we have a "women in tech" problem — there aren't enough women in tech to begin with. But the shortage of women in AI, specifically, is even more acute. At a large tech conference this summer, Gates said that only 17% of computer science graduates are women. And, according to Bloomberg, only 13.7% of the attendees at one of 2015's biggest conferences on artificial intelligence were female. Researchers in the field have referred to artificial intelligence as having both a "Sea of Dudes" problem and a "White Guy Problem."
If we only have men writing the code behind female virtual assistants that hundreds of thousands of people use on a daily basis, there is no way that those assistants are going to accurately represent a female voice.
Shatter The Ceiling
Unfortunately, the AI field isn’t going to suddenly be populated with an influx of women overnight. Until then, it’s up to us to give the companies that develop AI feedback when it crosses the line into gender stereotypes and sexism.
"We do a lot of tweaking of how Alexa responds to a variety of questions, both culturally and factually, as time goes on," said the Amazon's spokesperson. "To do this, we use a combination of customer feedback, and overall research on how Alexa is received by customers." You can similarly give Apple and Microsoft feedback on their AIs.
KAI, too, is constantly changing and adapting its language to answer the inevitable non-banking queries, like those about the election.
This year, we didn't get our first woman in the White House. But as we continue to make strides, it’s time we ensure our virtual assistants evolve beyond the role of the tech-savvy, 1960s secretary. We need to do this by putting more women in the driver’s seat, giving feedback ourselves, or removing gender from the AI equation altogether. If you’re barking commands at your phone, it doesn’t need to be a woman who leaps to do your bidding. But if it is, she should be a 21st-century woman.