Summon her with a tap or a word. Yell at her if you want. First, she was the perky female assistant in your phone. Then, a woman’s voice in a little box on the table checking the weather. Now, she’s growing a virtual body that looks like a retro fantasy of female servitude. You don’t have to pay her. She doesn’t have any rights. In fact, she’s sub-human.
In the race to bring artificial intelligence into every corner of our lives, Big Tech is wiring the future with cultural signals that undermine the hard-won rights of women to be treated as humans and paid as equals. Female digital assistants hearken back to the bad old days when females did the work — and everyone else got the benefits.
It’s time to stop them.
When Apple’s Siri and Amazon’s Alexa hit the market, plenty of commenters pointed out the obvious — that they reinforce sexist attitudes. But that didn’t stop Bank of America from recently launching “Erica” to pay your bills, complete with ads emphasizing her “24/7” accessibility and subservience in language reminiscent of a cam girl: “Hi, I’m Erica,” she coos. “See what I can do for you.”
Sexism is extremely costly to women — not only psychologically, but economically. Social scientists have shown that women who grow up in places where sexist beliefs are more prevalent tend to earn less in their working lives. Women who are sexually harassed and objectified suffer lost wages, career disruptions, narrowed opportunities and long-term financial consequences.
Generations of feminists have fought to get to a place where a woman in a business meeting isn’t automatically expected to fetch coffee. The #MeToo movement is driving home the point that a female employee is not a default sex object. But rather than reflect these advances, Silicon Valley is dragging us backwards.
In the movie “Her,” a man falls in love with his computer’s operating system, but grows frustrated because “Samantha,” voiced by Scarlett Johansson, doesn’t have a body. The makers of digital assistants know that customers — particularly male ones — may want to see what “her” body looks like. In programming stereotypical female bodies to go with the woman’s voice, they strengthen the links between femininity, subservience and sex.
Readers of the New York Times have recently seen ads touting Amazon’s Alexa that feature full pages of large, sensual women’s lips, poised to answer anything you ask. Software firm Autodesk has gone even further, rolling out a chatbot — a specially programmed robot that mimics human conversation — with a 3D avatar named Ava that looks like — what else? — a sexy young woman. She’s a woman of color, too, signaling that young women, particularly those with brown skin, are always available to fulfill your needs.
Ava is cleverly designed to mirror the customer’s emotions with her facial expressions. The movie “Her” hints at how fast a she-bot that mimics emotions can become a sex object: Within days of purchasing her, Theodore (Joaquin Phoenix) masturbates as he listens to Samantha’s voice. Imagine if she’d had a body!
The “chatbot is off to a hot start,” proclaims American Banker in an article on Erica and the rise of (mainly feminized) programs that mimic humans. No kidding: Female bots have long been standard for hook-up websites like Ashley Madison, the site for extra-marital affairs that got caught using them to lure male customers.
Maybe the she-bots are partly a reaction to a century of rapid changes in which women gained professional opportunities and more control over their bodies than they ever had in human history. The subconscious minds of a million Silicon Valley programmers — mostly male — seem to have devised a counteroffensive: If you can’t have a real woman who will cater to your every whim, buy a virtual one. It’s man-made culture on steroids.
Feminists like Simone de Beauvoir have long noted the tendency in patriarchal societies to link femininity to certain tasks and behaviors in order to lock in the secondary status of women. When these assumptions are written into social practices and consistently repeated, they seem natural: Women don’t act, they are acted upon. Men are the ones in charge, women are there to support. Females are available for whatever you desire.
What will it mean when a female executive gives a presentation and a sexy bot pops up on the screen? Or a woman asking for a loan at the bank is told to pull up “Erica,” the ever-ready assistant? What happens in the minds of kids who grow up yelling at the female avatars in their devices, or building sexual fantasies around them?
Companies that excuse programmed biases as simply a matter of customer preference do not get a pass when they seek to profit from harmful stereotypes.
So often, the solutions to problems created by Big Tech are complex and elusive. This one isn’t: Boycott the female bot brigade and tell companies to come up with artificial intelligence that isn’t Stone-Age stupid.
Lynn Stuart Parramore is a cultural historian who studies the intersection between culture, psychology and economics. Her work has appeared at Reuters, Lapham’s Quarterly, Salon, Quartz, VICE, Huffington Post and others. She is the author of “Reading the Sphinx: Ancient Egypt in 19th Century Literary Culture.”
This article was first published on NBC News' Think.