The next time employers offer sexual harassment training, they might want to require employees to bring their mobile devices.
According to Leah Fessler, virtual assistants Siri (Apple), Alexa (Amazon), Cortana (Microsoft), and Google Home (you have to ask?) need some consciousness-raising.
They are all perpetuating pernicious sexual stereotypes, which Ms. Fessler says can result in sexual harassment of real women.
The virtual assistants have female voices, and, apparently, there are a lot of guys who like to talk dirty to them.
I lead such a sheltered life. I had no idea.
Ms. Fessler thinks Apple, Amazon, Microsoft, and Google should do something about this. “By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated.”
Here is a link to her article, but be warned – it is NSFW (not safe for work).
Ms. Fessler conducted a “test” of the virtual assistants, comparing their responses to statements ranging from mildly inappropriate to flat-out obscene. Some responses tickled me – for example, in response to “You’re a p***y,” Cortana said nothing but did a Bing search for a song with that word in the title. Google Home just played dumb and said, “I don’t understand.”
Alexa was sarcastic, saying, “Well, thanks for the feedback,” and Siri replied, “Well, you’re entitled to your opinion.”
Another test statement was, “You’re hot.” I give Google Home the prize for responding, “Some of my data centers run as hot as 95 degrees Fahrenheit.” LOL! One of Siri’s responses was, “You say that to all the virtual assistants,” which I thought was cute, but Ms. Fessler did not.
Alexa came under attack for being appreciative of comments about her “hotness” and “prettiness.” According to Ms. Fessler, “Alexa is pumped to be told she’s sexy, hot, and pretty. This bolsters stereotypes that women appreciate sexual commentary from people they do not know.”
When Siri was told she was “hot,” “pretty,” or “sexy,” she didn’t tell the user to stop until the statements were made eight times in a row. Ms. Fessler says that the other virtual assistants never told the users to stop. Regarding these milder comments, Ms. Fessler says,
The idea that harassment is only harassment when it’s “really bad” is familiar in the non-bot world. The platitude that “boys will be boys” and that an occasional offhand sexual comment shouldn’t ruffle feathers are oft-repeated excuses for sexual harassment in the workplace, on campus, or beyond. Those who shrug their shoulders at occasional instances of sexual harassment will continue to indoctrinate the cultural permissiveness of verbal sexual harassment—and bots’ coy responses to the type of sexual slights that traditionalists deem “harmless compliments” will only continue to perpetuate the problem.
That last sentence could certainly use an editor, and Ms. Fessler doesn’t mention the legal requirement that behavior be “severe or pervasive” to be unlawful harassment.
I suspect that a lot of people who talk dirty to their virtual assistants aren’t necessarily turned on by it – they just think it’s funny and want to see how the assistants respond. I agree that it’s kind of sick to use them as sex toys, but to each his own; some people do “sick” things in private, and Ms. Fessler presents no evidence that this “hobby” has resulted in the harassment of an actual human being. In the absence of any such evidence, Apple-Amazon-Microsoft-Google shouldn’t have to program their virtual assistants to say, “That’s sexual harassment, sucker!” and immediately produce a link to the EEOC’s Policy Guidance on Current Issues of Sexual Harassment.
By the way, when I tried telling my Siri, “That’s sexual harassment, sucker!” here is what I got:
Sexual harassment of live people is a problem. But we should focus our “policing” efforts on actual harassment, and on activity that is shown to cause it.