The Guardian: Bumble, Private Detector

A web developer asked men to send her pictures of their genitals in order to build a filter that ‘recognises’ a penis and blurs it. Which raises the question: why haven’t tech companies taken this on yet?

Earlier this month, after waking up to find an unwelcome dick pic in her Twitter account’s DMs, web developer Kelsey Bressler, 28, co-created an AI filter she claims is capable of preventing over 95% of sexually explicit images from reaching her inbox.

To test the filter, Bressler solicited pictures of male genitalia en masse, receiving hundreds at the trial account @ShowYoDiq, “for science”.

(Bressler is unsure of the exact final number of volunteer pics because she is still processing them manually, and because many respondents jokingly messaged her pictures of Donald Trump, which she must now weed out.)

At this point, you may be wondering why men send women unwanted explicit pictures of their genitals. Motivations vary, but according to researchers at Kwantlen Polytechnic University in Canada, who released the first empirical study on the subject this summer, the behavior is linked to heightened levels of narcissism and sexism. “Men may find this exertion of power over women arousing itself, or they may find the shocked, hurt and angry reactions to be humorous or satisfying,” the researchers wrote.

According to a 2017 Pew Research Center study on online harassment, 53% of women ages 18 to 29 report that they have been sent an unsolicited lewd image online. While there’s nothing inherently wrong with sexy pictures – and senders may not always intend to cause harm – the fact that these images are shared without the recipient’s consent qualifies as sexual harassment. “It’s a violation … they’re just forcing it on you,” says Bressler of how it feels to receive unwanted pictures.

Bressler’s AI has already piqued the interest of at least one major social networking site, but the speed and ease with which her filter was created raise the question of why more tech companies don’t already use effective anti-harassment software, and why the measures they have taken to protect their users often seem half-baked.

Twitter, for instance, features a setting that, when activated, blocks images other users have flagged as “sensitive content” – but that won’t stop a dick pic from being sent directly to your inbox. (Twitter did not reply to a request for comment for this article.) And while you can block and report users who bother you on almost any platform, you can’t unsee the harassing images in the first place.

Part of the reason platforms are reluctant to crack down on the nonconsensual sharing of nudes is that, unlike flashing someone on the subway, the digital practice is not yet widely illegal, nor are companies required to protect their users. Even in cases of extreme harassment, such as that suffered by Grindr user Matthew Herrick, companies are able to defend their refusal to protect victims by citing Section 230 of the Communications Decency Act (CDA), which absolves them of responsibility for the content their users share because they are not technically publishers. The lawyer Carrie Goldberg recently told the Guardian that the CDA “is the enabler of every asshole, troll, psycho and perv on the internet”.

Thanks in part to Goldberg’s advocacy, 46 states now enforce laws against the distribution of nonconsensual pornography, or “revenge porn”, but the war on dick pics has just begun.

Last year, councilman Joseph Borelli, a Republican from Staten Island, co-sponsored a bill (yet to pass) that would make cyberflashing – sharing nudes via Apple’s AirDrop feature, which allows people to anonymously send content to other devices within a 10-meter radius – punishable by a fine of up to $1,000 or up to a year in jail.

“In the old days, you had to have a long trench coat and good running shoes,” Borelli told the New York Times. “Technology has made it significantly easier to be a creep.”

Cyberflashing has been illegal in Scotland since 2010, and Singapore criminalized the practice this May. In the state of South Carolina, anonymously sending an “obscene, profane, indecent, vulgar, suggestive, or immoral” file without the recipient’s consent can be punished by up to three years of imprisonment.

As of 1 September, Texas also has an anti-lewd imagery law, stewarded by the Republican state representative Morgan Meyer and the “feminist dating app” Bumble, which is headquartered in Austin.

“Safety and holding our users accountable online are two of the most important things that go into our day-to-day thinking about how we’re going to end misogyny,” says Bumble’s chief of staff, Caroline Ellis Roche, of the company’s interest in pursuing legislation.

“Having this bill where it is now illegal to send an unsolicited lewd photo in the state of Texas is like a deterrent, like adding stop signs to the internet,” says Roche of the law, which classifies sending an unwanted dick pic as a class C misdemeanor punishable by a fine of up to $500. “We’re working to take this to the federal level.”

Critics have pointed out that the law may be difficult to enforce because of its nonspecific terminology, and that sending explicit images may be protected as free speech under the First Amendment. Because courts closely guard free speech, lewd-imagery laws could also be hindered by “intent to harm” clauses – meaning perpetrators could claim innocence by arguing they did not mean to harass the recipient of their explicit imagery, and instead thought a dick pic would brighten their day.

A safer future for women online may require both legislation and meaningful efforts from tech companies to reduce cyber harassment and send the message that sharing nudes without consent is unacceptable. To that end, Bumble this month launched Private Detector, an image filter similar to Bressler’s, which blurs images it deems potentially sensitive so users can delete them without viewing their contents. This way, hopefully, the only people looking at nudes online will be those who want to be.
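For the technically curious, a filter of this kind boils down to a classify-then-blur pipeline: score each incoming image with a model, and obscure anything above a cutoff before the recipient sees it. The sketch below is a hypothetical illustration only, not Bumble’s or Bressler’s actual code; nsfw_score stands in for a trained classifier, and the threshold and blur radius are invented for the example.

```python
# Hypothetical classify-then-blur sketch – not Bumble's or Bressler's code.
from PIL import Image, ImageFilter  # pip install Pillow

BLUR_THRESHOLD = 0.5  # assumed cutoff; a real product would tune this carefully

def nsfw_score(image: Image.Image) -> float:
    """Stand-in for a trained classifier returning the probability
    that an image is sexually explicit."""
    raise NotImplementedError("plug in a real model here")

def screen_image(path: str) -> Image.Image:
    """Blur the image if the classifier flags it; otherwise return it as is."""
    image = Image.open(path)
    if nsfw_score(image) >= BLUR_THRESHOLD:
        # A heavy Gaussian blur hides the content while still letting the
        # recipient see that something arrived – and delete it unseen.
        return image.filter(ImageFilter.GaussianBlur(radius=40))
    return image
```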
