This artist graffitied the worst tweets he could find, right outside Twitter's office.
The man stares down at the ground, eyes wide. All around him, the pavement is littered with spray-painted, stenciled messages.
Their contents, written in a mix of English and German, are vile. Reading them makes your skin crawl.
"Let's gas some Jews together," reads one stencil.
"Gays to Auschwitz," reads another.
Your nose wrinkles just looking at them.
The man shakes his head. "I work in an advice center for Roma and Sinti so we're used to this kind of racist bullshit. But these..."
"These are all tweets," the cameraman says. "Tweets," the man repeats quietly. He seems shocked and a little sad.
For the past six months or so, German artist Shahak Shapira had been reporting and flagging inappropriate tweets like these to little avail.
Shapira used Twitter's reporting tools to flag about 300 of them, he told The Associated Press. Most of his complaints went unanswered, and whenever he checked back, the majority of the tweets were still visible.
Feeling ignored and frustrated, Shapira decided there was one sure-fire way to get heard.
Early in the morning, Shapira and a crew of workers arrived at Twitter's German headquarters in Hamburg and stenciled roughly 30 of the worst tweets right on the company's doorstep.
For a lot of people, this is already what social media feels like. Image from Shahak Shapira/YouTube.
A video of people's reactions to the display was published on YouTube on Aug. 7, 2017. The camera captures a man in a collared shirt walking by. He gazes at the tweets. "It's just disgusting," he says.
For a distressing number of people, stumbling through hateful messages on social media is a daily struggle.
For some, scrolling through a social media feed is relatively benign. The problem of harassment and hateful language can seem unimportant, because it's easy to ignore a problem when you don't see it.
But the truth is that many people do have to see these kinds of things every day. Harassment and online abuse have been a problem for years. A Pew Research Center survey found that as many as 4 in every 10 internet users have experienced harassment.
And if your job is tied to social media, as many are these days, you don't really get a choice about whether to see it.
By thrusting this language into the real world, Shapira has made it hard to ignore. If people don't want to tolerate this graffiti on the streets, maybe we shouldn't tolerate it on the internet either.
You can watch the full video below:
Twitter did not comment on the "artwork," although the AP reports that by Aug. 9, about half of the stenciled tweets had been removed online.
Reddit tried an experiment to curb hate speech. The results are fascinating.
In 2015, Reddit decided to run some of the haters out of town.
Image by Rebecca Eisenberg/Upworthy.
The "homepage of the Internet," known for its wholesale embrace of free debate, banned several of its most notorious forums, including r/coontown, a hub for white supremacist jokes and propaganda, and r/fatpeoplehate, a board on which users heaped abuse on photos of fat people.
Critics accused the site of axing the subreddits for the "wrong" reasons — demonizing unpalatable speech rather than incitement to violence. Others worried the ban would be ineffective. Wouldn't the trolls just spew their hate elsewhere on the site?
Thanks to a group of Georgia Tech researchers, we now have evidence that the ban worked.
Their paper, "You Can’t Stay Here: The Efficacy of Reddit’s 2015 Ban Examined Through Hate Speech," found that not only did banning the forums prompt a large portion of its most dedicated users to leave the site entirely, the redditors who did stay "drastically [decreased] their hate speech usage."
The researchers analyzed over 650 million submissions and comments posted to the site between January and December 2015. After defining "hate speech" by pulling memes and phrases common to the two shuttered forums, they observed an 80% drop in racist and fat-phobic speech among the users who migrated to other subreddits after the ban. In the same period, 20% to 40% of accounts that frequently posted to either r/coontown or r/fatpeoplehate became inactive or were deleted.
"Through the banning of subreddits which engaged in racism and fat-shaming, Reddit was able to reduce the prevalence of such behavior on the site," the paper's authors concluded.
The researchers have a few theories about why the ban may have worked.
Those who migrated to other subreddits, they speculate, became subject to existing community norms that discouraged them from posting hate speech as freely.
Reddit co-founder and executive chairman Alexis Ohanian. Photo by Jerod Harris/Getty Images.
They also cite Reddit's effective removal of copycat forums (r/fatpeoplehate2, r/wedislikefatpeople, etc.) before they could reach critical mass.
Creating secure online spaces is a difficult problem. This new research provides at least one possible solution.
Any attempt to moderate an open web forum, the researchers argue, will inevitably have to balance protecting free expression with the right of people to exist on the internet without fear of abuse. A June Pew Research Center survey found that 1 in 4 black Americans reported having been harassed online because of their race, compared with 3% of white Americans.
"The empirical work in this paper suggests that when narrowly applied to small, specific groups, banning deviant hate groups can work to reduce and contain the behavior," the authors wrote.
For vulnerable people who, like most, are living increasingly online lives, it's a small measure of relief.
Correction 9/13/17: This story was updated to identify Alexis Ohanian as Reddit's co-founder and executive chairman, not CEO.