upworthy

OK, I know you don't want to, but try to remember back to the fall of 2016.

Remember when reality TV star Donald Trump was running for president? How about the moment you realized he might actually be electable?

In my social circle, at least, that realization was met with disbelief — as well as some legitimate panic. And as more of my friends started freaking out, I noticed them sharing the same meme over and over on social media.

In it, Trump on TV, in all his '80s businessman "glory," appears next to this quote:

"If I were to run, I'd run as a Republican. They're the dumbest group of voters in the country. They love anything on Fox News. I could lie and they'd still eat it up. I bet my numbers would be terrific."

Image via Rude & Rotten Republicans.

There are various versions of this meme, but I must have seen it at least a dozen times across Facebook and Twitter.

I assume we all know by now that it's a fake quote. But it was just as fake several years ago as it is now. My friends who shared it are smart people. Why would they share something without fact-checking it first?

After all, my friends aren't like General Flynn spreading that Pizzagate conspiracy theory about a child sex ring being run by the Clintons out of the nonexistent basement of a pizza parlor. (This was before he was hired as national security advisor, by the way.)

Image via PolitiFact.

My friends also aren't Sandy Hook deniers clinging to the repulsive idea that the kids, parents, teachers, administrators, and community members in Newtown, Connecticut, were "crisis actors" who faked a mass shooting so the government could take away everyone's guns.

An Oxford study found that conservatives are more likely than liberals to spread misinformation, but no one is immune.

So, how does this stuff get shared so widely?

We live in a world where fake news spreads like wildfire, while the truth just ... smolders.

Last week, MIT released a study finding that fake news travels farther, faster, and deeper than true stories do.

The study authors looked at approximately 126,000 tweets from 2006 to 2017 that were shared more than 4.5 million times by about 3 million people. They determined whether stories were true or false based on agreement between six different independent fact-checking organizations, and also looked at whether stories were being shared by automated bots or by humans.

What did they find? False information was 70% more likely to be retweeted than true stories. Ouch.

The authors found that it took the truth about six times as long as falsehood to reach 1,500 people, and that fake stories tended to reach far more people. Accurate stories rarely made their way to more than 1,000 people, whereas the top 1% of false stories regularly spread to between 1,000 and 100,000.

The study also found that automated accounts actually share false and true stories at the same rate, which means that humans — not bots — are responsible for the proliferation of fake news.

Ugh. Let's do better, humans.

Apparently, the truth is simply too boring to share.

One way the study authors explain these results is that people like novelty. Real news stories are picked up by most major news outlets, and if they're doing journalism right, those stories aren't sensationalized. The truth is everywhere, which removes the intrigue; false stories are alluring because they're new, exciting, and unique.

Fake news also frequently has a strong political or ideological bent. People are drawn to stories that confirm their biases and evoke emotional reactions, which explains both the meme with the fake Trump quote and the obsession with Pizzagate.

It's pretty easy to cover the bases of novelty, bias, and emotion when you make stuff up out of thin air.

But we don't have to fall prey to sensationalism or our personal biases.

It may seem impossible to make people realize they're spreading fake news, but I refuse to give up hope that most people can discern the difference between fact and fiction.

I think it boils down to all of us calming down a bit and embracing — and sharing — slower stories.

When the news cycle of the 2016 election reached peak frenzy, I did a little experiment: I sought out the most reliable, unbiased, factual news sources I could find. No 24-hour cable news channels. No outlets with a clear agenda. I scoured Media Bias/Fact Check to find sources with the most reliable reporting and the least amount of bias.

I also narrowed my television news watching to C-SPAN and PBS NewsHour. And you know what? It was surprisingly unexciting. Even with all of the upheaval going on in our politics, hearing accurate reporting largely unfiltered and without commentary was, frankly, fairly boring.

But boring news is not a bad thing.

It's a wonder how someone can complain so much about fake news while also actively promoting it. Photo from the Daily Show's Trump Twitter Library via Scott Olson/Getty.

We can all do our part to uphold the truth by making use of fact-checking websites like PolitiFact, FactCheck.org, and Snopes. Every meme or link from a source we're unsure of should be run through these checks. There are people out there doing the digging work, so we should take advantage of that.

But most importantly, we need to check our own reactions to stories. According to the MIT study, people's emotional responses to real stories and fake stories differ. The authors noted that "false stories inspired fear, disgust, and surprise in replies" while "true stories inspired anticipation, sadness, joy, and trust." Taking note of how we feel when we see a story can give us a clue as to its reliability.

Check memes and stories. Check your reactions to them. Check your biases.

We may not be able to stop the spread of fake news, but we can definitely keep from spreading it ourselves.