The false belief that a 17-year-old who survived the Parkland massacre is actually a paid actor, or perhaps being nefariously coached by liberals to promote gun control, began on the fringe. It didn’t stay there.
The idea became content — Gateway Pundit published a piece with a photograph of teenager David Hogg, the word “EXPOSED” written across it in big red letters. On Twitter, people shared the article as “proof” of the conspiracy, or in outrage that someone could believe such a thing.
Other people created Facebook posts, YouTube videos and articles promoting the conspiracy. Some of those things, particularly on YouTube, began to trend. That’s because these social media platforms’ algorithms are really good at picking out things that people are sharing. They seem to be less adept at discerning whether those things are also harmful.
We know, in 2018, that algorithms on social media sites help to bridge the gap between the bubbles that contain the origins of these conspiracy theories and the wider online world. When a false narrative trends, more people see it. But algorithms, and the companies that create them, are only a part of the reason that false information reaches the masses.
In fact, you might be part of the reason, too.
The well-intentioned also, inadvertently, participate in the cycle of making a conspiracy theory go viral. The thing about sharing your outrage over a despicable idea is that it’s still a share.
“Things become trending in an algorithm because actual humans are interacting with these actual stories, and humans are interacting with these stories because they are trending,” said Whitney Phillips, a Mercer University professor who studies the relationship between online “trolling” culture and the mainstream.
Here’s an example: Just before the 2016 elections, #Repealthe19th trended on Twitter. The hashtag, on its face, called for the repeal of the amendment that gives women the right to vote in the United States. People were furious that such a thing was so prominently featured on Twitter, but there was a problem. As we found out by scraping thousands of tweets in real time from the trending hashtag, very few people were actually tweeting #Repealthe19th because they agreed with it. Most of the tweets — including those from celebrities — were from those who were mad that it was trending.
This phenomenon appears reliably after a mass shooting, something Phillips first observed after the 2012 massacre in an Aurora, Colorado, movie theater. A small group of people on Tumblr — six to eight people, Phillips estimates — started making fan art for the gunman, James Holmes. That small community became the subject of a BuzzFeed story, which turned the fringe behavior into a viral phenomenon, one that was then widely covered elsewhere in the news. All that attention led people to believe that Holmes fan art was a “much bigger sort of movement” than it was, Phillips argues.
The cycle wasn’t yet complete. As more and more attention was piled on the “Holmies,” 4chan started to get in on the action, creating even more fan art in the hopes of getting picked up by journalists who were looking to document it. Now, teen fans of killers are part of the script of horrible online phenomena after a mass shooting.
The other parts of the “tragedy script,” as Phillips calls it, are also probably familiar to anyone who has closely followed the world of hoaxes that emerge after a mass shooting: Create fake profiles for the missing, tweet them out and hope they go viral — either because people believe the tweets are genuine, or because they are outraged about the hoax itself. Participate in false-flag conspiracy theories, which question whether the mass shooting really happened at all. Float a bevy of possible motivations for the alleged gunman. Do anything that might go viral, do anything that might make the news.
Well-intentioned journalists may also help perpetuate these theories. Bad actors floating conspiracy theories and hoaxes actively seek out mainstream reporters’ attention. “They’re really good at seeding a story with an establishment outlet so they can bring that prize back to those far-right circles,” Phillips said.
“The journalistic response is so predictable,” Phillips added. “These people are aware of the journalistic norms after a shooting, and how to weaponize them.”
In 2018, these theories can spread from mainstream journalism to powerful people in politics and pop culture. Donald Trump Jr., one of President Donald Trump’s sons, “liked” two tweets that suggested the Parkland massacre’s survivors were paid actors, for example. The conspiracy theorists have dominated headlines for the past few days, inspired outraged Jimmy Kimmel segments and become emblematic of some of the wider problems with the internet we live with today.
Just as these theories have inspired a new round of questions for the powerful Silicon Valley companies that run the platforms where these hoaxes spread, Phillips argues, the media and the people who spread these ideas need to be aware of the roles they play, too.
“Go a little bit meta and step back and say, ‘OK, we have been doing things in a particular way, and it’s resulted in the same results after every mass shooting,'” Phillips said. We should recognize “this intertwining of cynical bad actors, sincere believers, everyday people reacting to the news, journalists doing their best, journalists not doing their best, algorithms. All of these things are working together.”