
How Fake News Spread Soon After YouTube Shooting


Within minutes of the shooting at YouTube offices in California, social media was awash with conspiracy theories and images of the supposed “shooter” wearing a Muslim headscarf. Some Facebook videos were quick to claim that it was a “false flag” attack, carried out to discredit the powerful US gun lobby in the wake of the Parkland high school massacre in Florida.
With wildly exaggerated accounts of the death toll circulating, several pictures of the purported attacker and some of the “victims” posted to Twitter Tuesday turned out to be of well-known YouTubers. Other widely-shared posts speculated that the attacker had been provoked by YouTube censoring political content, and one Twitter user posted a picture of the suspect as Hillary Clinton in a headscarf.
Hoaxers, too, took advantage of the situation, posting several pictures of the US comic Sam Hyde, who is known for internet pranks. None of this came as a surprise to researchers at the Massachusetts Institute of Technology, whose report last month found that false news spreads far faster on Twitter than real news — and by a substantial margin. "We found that falsehood diffuses significantly farther, faster, deeper, and more broadly than the truth, in all categories of information," said Sinan Aral, a professor at the MIT Sloan School of Management. They found that false political news reached more people faster and went deeper into their social networks than any other category of false information.
While Russian troll factories have taken much of the blame for attempting to poison the political discourse in election campaigns across the US and Europe, the team from the MIT Media Lab found that fake news spreads not because of bots but because of people retweeting inaccurate reports. The researchers found that "false news stories are 70 percent more likely to be retweeted than true ones. It also takes true stories about six times as long to reach 1,500 people as it does for false stories."
While real news stories are rarely retweeted by more than a thousand people, the most popular fake news items are regularly shared by up to 100,000. Emma Gonzalez, one of the Parkland students who has become a leader of the #NeverAgain movement pushing for tougher gun control, has become a particular target for misinformation attacks in recent weeks. A doctored picture of her ripping up the US constitution trended last week, exposing her to vicious online vitriol. She had actually been ripping up a gun target in a photo shoot for Teen Vogue magazine.
Another fake meme went viral showing Gonzalez allegedly attacking a gun supporter's truck, when it was in fact an image of the then shaven-headed pop star Britney Spears in an infamous meltdown from 2007. Rudy Reichstadt, of the Conspiracy Watch website, said disinformation feeds on the "shock and stupor" that traumatic events create.
“We now have conspiracy theory entrepreneurs who react instantly to these events and rewrite unfolding narratives to fit their conspiratorial alternative storytelling.” He said US shock jock and Infowars founder Alex Jones, a prominent pro-gun activist, had set the template for generating fake news to fit a particular agenda. He plays up “conspiracy theories every time there is a new shooting,” Reichstadt told AFP. “He is a prisoner of his own theories and is constantly trying to move the story on (with new elements) to keep the conspiracy alive.”
The France-based researcher said there was now a whole ecosystem of fake news manufacturers, ranging from those who "use clickbait sensationalism to increase their advertising revenue to disinformation professionals and weekend conspiracy theorists who sound off on YouTube." The MIT study, which was inspired by the online rumours that circulated after the Boston marathon attack in 2013, focused on what it called "rumour cascades" — unbroken chains of retweets that follow a Twitter user's false claim. Aral said they concluded that people are more likely to share fake news because "false news is more novel, and people are more likely to share novel information. Those who do are seen as being in the know."
