A tiny fraction of Twitter users spread the vast majority of fake news in 2016, with conservatives and older people the most likely to share it, a new study finds.

Scientists examined more than 16,000 U.S. Twitter accounts and found that 16 of them — less than one-tenth of 1 percent — tweeted out nearly 80 percent of the misinformation masquerading as news, according to a study published Thursday in the journal Science. About 99 percent of the Twitter users spread virtually no fake information in the most heated part of the election year, said study co-author David Lazer, a professor of political science and computer science at Northeastern University.

Spreading fake information “is taking place in a very seamy but small corner of Twitter,” Lazer said.

Lazer said misinformation “super sharers” flooded Twitter, posting an average of 308 pieces of fakery each between Aug. 1 and Dec. 6, 2016.

And it’s not just that few people are spreading it — few people are reading it, Lazer said.

“The vast majority of people are exposed to very little fake news despite the fact that there’s a concerted effort to push it into the system,” Lazer said.

The researchers identified the 16,442 accounts they analyzed by starting with a random pool of voter records, matching names to Twitter users and then screening out accounts that did not appear to be controlled by real people.

Their conclusions are similar to those of a study released earlier this month that looked at the spread of false information on Facebook. It also found that few people shared fakery, but those who did were more likely to be over 65 and conservative.

Boost to credibility

That makes this study more believable, because two groups of researchers using different social media platforms, measuring political affiliation differently and working with different panels of users came to the same conclusion, said Yochai Benkler, co-director of Harvard Law School’s center on the internet and society. He wasn’t part of either study but praised them, saying they should reduce misguided postelection panic about how “out-of-control technological processes had rendered us as a society incapable of telling truth from fiction.”

Experts say a recent showdown between Kentucky Catholic school students and a Native American elder at the Lincoln Memorial seemed to be stoked by a single, now-closed Twitter account. Lazer said the account fit some characteristics of the super sharers in his study, but it leaned more to the left than the pattern his team found.

Unlike the researchers behind the earlier Facebook study, Lazer’s team did not interview users; it ranked people’s politics based on what they read and shared on Twitter.

The researchers identified false information at the level of websites rather than individual stories, using lists of domains compiled by other academics and BuzzFeed. While five outside experts praised the study, Kathleen Hall Jamieson, director of the Annenberg Public Policy Center at the University of Pennsylvania, found several problems, especially with how the team determined which sites spread fake information.

Lazer’s team found that among people they categorized as left-leaning or centrist, fewer than 5 percent shared any fake information. Among those they determined were right-leaning, 11 percent of accounts shared misinformation masquerading as news. For those on the extreme right, it was 21 percent.

This study shows “most of us aren’t too bad at circulating information, but some of us are determined propagandists who are trying to manipulate the public sphere,” said Texas A&M University’s Jennifer Mercieca, a historian of political rhetoric who wasn’t part of the study. 
