The past several weeks have been difficult for the social media behemoth Facebook, with a series of whistleblower revelations demonstrating that the company knew its signature platform was exacerbating all manner of social ills around the globe, from human trafficking to sectarian violence.
The tide shows no sign of receding. New revelations this week have demonstrated that the company’s supposed commitment to freedom of expression takes a back seat to its bottom line when repressive governments, such as Vietnam’s, demand that dissent be silenced. The documents also showed that Facebook knew its algorithms were steering users toward extreme content, such as QAnon conspiracy theories and phony anti-vaccine claims, but took few steps to remedy the problem.
In statements to various media outlets, the company has defended itself, saying it dedicates enormous resources to assuring safety on its platform and asserting that much of the information provided to journalists and government officials has been taken out of context.
In a conference call to discuss the company’s quarterly earnings on Monday, Facebook CEO Mark Zuckerberg claimed that recent media coverage is painting a misleading picture of his company.
“Good faith criticism helps us get better,” Zuckerberg said. “But my view is that what we are seeing is a coordinated effort to selectively use leaked documents to paint a false picture of our company. The reality is that we have an open culture, where we encourage discussion and research about our work so we can make progress on many complex issues that are not specific to just us.”
The revelations, as well as unrelated business challenges, mean that Facebook, which also owns Instagram and the messaging service WhatsApp, has a lot to worry about in the coming weeks and months. Here are five of the biggest.
A potential SEC investigation
Whistleblower Frances Haugen, a former product manager with the company, delivered thousands of pages of documents to lawmakers and journalists last month, prompting the wave of stories about the company’s practices. But the documents also went to the Securities and Exchange Commission, raising the possibility of a federal investigation of the company.
Haugen claims the documents provide evidence that the company withheld information that might have affected investors’ decisions about purchasing Facebook’s stock. Among other things, she says that the documents show that Facebook knew that its number of actual users — a key measurement of its ability to deliver the advertising it depends on for its profits — was lower than it was reporting.
The SEC has not indicated whether it will pursue an investigation into the company, and a securities fraud charge would be difficult to prove, requiring evidence that executives actively and knowingly misled investors. But even an investigation could be harmful to the company’s already bruised corporate image.
In a statement provided to various media, a company spokesperson said, “We make extensive disclosures in our S.E.C. filings about the challenges we face, including user engagement, estimating duplicate and false accounts, and keeping our platform safe from people who want to use it to harm others . . . All of these issues are known and debated extensively in the industry, among academics and in the media. We are confident that our disclosures give investors the information they need to make informed decisions.”
An FTC antitrust lawsuit
Facebook is already being sued by the Federal Trade Commission (FTC), which claims that between the company’s main site, Instagram, and WhatsApp, Facebook exercises monopoly power in the social media market. The agency is demanding that the three platforms be split up.
Facebook has publicly claimed it does not have monopoly power, but internal documents made available by Haugen demonstrate that the company knows it is overwhelmingly dominant in some areas, potentially handing the FTC additional ammunition as it attempts to persuade a federal judge to break up the company.
Possible congressional action
Congress doesn’t agree on much these days, but Haugen’s testimony in a hearing last month sparked bipartisan anger at Facebook and Instagram, especially over revelations that the company has long been aware that Instagram is harmful to the mental health of many teenage users, particularly young girls.
Several pieces of legislation have since been introduced, including a proposal to create an “app ratings board” that would set age and content ratings for applications on internet-enabled devices.
Others seek to make social media companies like Facebook liable for harm done by false information circulating on the platform, or to force the company to offer stronger privacy protections and to give users the right to control the spread of content about themselves.
Ramya Krishnan, a staff attorney at the Knight First Amendment Institute and a lecturer in law at Columbia Law School, is one of many academics who have been pushing for lawmakers to require Facebook and other social media platforms to allow researchers and journalists better access to data about their audiences and their engagement.
“We’ve seen increased interest among lawmakers and regulators in expanding the space for research and journalism focused on the platform, reflecting the understanding that in order to effectively regulate the platforms we need to better understand the effect that they are having on society and democracy,” she told VOA.
Sagging employee morale
One of the most striking things about the documents released this week is the amount of anger inside Facebook over the company’s public image. The disclosures include reams of internal messages and other communications in which Facebook employees complain about the company’s unwillingness to police content on the site.
“I’m struggling to match my values to my employment here,” one employee wrote in response to the assault on the U.S. Capitol on January 6, which was partly organized on Facebook. “I came here hoping to effect change and improve society, but all I’ve seen is atrophy and abdication of responsibility.”
The documents show that the company is losing employees — particularly those charged with combating hate speech and misinformation — because they don’t believe their efforts have the support of management.
Another advertiser boycott
Last year the Anti-Defamation League organized a campaign to pressure companies to “pause” their advertising on Facebook in protest over its failure to eliminate hateful rhetoric on the platform. In a statement given to VOA, Jonathan A. Greenblatt, the group’s CEO, said it is preparing to do so again.
“Mark Zuckerberg would have you believe that Facebook is doing all it can to address the amplification of hate and disinformation,” Greenblatt said. “Now we know the truth: He was aware it was happening and chose to ignore internal researchers’ recommendations and did nothing about it. So we will do something about it, because literally, lives have been lost and people are being silenced and killed as a direct result of Facebook’s negligence.”
“We are in talks to decide what the best course of action is to bring about real change at Facebook, whether it’s with policymakers, responsible shareholders, or advertisers,” he continued. “But make no mistake: We’ve successfully taken on Facebook’s hate and misinformation machine before, and we aren’t afraid to do it again. It’s time to rein in this rogue company and its harmful products.”