In an hours-long hearing on Tuesday, Facebook whistleblower Frances Haugen told US lawmakers that regulating, rather than breaking up, Facebook could force it to address safety issues highlighted in leaked internal documents.
The former Facebook product manager revealed her identity in a TV interview on Sunday after leaking a series of damaging research reports to the Wall Street Journal last month.
Later on Tuesday, Facebook founder and CEO Mark Zuckerberg broke his silence on the unfolding crisis in a memo posted to his Facebook page, claiming that Haugen's accusations "don't make sense" and that the company cared "deeply" about safety issues.
Here are four key takeaways from her testimony in the latest high-profile Senate hearing on big tech.
Facebook knows its algorithms are hurting people
"Facebook’s products harm children, stoke division and weaken our democracy," Haugen told the Senate Commerce Subcommittee on Consumer Protection.
A fundamental issue is that Facebook's advertising-based business model needs to keep people on its platforms for as long as possible, and the company exploits negative emotions to achieve that, Haugen said.
"They know that algorithmic-based rankings, or engagement-based rankings, keeps you on their sites longer. You have longer sessions, you show up more often, and that makes them more money," she told the committee.
In his Tuesday memo, Zuckerberg rejected the claim that Facebook's algorithm exploits negative emotions like anger, saying "The argument that we deliberately push content that makes people angry for profit is deeply illogical".
"We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content," he said.
"The company’s leadership knows how to make Facebook and Instagram safer but won’t make the necessary changes because they have put their astronomical profits before people," Haugen added.
Facebook has structural issues
Haugen also claimed that Facebook has a chronic tendency to under-staff its teams, impacting its ability to effectively monitor and respond to harmful content on its platforms.
“Facebook is stuck in a cycle where it struggles to hire. That causes it to under-staff projects, which causes scandals, which then makes it harder to hire,” Haugen said.
"I worked on the counter-espionage team, and at any given time, our team could only handle a third of the cases we knew about," she told senators.
Facebook's data-driven corporate environment was also identified as contributing to the company's problems. “Mark has built an organisation that is very metrics-driven. It is intended to be flat. There is no unilateral responsibility,” Haugen said.
"The metrics make the decision".
Haugen told the committee that Facebook founder Mark Zuckerberg, who controls over half of voting shares in the company, was ultimately responsible for the way it ran.
"In the end, the buck stops with Mark," she said.
Haugen doesn't want to break up Facebook
During her testimony, Haugen rejected the idea of breaking up the company - a remedy mooted by critics and some lawmakers.
Instead, she argued, the company should be compelled to make changes like switching to a chronological newsfeed and prompting users to read an article before they share it.
"Facebook’s internal research says that each one of those small actions dramatically reduces misinformation, hate speech, and violence-inciting content on the platform," she said.
Asked whether such changes would render Facebook unprofitable, Haugen pointed to the company's current high profitability - it made a $29 billion (€25 billion) profit in 2020.
"The changes I’m talking about today wouldn’t make Facebook an unprofitable company,” Haugen told the committee.
"It just wouldn’t be a ludicrously profitable company. People would consume less content on Facebook, but Facebook would still be profitable".
In his Tuesday memo, Zuckerberg reiterated that he was in favour of some regulation of social media.
"Similar to balancing other social issues, I don't believe private companies should make all of the decisions on their own," he wrote.
"We're committed to doing the best work we can, but at some level the right body to assess tradeoffs between social equities is our democratically elected Congress," Zuckerberg said.
The testimony touched a nerve
Facebook representatives attempted to push back on Haugen's testimony in real time, with company spokesman Andy Stone tweeting that she was not involved in child safety or Instagram during her time at the company.
"Just pointing out the fact that Frances Haugen did not work on child safety or Instagram or research these issues and has no direct knowledge of the topic from her work at Facebook," he said.
A statement from the company released after the Senate hearing had ended doubled down on Stone's remark by casting Haugen as inexperienced and ignorant of the subject matter.
"Today, a Senate Commerce subcommittee held a hearing with a former product manager at Facebook who worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives - and testified more than six times to not working on the subject matter in question," a statement from Facebook head of policy communications Lana Pietsch said.
This prompted Samidh Chakrabarti, who formerly led Civic Integrity, the political misinformation team Haugen belonged to during her time at Facebook, to respond to the company on Twitter.
"I was there for over six years, had numerous direct reports, and led many decision meetings with C-level execs, and I find the perspectives shared on the need for algorithmic regulation, research transparency, and independent oversight to be entirely valid for debate," he said.
"So Facebook, let's dispense with the ad hominem distraction and instead focus on real discussion of the issues at hand and the proposals being brought forth. The public deserves better".