WASHINGTON, DC — US lawmakers have been angry at Facebook for years. Since as early as 2011, they have raised alarms about Facebook’s failures to protect users’ privacy, its struggles combating misinformation on its platforms, and its impact on its users’ mental health. But they haven’t passed any new laws addressing those issues.
Now, some key legislators are saying they have the catalyst they need to make real change: whistleblower and former Facebook employee Frances Haugen.
Haugen, once a product manager at the company, testified before the Senate Commerce subcommittee on Consumer Protection, Product Safety, and Data Security on Tuesday in what lawmakers are describing as an urgent call to action to regulate Facebook. The whistleblower prompted a wave of media scrutiny of Facebook when she shared with the Wall Street Journal, the SEC, and Congress thousands of internal documents showing that Facebook has known about the harms its products can cause but has downplayed that reality to lawmakers and the public. This evidence, which had been missing from the conversation until now, reveals that Facebook conducted research finding its products can cause mental health issues, allow violent content to flourish, and promote polarizing reactions, and that the company then largely ignored those findings.
“I came forward because I recognized a frightening truth: Almost no one outside Facebook knows what happens inside Facebook,” said Haugen in her opening testimony on Tuesday.
In a statement in response to Tuesday’s hearing, Facebook’s director of policy communications Lena Pietsch wrote that Haugen “worked for the company for less than two years, had no direct reports, never attended a decision-point meeting with C-level executives — and testified more than six times to not working on the subject matter in question.”
“We don’t agree with her characterization of the many issues she testified about,” wrote Pietsch. “Despite all this, we agree on one thing; it’s time to begin to create standard rules for the internet. It’s been 25 years since the rules for the internet have been updated, and instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act.”
In the past, congressional hearings about Facebook have often descended into political grandstanding, with lawmakers veering off-topic and into their own partisan grievances with the company. Some Republicans have focused on making unproven accusations that the social media company has an anti-conservative bias. At other times, lawmakers have made gaffes that reveal their seeming lack of basic technical knowledge — such as the infamous question by now-retired Sen. Orrin Hatch (R-UT) about how Facebook makes money, or Sen. Richard Blumenthal’s recent question about “Finsta” during a Senate subcommittee hearing last Thursday.
This time, though, lawmakers across the aisle were notably focused and well-studied on the relevant — and tangible — issues at hand. They asked Haugen pointed questions about the harms Facebook can cause, particularly to teenagers and children, and how those harms can be addressed.
In turn, Haugen was an eloquent witness. She broke down complicated topics like Facebook’s algorithmically ranked News Feed in an accessible manner. And she provided some of the clearest explanations yet to both Congress and the public as to what’s wrong with Facebook and how these issues can be fixed.
Haugen repeatedly called for lawmakers to create an outside regulatory agency that would have the power to request data from Facebook, particularly about how its algorithms work and the kind of content they amplify on the company’s social media platforms.
“As long as Facebook is operating in the dark, it is accountable to no one,” said Haugen in her opening testimony. Haugen argued that “a critical starting point for effective regulation is transparency: full access to data for research not directed by Facebook.”
In her written testimony shared ahead of the hearing, Haugen criticized Facebook’s existing quasi-independent oversight board (which has no real legal power over Facebook) because she believes it is “blind” to Facebook’s inner workings.
“Right now, the only people in the world trained to analyze these experiences are people who grew up inside of Facebook or other social media companies,” said Haugen. “There needs to be a regulatory home where someone like me could do a tour of duty after working at a place like this.”
Stanford law professor Nate Persily, who has worked directly with Facebook on academic partnerships in the past and has acknowledged the limitations of those partnerships, recently called for legislation that would compel platforms like Facebook to share internal data with external researchers.
Data transparency isn’t exactly the most attention-grabbing concept, nor is it an easy topic to regulate. But as Recode has previously reported, many leading social media experts agree with Haugen that it’s a first step toward meaningful regulation of Facebook.
Facebook’s algorithms power how its platforms work and what everyone sees on their News Feeds. Haugen said these powerful mechanisms shouldn’t operate in a black box that only Facebook controls and understands, and that they must be scrutinized and regulated.
Internal documents that Haugen revealed showed how a 2018 change to Facebook’s News Feed rewarded content that provokes more emotion in people — particularly anger, because it prompts more engagement than any other emotion. Haugen and members of Congress also discussed how Facebook’s algorithms can push teens toward toxic content, such as posts promoting eating disorders.
“I have spent most of my career on engagement-based rankings,” said Haugen, who in the past has worked at Google and Pinterest. “Facebook says, ‘We could do it safely because we have AI. The artificial intelligence will find the bad content that we know our engagement-based rankings is promoting,’” she said. But she warned that “Facebook’s own research says they cannot adequately identify” that dangerous content, and that as a result those algorithms are drawing out “extreme sentiment and division” in people.
This, Haugen stressed, is at the core of many of Facebook’s most urgent problems, and it needs oversight from Congress.
“I think [Haugen] has allowed us to get under the hood of Facebook,” said Sen. Ed Markey (D-MA). “We can now see how that company operates and how it is indifferent to the impact the algorithms have on young people in our country.”
Privacy wasn’t one of Haugen’s key focuses during her testimony, but several lawmakers, including Sen. Amy Klobuchar (D-MN), Sen. Marsha Blackburn (R-TN), and Sen. Ed Markey (D-MA), brought up the need for better privacy regulation.
Protecting people’s privacy on platforms like Facebook is the area in which Congress has introduced the most legislation so far, including a bill to update the 1998 Children’s Online Privacy Protection Act (COPPA); the KIDS Act, which would force tech companies to severely limit targeted advertising to children 16 or younger; and the SAFE DATA Act, which would give users rights to data transparency and require opt-in consent for processing sensitive data. So it makes sense that this would be a key part of lawmakers’ potential plans to regulate Facebook.
“Passing a federal privacy standard has been long in the works. I put my first one in 2012 and I think it will be this Congress and this subcommittee that will lead the way,” said Blackburn.
Haugen agreed that how Facebook handles its users’ privacy is a key area of concern that regulators should focus on, but she also said she doesn’t believe privacy regulation is the only solution to mitigating Facebook’s harms to society.
“Facebook wants to trick you into thinking that privacy protections or changes to Section 230 alone will be sufficient,” said Haugen. “While important, they will not get to the core of the issue, which is that no one truly understands the destructive traits of Facebook except for Facebook. We can afford nothing less than full transparency.”
During the hearing, several senators brought up Section 230 — the landmark internet law that shields tech companies from being sued for most kinds of illegal content their users post on their platforms.
Reforming Section 230 would be highly controversial. Even some policy organizations like the Electronic Frontier Foundation and Fight for the Future, which heavily scrutinize tech companies, have argued that stripping this law away could entrench reigning tech giants because it would make it harder for smaller social media platforms with fewer content moderation resources to operate without facing costly lawsuits.
Haugen seemed to understand some of these nuances in her discussion of Section 230. She proposed that regulators modify the law to make companies legally liable for harmful content their algorithms promote, rather than for specific users’ posts.
“I encourage reforming Section 230 to exempt decisions about algorithms. Modifying 230 around content — it gets very complicated because user-generated content is something companies have less control over,” said Haugen. “They have 100 percent control over algorithms.”
The leaders of the Senate subcommittee that brought Haugen to testify on Tuesday said they are going to keep Facebook in the spotlight and that they’ll hold more hearings in the future (they wouldn’t say when) about Facebook and other tech companies.
“She has really gripped the consciousness of Congress today and made a lasting and enduring difference in how we will regard Big Tech,” said Blumenthal. “Without any exaggeration, we are beginning now a different era — I hope it will be different — in holding Big Tech accountable.”
But Congress is still very much in the talking stage. None of the many bills that have been introduced over the years — such as a bill to prevent health misinformation on social media or a proposed antitrust law to prevent major tech companies from selling product lines they control — is remotely close to passing. And while this moment feels different — and some senators, like Ed Markey, have been reintroducing bills in light of the new scrutiny — there’s a battle ahead if lawmakers are ready to fight it.
Sen. Richard Blumenthal, who co-leads the subcommittee that held the hearing Tuesday, declined to say if he will subpoena Mark Zuckerberg or exactly when the next hearing would be. Sen. Marsha Blackburn, who co-leads with Blumenthal, said that change is coming “sooner rather than later” and that Congress is “close to bipartisan agreement.” But given the reality that Congress is still negotiating basic funding for the US government, trying to regulate Facebook effectively is going to take time as well as some remarkable cross-party coordination.
But the focus senators brought to today’s hearing shows that even this polarized Congress may be ready to unite — at least when it comes to regulating Facebook.