A former Facebook employee told members of Congress Tuesday that the company knows that its platform spreads misinformation and content that harms children but refuses to make changes that could hurt its profits.
Speaking before the Senate Commerce Subcommittee on Consumer Protection, former Facebook data scientist Frances Haugen told lawmakers that new regulations are needed to force Facebook to improve its own platforms. But she stopped short of calling for a breakup of the company, saying it wouldn’t fix existing problems and would instead turn Facebook into a “Frankenstein” that continues to cause harm around the world while a separate Instagram rakes in most of the advertising dollars.
Efforts to pass new regulations on social media have failed in the past, but senators said Tuesday that new revelations about Facebook show the time for inaction has passed.
Here are some key takeaways from Tuesday’s hearing.
FACEBOOK KNOWS IT’S CAUSING HARM TO VULNERABLE PEOPLE
Haugen said Facebook knows that vulnerable people are harmed by its systems, from kids who are susceptible to feeling bad about their bodies because of Instagram to adults who are more exposed to misinformation after being widowed, divorced or experiencing other forms of isolation, such as moving to a new city.
The platform is designed to exploit negative emotions to keep people on the platform, she said.
“They are aware of the side effects of the choices they have made around amplification. They know that algorithmic-based rankings, or engagement-based rankings, keeps you on their sites longer. You have longer sessions, you show up more often, and that makes them more money.”
THE WHISTLEBLOWER TOUCHED A NERVE
During the hearing, Tennessee Sen. Marsha Blackburn, the committee’s ranking Republican, said she’d just received a text from Facebook spokesperson Andy Stone pointing out that Haugen had not worked on child safety or Instagram, had not researched those issues, and had no direct knowledge of the topic from her work at Facebook.
Haugen herself made it clear several times that she did not directly work on these issues but based her testimony on the documents she had and her own experience.
But Facebook’s statement emphasized her limited role and relatively short tenure at the company, effectively questioning her expertise and credibility. That didn’t sit well with everyone.
Facebook’s tactic “demonstrates that they don’t have a good answer to all these problems that they’re attacking her on,” said Gautam Hans, a technology law and free speech expert at Vanderbilt University. “I think that says everything about their impotent response here.”
SMALL CHANGES COULD MAKE A BIG DIFFERENCE
Making changes to reduce the spread of misinformation and other harmful content wouldn’t require a wholesale reinvention of social media, Haugen said. One of the simplest changes could be to just organize posts in chronological order instead of letting computers predict what people want to see based on how much engagement — good or bad — it might attract.
Another would be to require one more click before users can share content, a step she said Facebook knows can dramatically reduce misinformation and hate speech.
“A lot of the changes that I’m talking about are not going to make Facebook an unprofitable company, it just won’t be a ludicrously profitable company like it is today,” she said.
She said Facebook won’t make those changes on its own if it might halt growth, even though the company’s own research showed that people use the platform less when they’re exposed to more toxic content.
“One could reason a kinder, friendlier, more collaborative Facebook might actually have more users five years from now, so it’s in everyone’s interest,” she said.
A PEEK INSIDE THE COMPANY
Haugen portrayed Facebook’s corporate environment as so machine-like and driven by metrics that it was hard to hit the brakes on known harms that, if addressed, might dent growth and profits.
She described the company’s famously “flat” organizational philosophy — with few levels of management and an open-floor workplace at its California headquarters that packs nearly the entire staff into one enormous room — as an impediment to the leadership necessary to pull the plug on bad ideas.
She said the company didn’t set out to make a destructive platform, but she noted that CEO Mark Zuckerberg holds considerable power because he controls more than 50% of the voting shares of the company and that letting metrics drive decisions was itself a decision on his part.
“In the end, the buck stops with Mark,” she said.
BIPARTISAN OUTRAGE
Democrats and Republicans on the committee said Tuesday’s hearing showed the need for new regulations that would change how Facebook targets users and amplifies content. Such efforts have long failed in Washington, but several senators said Haugen’s testimony might be the catalyst for change.
“Our differences are very minor, or they seem very minor in the face of the revelations that we’ve now seen, so I’m hoping we can move forward,” said Sen. Richard Blumenthal, D-Conn., the panel’s chairman.
Still, Democratic Sen. Amy Klobuchar of Minnesota acknowledged that Facebook and other tech companies wield a lot of power in the nation’s capital, power that has blocked reforms in the past.
“There are lobbyists around every single corner of this building that have been hired by the tech industry,” Klobuchar said. “Facebook and the other tech companies are throwing a bunch of money around this town and people are listening to them.”