In March, as claims about the dangers and ineffectiveness of coronavirus vaccines spun across social media and undermined attempts to stop the spread of the virus, some Facebook employees thought they had found a way to help.

By altering how posts about vaccines are ranked in people’s newsfeeds, researchers at the company realized they could curtail the misleading information individuals saw about COVID-19 vaccines and offer users posts from legitimate sources like the World Health Organization.

“Given these results, I’m assuming we’re hoping to launch ASAP,” one Facebook employee wrote, responding to the internal memo about the study.

Instead, Facebook shelved some suggestions from the study. Other changes weren’t made until April.

When another Facebook researcher suggested disabling some comments on vaccine posts in March until the platform could do a better job of tackling anti-vaccine messages lurking in them, that proposal was ignored at the time.

Critics say the reason Facebook was slow to take action on the ideas is simple: The tech giant worried the changes might cut into its profits.

“Why would you not remove comments? Because engagement is the only thing that matters,” said Imran Ahmed, the CEO of the Center for Countering Digital Hate, an internet watchdog group. “It drives attention and attention equals eyeballs and eyeballs equal ad revenue.”

In an emailed statement, Facebook said it has made “considerable progress” this year with downgrading vaccine misinformation in users’ feeds.

Facebook’s internal discussions were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

The trove of documents shows that in the midst of the COVID-19 pandemic, Facebook carefully investigated how its platforms spread misinformation about life-saving vaccines. They also reveal rank-and-file employees regularly suggested solutions for countering anti-vaccine content on the site, to no avail. The Wall Street Journal reported on some of Facebook’s efforts to deal with anti-vaccine comments last month.

Facebook’s response raises questions about whether the company prioritized controversy and division over the health of its users.

“These people are selling fear and outrage,” said Roger McNamee, a Silicon Valley venture capitalist and early investor in Facebook who is now a vocal critic. “It is not a fluke. It is a business model.”

Typically, Facebook ranks posts by engagement — the total number of likes, dislikes, comments, and reshares. That ranking scheme may work well for innocuous subjects like recipes, dog photos, or the latest viral singalong. But Facebook’s own documents show that when it comes to divisive public health issues like vaccines, engagement-based ranking only emphasizes polarization, disagreement, and doubt.
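The engagement-based ranking described above can be pictured as a simple scoring loop: every interaction counts toward a post's score, regardless of whether the post is a recipe or a misleading vaccine claim. The sketch below is purely illustrative, assuming equal weight for each interaction type and invented field names; it is not Facebook's actual ranking system, whose signals and weights are far more numerous and complex.

```python
# Illustrative sketch of engagement-based ranking, not Facebook's actual code.
# Field names, weights, and the scoring formula are assumptions for illustration only.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int
    reshares: int

def engagement_score(post: Post) -> int:
    # Hypothetical scoring: count every interaction equally,
    # with no regard for the accuracy or quality of the content.
    return post.likes + post.comments + post.reshares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Posts with the most interactions rise to the top -- which is why
    # contentious or polarizing content can outrank authoritative sources.
    return sorted(posts, key=engagement_score, reverse=True)
```

In a scheme like this, a post that provokes thousands of angry comments scores higher than a calmly worded public-health advisory that few people feel compelled to engage with, which is the dynamic the internal documents describe.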