Facebook has said the prevalence of hate speech on its platform declined for the third consecutive quarter.

“We removed 31.5 million pieces of hate speech content from Facebook, compared to 25.2 million in Q1 (March quarter), and 9.8 million from Instagram, up from 6.3 million in Q1…,” Facebook’s Vice-President of Integrity, Guy Rosen, said.

Improvements in proactively detecting hate speech, along with ranking changes in News Feed, have led to a 15-fold increase in hate speech content removals across Facebook and Instagram since the company began reporting the metric.

On the prevalence of hate speech, “In Q2 (June quarter), it was 0.05%, or 5 views per 10,000 views, down from 0.05-0.06%, or 5 to 6 views per 10,000 views in Q1,” Rosen said in Facebook’s Community Standards Enforcement Report for the second quarter of 2021.

The percentage of content Facebook took action on before a user reported it stood at over 90% for 12 out of 13 policy areas on Facebook and nine out of 11 on Instagram.

Facebook also listed measures it had taken to remove “harmful COVID-19 misinformation” and to prohibit ads that try to exploit the pandemic for financial gain.

These include the removal of 20 million pieces of content, as well as over 3,000 accounts, pages, and groups, for repeatedly violating Facebook’s rules against spreading COVID-19 and vaccine misinformation.

Facebook said it had collaborated with 80 fact-checking organizations in more than 60 languages around the world to issue warnings on more than 190 million pieces of COVID-related content that third-party fact-checking partners rated as “false, partly false, altered or missing context.”

As a result, there had been a considerable decline in “vaccine hesitancy” in several countries.

Citing data from its COVID-19 Trends and Impact Survey, conducted in partnership with Carnegie Mellon University and the University of Maryland, Facebook said vaccine acceptance was up by 50% in the US, 35% in France, 25% in Indonesia, and 20% in Nigeria.