UN experts call for countries to combat AI-enhanced racial profiling
- There is a great risk that (AI technologies will) reproduce and reinforce biases, said Verene Shepherd
- Historical arrest data about a neighbourhood may reflect racially biased policing practices: Shepherd
- Technologies appear to make the problem of racial profiling worse
Artificial intelligence programmes like facial recognition and predictive policing risk reinforcing the harmful practice of racial profiling, UN rights experts warned on Thursday, adding that countries must do more to combat it.
“There is a great risk that (AI technologies will) reproduce and reinforce biases and aggravate or lead to discriminatory practices,” Jamaican human rights expert Verene Shepherd told AFP.
Racial profiling, in which individuals are targeted on the basis of their race or ethnicity, is not new, but the technologies, seen as tools for bringing more objectivity and fairness to policing, appear in many places to be making the problem worse.
“Historical arrest data about a neighbourhood may reflect racially biased policing practices,” Shepherd, one of the 18 independent experts who make up the UN Committee on the Elimination of Racial Discrimination (CERD), warned.
She added, “Such data will deepen the risk of over-policing in the same neighbourhood, which in turn may lead to more arrests, creating a dangerous feedback loop.”
CERD on Thursday published guidance on how countries worldwide should work to end racial profiling by law enforcement and raised particular concern over the use of AI algorithms for so-called “predictive policing” and “risk assessment”.
The committee monitors compliance with the International Convention on the Elimination of All Forms of Racial Discrimination by its 182 signatory countries.
The systems have been touted as a way to make better use of limited police budgets, but research suggests they can increase deployments to communities that have already been identified, rightly or wrongly, as high-crime zones.
According to AFP, when AI and algorithms use biased historical data, their profiling predictions will reflect that.
“Bad data in, bad results out,” Shepherd said, adding, “We are concerned about what goes into making those assumptions and those predictions.”
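To make the feedback loop Shepherd describes concrete, here is a minimal, purely hypothetical sketch (the neighbourhood names and all numbers are invented for illustration): two areas have the same underlying level of crime, but one starts out over-represented in the arrest data; patrols follow the data, and extra patrols generate extra recorded arrests, so the imbalance keeps growing.

```python
# Hypothetical sketch of the feedback loop described above. All numbers
# are invented. Neighbourhoods A and B have identical real crime, but A
# starts with more recorded arrests because of past over-policing. Each
# year the extra patrols go wherever the historical data says arrests
# are highest, and those patrols uncover extra arrests there.

recorded_arrests = {"A": 120, "B": 100}   # A is over-represented in the data
baseline = 50            # arrests recorded everywhere, regardless of patrols
extra_from_patrols = 30  # additional arrests recorded where patrols are sent

for year in range(1, 6):
    # "Predictive" step: target the neighbourhood with the most recorded arrests.
    target = max(recorded_arrests, key=recorded_arrests.get)
    for n in recorded_arrests:
        recorded_arrests[n] += baseline + (extra_from_patrols if n == target else 0)
    share_a = recorded_arrests["A"] / sum(recorded_arrests.values())
    print(f"Year {year}: arrests {recorded_arrests}, share in A = {share_a:.1%}")
```

Because neighbourhood A always tops the historical data, it is targeted every year, its recorded arrests grow fastest, and the data looks ever more "confirming" of the original bias, even though the underlying crime rates are equal.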
The committee, Shepherd said, “hopes that the intensification and globalisation of Black Lives Matter … and other campaigns calling for attention to discrimination against certain vulnerable groups will help (underline) the importance of the recommendations.”