Convicted migrants in the United Kingdom might soon be forced to submit facial scans via a smartwatch if the Home Office and Ministry of Justice pass a new surveillance plan, according to a report from tech news website Gizmodo.

In 2021, a document outlining the UK Home Office’s data protection impact assessment was obtained by the news portal The Guardian. The document described a plan under which migrants placed on the Satellite Tracking Service would be forced to wear either a wrist wearable or an ankle tag for monitoring. Those with watches would be required to check in five times a day, and their photos would be cross-referenced against images in the organisation’s systems using facial recognition software.


In May 2022, the government signed a £6 million ($7.2 million) contract with a tech firm called Buddi Limited to develop the wearables. What’s worse, migrants assigned the smartwatches would also have their location tracked in real time. The document further explains that all data gathered by the Home Office would be stored for up to six years, and that the Ministry of Justice as well as the police would have complete access to it.

Predictably, digital rights groups are outraged, and for good reason. Lucie Audibert, lawyer and legal officer at Privacy International, told Gizmodo that the Home Office has been coming up with “egregious ideas to surveil and control migrants.” According to her, the new plan is “simply cruel”, “degrading”, “unnecessary” and “unlawful.”


In a National Audit Office report released in June, the UK government said that it finds the use of electronic monitoring to be a “cost-effective alternative to custody.” Unfortunately, the UK is not the only government with something like this in the pipeline.

In the US, the Department of Homeland Security has often cited the same rationale of cost effectiveness to expand its surveillance, saying that it is targeted at migrants, a practice which in any case runs the risk of amounting to racial profiling. Incidentally, research has shown that facial recognition technology is prone to bias that skews negatively against certain communities.