UK Live Facial Recognition Tech Under Fire for Bias, Lack of Oversight

London — Live facial recognition technology in the United Kingdom faces increasing scrutiny over concerns about bias, privacy, and the absence of clear legal guidelines. Critics argue the technology disproportionately affects women and people of color, exacerbates existing policing disparities, and operates without adequate oversight.

Madeleine Stone, Senior Advocacy Officer at Big Brother Watch, a digital rights organization, highlighted the technology's discriminatory nature. Independent studies, Stone said, reveal that facial recognition algorithms are less accurate for women and people of color because they are primarily trained on white male faces. While performance has improved, disparities persist.

The technology's bias, Stone added, amplifies existing issues within UK policing. Reports have shown systemic biases in policing, leading to disproportionate criminalization of Black communities. Even a perfectly accurate algorithm could lead to discriminatory outcomes if watchlists over-represent people of color, she said.

Deployment patterns raise further concerns. London police have used mobile units in lower-income areas with larger populations of color. One early deployment occurred during Notting Hill Carnival, London's celebration of Afro-Caribbean culture, raising concerns about targeted surveillance.

Currently, no specific laws govern the use of facial recognition in the UK. Police operate under common law powers to prevent crime, a justification critics deem insufficient for such an intrusive technology. Parliamentary committees have voiced concerns about this legal vacuum, noting that each police force sets its own rules regarding deployment, watchlists, and safeguards.

Big Brother Watch is actively campaigning against the use of live facial recognition. The organization coordinates parliamentary engagement, public advocacy, and legal challenges. In 2023, it coordinated a cross-party statement signed by 65 members of parliament, calling for a halt to the technology due to racial bias, legal gaps, and privacy threats.

The organization also monitors deployments, offering legal support to those wrongly stopped. Stone recounted cases of a pregnant woman arrested for allegedly missing probation and a schoolboy misidentified by the system. These incidents, she said, demonstrate embedded racial bias and the dangers of relying on flawed technology.

Big Brother Watch is supporting a legal challenge by Shaun Thompson, a youth worker wrongly flagged by facial recognition. Thompson was surrounded by police officers and detained for 30 minutes despite explaining the mistake. The organization's director filmed the incident and is a co-claimant in the case against the Metropolitan Police, arguing that live facial recognition violates human rights law.

Without confronting discrimination in policing, critics warn, facial recognition will reinforce the very injustices it claims to address.

What’s next

While the new Labour government is considering regulations, it remains unclear whether this will result in thorough legislation or mere codes of practice. Big Brother Watch advocates for primary legislation that specifies usage parameters, safeguards, and accountability mechanisms.