From supermarket entrances to busy festival crowds, millions of people across the UK are now being scanned in real time by facial recognition technology, making Britain the only country in Europe to deploy it on such a large scale.
During London’s Notting Hill Carnival, where around two million visitors are expected to celebrate Afro-Caribbean culture over the weekend, facial recognition cameras have been positioned near entrances and exits. Police say their main goal is to identify and apprehend wanted individuals by scanning faces in the crowd and cross-referencing them against the force’s extensive database of suspects.
Metropolitan Police Commissioner Mark Rowley described the technology as “a powerful tool that has already led to over 1,000 arrests since early 2024,” highlighting its success in catching offenders in high-crime areas. The system was first tested in 2016, and its use has surged over the past three years: more than 4.7 million faces were scanned in 2024 alone, according to the NGO Liberty.
Since late January, police have deployed the live facial recognition system around 100 times, a sharp increase compared with just 10 deployments between 2016 and 2019. Recent uses include pre-match checks at two Six Nations rugby games and screening outside Oasis concerts in Cardiff this July.
When someone on a police watchlist passes the cameras, the AI-powered system, often mounted in police vans, triggers an alert; officers can then detain the individual once their identity is confirmed. Critics, however, argue that this mass street-level data collection, which has included King Charles III’s coronation, “treats us like a nation of suspects,” in the words of the campaign group Big Brother Watch.
Rebecca Vincent, interim director of the organization, explained, “There’s no legal framework governing this, meaning there are no protections for our rights, and law enforcement is effectively making its own rules.” The use of facial recognition by private entities like supermarkets and clothing stores has also raised concerns. “There’s very little transparency about how the data is stored and used,” Vincent added.
Many stores use Facewatch, a service that tracks suspected offenders and alerts staff if one is identified on the premises. Daragh Murray, a human rights law lecturer at Queen Mary University of London, warned: “Living under constant surveillance strips away anonymity, impacting our ability to protest and participate in civic and cultural life.” Most people are unaware they are being profiled while shopping or attending events. Abigail Bevon, a 26-year-old forensic scientist, remarked at a London store: “I was very surprised to learn how widely this technology is used.” Though she acknowledged its potential uses for policing, she felt its deployment by retailers was invasive.
Since February, EU regulations have prohibited real-time facial recognition systems, with exceptions mainly for counter-terrorism. Vincent pointed out that beyond a few U.S. cases, “there’s virtually nothing comparable among European democracies.” Home Secretary Yvette Cooper recently promised to develop a legal framework restricting facial recognition to serious crimes. Nonetheless, police were authorized this month to expand the technology into seven new regions, with permanent cameras planned for Croydon, south London, next month.
Authorities insist safeguards are in place, such as switching cameras off when officers are not present and deleting the biometric data of anyone who is not a suspect. Yet the UK’s human rights watchdog declared on Wednesday that the Metropolitan Police’s policies on facial recognition are “unlawful” and breach rights protections.
Eleven organizations, including Human Rights Watch, signed an open letter urging police leaders to halt the use of facial recognition at events such as Notting Hill Carnival, criticizing it for disproportionately targeting the Afro-Caribbean community and pointing to the racial biases embedded in AI systems.
Shaun Thompson, a 39-year-old Black Londoner, recounted being wrongly identified as a criminal by the technology and detained by police as a result. He has since launched a legal challenge against the force’s use of it.