Concerns about facial recognition


The use of facial recognition technology is advancing with few controls

February 2026

One of the defining features of the Chinese state is its massive use of facial recognition technology throughout China. It is a vast system with millions of cameras, used to monitor every movement of its citizens. It is a means of controlling the whole population: no one can move about or meet someone without it being observed and logged by the state. It is the ultimate example of the panopticon in a modern setting.

Some are remarkably relaxed about its use in the UK, believing claims that it will be properly controlled and used only to catch criminals, drug dealers and the like, so that innocent people have nothing to fear. Read on …

Right to Privacy

The government’s Biometric Technology Consultation (which closes on 12 February) aims to help develop a new legal framework for the use of facial recognition and similar technologies by law enforcement. Despite the landmark UK Court of Appeal ruling in 2020, which found that South Wales Police’s use of automated facial recognition (AFR) technology was unlawful, police forces in different parts of the country have increased their use of Live Facial Recognition (LFR). As these systems contain the biometric data of huge numbers of people, some concerns from human rights groups, including Amnesty and Liberty, are briefly summarised here.

The first concern is that a new legal framework should apply to all use of ‘biometric technologies’ by law enforcement and other organisations, and should be transparent. Complex mathematical processing is used to identify facial features and generate ‘similarity scores’, but the internal logic of how a match is calculated is hidden from both the police operators and the public. The authorities cannot explain the specific basis for an intervention, nor account for whether, or why, the technology produces biased or inaccurate results.
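To make this concrete, here is a minimal, illustrative Python sketch of how such systems typically work under the hood: each face is reduced to a numerical ‘embedding’ by a proprietary neural network, and a ‘match’ is simply a similarity score that crosses an operator-chosen threshold. The 512-dimension size, the 0.6 threshold and the random vectors below are assumptions for illustration only; real systems keep these details hidden, which is precisely the problem.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity score between two face embeddings (1.0 means identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_match(probe: np.ndarray, watchlist_entry: np.ndarray, threshold: float = 0.6) -> bool:
    """A 'match' is just a score above a threshold chosen by the operator.
    Nothing in this number explains which facial features drove the score."""
    return cosine_similarity(probe, watchlist_entry) >= threshold

# Illustrative stand-ins: in a real deployment these vectors would come from
# a proprietary face-embedding model applied to camera footage and watchlist photos.
probe = np.random.rand(512)            # face seen by the live camera
watchlist_entry = np.random.rand(512)  # stored watchlist photograph

score = cosine_similarity(probe, watchlist_entry)
print(f"similarity score: {score:.3f}, match: {is_match(probe, watchlist_entry)}")
```

The point is that the only thing an operator sees is the final score; the learned parameters that produced the embeddings, and therefore the real basis of the decision, are not open to inspection.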

Second, it is concerning that the use of Facial Recognition Technology (FRT) is becoming widespread and easily accessible, with retail outlets taking on a quasi-law-enforcement role, aided and supported by the police and drawing on the same or similar databases. It is becoming normalised in schools and in commercial and retail settings, with information flowing between sectors under a patchwork of inconsistent laws which the public does not understand and finds almost impossible to challenge.

This was demonstrated in a recent Guardian report (6th February) on the apprehension and removal of a customer from a Sainsbury’s store in Elephant and Castle after staff wrongly matched him with a photo of a different customer flagged by their Facewatch camera. To prove his innocence he had to apply to the agency via a QR code and submit a photo and a copy of his passport before they declared that he was not on their blacklist.

The following factors are of concern to human rights groups:

– Transparency: how and when the technology is being used, and the clarity and accessibility of information about rights.

– Whether biometric data is acquired overtly or covertly.

– Whether it is collected voluntarily or involuntarily. Pervasive monitoring is leading to the normalisation of suspicionless surveillance.

– The subject’s status: whether someone is the intended subject of a police investigation or an innocent bystander walking past a camera.

– Who has access to the data and the results.

– The space, context and location of deployment. Expectations of privacy vary significantly between a quiet park and a busy thoroughfare.

– Whether the system is used to make inferences about a person’s internal state, their emotion or intent.

– Whether the interference is demonstrably ‘necessary’ and ‘proportionate’ to a legitimate aim, such as the prevention of serious crime.

– Whether assessments consider the Public Sector Equality Duty (PSED) to ensure the technology does not have a discriminatory impact.

– Algorithm bias: whether the algorithm performs differently based on race, gender or age, which could lead to the over-policing of marginalised communities (see the sketch after this list).

– Watchlist bias: whether scrutiny of the selection criteria can ensure that groups are not disproportionately targeted on the basis of protected characteristics.

– Bias in the interpretation of identification patterns that are claimed to provide predictive evidence.
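One way the bias concerns above can be made measurable is to compare error rates across demographic groups, which is the kind of check a PSED assessment or independent audit could require. The sketch below is illustrative Python only; the group labels and counts are invented for the sake of the example, not real audit figures.

```python
from collections import namedtuple

# Hypothetical audit record for one deployment: how many face comparisons were
# made for each demographic group, and how many wrongly flagged an innocent person.
GroupStats = namedtuple("GroupStats", ["comparisons", "false_matches"])

audit = {
    "group_a": GroupStats(comparisons=10_000, false_matches=8),
    "group_b": GroupStats(comparisons=10_000, false_matches=31),
}

for group, stats in audit.items():
    rate = stats.false_matches / stats.comparisons
    print(f"{group}: false match rate = {rate:.4%}")

# A markedly higher false match rate for one group is exactly the kind of
# disparity that, without published figures like these, the public cannot see
# and the operators cannot be held to account for.
```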

Human rights groups say that any new regulation should protect privacy and limit data-sharing between public authorities, law enforcement and private companies; and that, as the state gains powerful new ways to monitor citizens, a strong and resilient oversight body with true independence is needed to protect human rights and dignity.

Starmer’s wish to see more facial recognition technology


The prime minister wishes to see more facial recognition technology following riots in Southport and elsewhere

August 2024

Following the terrible murder of three little girls and the wounding of eight others in Southport last week, riots have broken out in various parts of the UK. A mixture of extremists and far-right groups have assembled outside mosques and refugee centres to engage in violent acts, including attacks on the police. These groups have claimed that the murders are linked to asylum seekers and immigrants, and this has led them to take these violent actions. Several people phoning in to the BBC’s Any Answers programme on Saturday 3 August made claims linking the murders to boat people, even though the young man who has been charged is from Cardiff and is not a boat person.

People have been rightly outraged by the hijacking of the tragic deaths of the three little girls by large numbers of far-right activists, many of whom travelled to the area with the intention of engaging in violence. There is a natural desire to see these people identified, arrested and brought to justice. There is great pressure on the new government to ‘do something’ and the Home Secretary and the Prime Minister have made statements. The latter has called for the greater use of facial recognition technology (FR) in the task of identifying ‘thugs’.

We should be very wary of going down this path. It has echoes of Jeremy Bentham’s idea of the panopticon in the late eighteenth century: a prison where prisoners could be watched at all times. It fed into the idea of complete surveillance by governments or their agents as a means of social control.

There are several reasons for being wary of introducing more FR. Although it could be used to locate and arrest those involved in the current mayhem or any future outbreaks, it would introduce into the public realm greater powers for the police and politicians. The last decade has seen a number of laws enacted to prevent or seriously limit protests and demonstrations. Britain has a history of such protests, and they have led to improvements in the role of women in society, better housing, an end to slavery and a range of social improvements and rights for ordinary people. This technology would, however, give police considerably enhanced powers to clamp down on protests. We may deplore the sometimes extreme actions of the climate protestors, but without such protests the government would be unlikely to take action on climate change.

In other words, to tackle the violence and riots we would be giving the government enhanced powers over other forms of legitimate protest. There is also the vexed issue of control. The various scandals we have seen in recent years, including the Post Office affair, the biggest one of all, have shown the inability of the vast range of controls, audits, select committees and so on to exert any kind of realistic control over these organisations. They take on lives of their own and seem impervious to moral principles or honest dealings. Do we really want to give them yet more technology?

Another objection is that it would give yet more power to the tech giants over our lives. A feature of the riots is how easy it has been for rioters to assemble by using such media as X and Telegram. These are American firms and are more or less completely outside anyone’s control. It was Elon Musk who decided to allow Tommy Robinson back onto X, for example. So we have politicians and journalists making speeches and statements and writing opinion pieces, when it was the decision of one man on the other side of the pond that provided a key weapon for the far-right groups. Although Sir Keir is making noises about the tech giants, will the government actually do anything?

Finally, it sees the solution to these problems in technological terms: we have a problem, so let’s install some more kit and the problem is solved. The issues are much deeper than simply arresting some thugs. Poverty, low wages, poor housing, inflation and a host of other issues have led to groups of people feeling left behind or ignored by the politicians and, to an extent, the media.

We should think very carefully before giving politicians yet more power to intrude into our lives, even though it may mean some of the rioters escaping justice. China has this system installed, offering the government almost complete control over its citizens. It has 700 million such cameras, used widely to monitor its citizens and to repress minorities. Perhaps we should remember the Chinese proverb ‘a journey of a thousand miles begins with just a single step’. Our liberties are fragile and we should be extremely cautious about giving governments powers to limit them further. This technology has the potential to be the biggest threat to human rights and civil liberties in the UK.
