The use of advanced technology is gradually increasing in our everyday lives. Many individuals across the country wake up and unlock their phones by looking into their camera or using their fingerprint, check their messages and e-mails within minutes and even have time to ask Alexa for the news and weather forecast before heading out the door.
Despite simplifying many aspects of our lives, these technologies also present an element of danger in regard to privacy. What safeguards are in place to ensure that your phone is securely storing your facial/fingerprint biometrics, or that Alexa is not sharing your conversations with third parties?
The implications for individuals’ privacy in a number of different contexts are what fuel the debate over the increased use of Facial Recognition Technology (FRT).
How does FRT work?
Without getting bogged down in too much technical detail, FRT is essentially a way of identifying a person by scanning the distinct points of their face (via photograph or video) to create a uniquely identifiable biometric map, which is then translated into a code. Your personal code is then compared against other people’s codes to see if a match can be found.
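To make the matching step concrete, the comparison can be sketched as finding the stored code closest to a freshly scanned one, accepting it only if the two are similar enough. This is a minimal illustration only: the identities, numbers and threshold below are invented for the example, and real systems derive the codes (embeddings) from face images using trained neural networks rather than hand-written lists.

```python
import math

def match_face(probe, watchlist, threshold=0.6):
    """Return the watchlist identity whose stored code is closest to the
    scanned code ('probe'), but only if the distance is below the
    matching threshold; otherwise report no match."""
    best_id, best_dist = None, float("inf")
    for identity, code in watchlist.items():
        dist = math.dist(probe, code)  # Euclidean distance between codes
        if dist < best_dist:
            best_id, best_dist = identity, dist
    return best_id if best_dist < threshold else None

# Hypothetical stored codes for two watchlist entries.
watchlist = {
    "suspect_a": [0.10, 0.90, 0.30],
    "suspect_b": [0.80, 0.20, 0.50],
}

# A scan very close to suspect_a's code produces a match ...
print(match_face([0.12, 0.88, 0.31], watchlist))  # → suspect_a
# ... while a dissimilar scan (a law-abiding passer-by) produces none.
print(match_face([9.0, 9.0, 9.0], watchlist))     # → None
```

The threshold is the policy-relevant knob: set it too loosely and innocent passers-by are flagged as matches; too strictly and genuine matches are missed.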
It is currently used in a variety of different contexts, such as by border control, mobile phone makers and businesses as a form of secure identification. More controversial, however, is the use of the technology in a policing context. Potential uses of FRT by police and security services might include scanning large crowds in public spaces or at events against a criminal watchlist, or shops scanning customers against a database of images to identify known shoplifters in the area.
How widely-used is FRT in the UK?
Silkie Carlo, director of civil liberties campaign group ‘Big Brother Watch’, has described the continuous rollout of FRT in our society as an “epidemic of facial recognition in the UK”. Trials have been conducted in various shopping centres and museums around the country, and police forces are pioneering its usage in public. In particular, the Metropolitan Police are rolling it out in various locations across London, and South Wales Police (SWP) have already deployed the technology at various sporting events, concerts and shopping areas in Cardiff.
What is the legislation surrounding its usage?
There is currently no formal legislation dealing specifically with the use of FRT in the UK. In a policing context, however, it is worth noting that because it involves the processing of biometric data, live facial recognition falls within the scope of the Data Protection Act 2018 and the GDPR (as sensitive processing of biometric data). The increase of FRT in public places also raises further data-ethics and privacy questions: as the vast majority of those scanned will be law-abiding citizens, what happens to their scanned biometric data? One concerned member of the public, supported by civil rights group ‘Liberty’, asked the courts this very question last year after becoming concerned that his image may have been captured by FRT from a police van whilst he was out shopping in Cardiff. The case was brought to ascertain whether the usage was lawful in the circumstances, bringing together elements of law enforcement, police powers, the processing of personal data (his facial biometrics, which are classified as personal data) under the GDPR and the Human Rights Act 1998.
In regard to the human rights implications, the claimant argued that the use of FRT was a breach of Article 8 (the right to respect for private and family life), as it involved the storing of data relating to the private lives of individuals. On the GDPR aspects, it was equally argued that the SWP had collected and processed the claimant’s personal data unlawfully. The Court agreed that the Human Rights Act was engaged and that the GDPR was applicable but found that, in both cases, this use of FRT was legally justifiable.
In relation to the Human Rights Act, the Court found that the claimant’s Article 8 rights were subject to sufficient legal controls contained in primary legislation and that the interference was therefore legally justified, emphasising that the FRT was deployed for a limited time and a specific purpose, that scans were processed only against the existing watchlist of suspects, and that all data (including personal data) were deleted immediately after processing.
In regard to the GDPR aspect, the Court held that the collection and processing of the claimant’s data was lawful, as it met the conditions set out in the Data Protection Act 2018 that apply to law enforcement authorities.
For further details, the full case report can be found here.
Although the above case could be construed as a green light for police forces to use FRT as they wish going forward, in reality the key lesson is that what is and is not lawful will be entirely fact and case specific.
There remain many unanswered privacy-related questions, however, and the Information Commissioner’s Opinion on the matter (the full version of which can be found here) gives some reassurance that the ICO does not consider this decision to be a ‘blanket authorisation’ to use FRT in all circumstances.
In particular, the Information Commissioner emphasises that the use of FRT by law enforcement will always constitute the processing of ‘sensitive data’, as it involves the processing of biometric data for the purpose of uniquely identifying an individual. As such, the ICO would typically expect to see, at a minimum, a fully prepared ‘Data Protection Impact Assessment’ in accordance with Section 64 of the Data Protection Act 2018, completed by the data controller prior to deployment and setting out a lawful basis for processing in a sufficiently clear, precise and foreseeable manner to justify the processing before it starts.
FRT has relatively strong support from the general public in a policing context, although opinion on its use in the private sector is likely to be very different. Nonetheless, the Information Commissioner’s call for the introduction of a statutory Code of Practice would be widely welcomed and would bring greater clarity and consistency to the use of FRT.