BCS, The Chartered Institute for IT, has warned against the rise of a ‘cavalier attitude’ among organisations using ‘flawed’ facial recognition technology to monitor crowds in public spaces, highlighting the need for better safeguards.

Dr Bill Mitchell, Director of Policy at BCS, said there is an unprecedented danger of biometric data being misused, including identity theft, because of a combination of flawed technology and a lack of ethical and rigorous safeguards around how that data is captured, stored and processed. 

BCS was responding to concerns expressed by IT professionals in consultations the professional body has carried out over the past 18 months, which highlighted the severe risks of biometric data misuse.

The consultations revealed:

  • Poor data governance leaving companies unable to monitor effectively how data is used, who is using it, or where duplicates are stored, which may allow unethical practice to go undetected.
  • Lack of diversity in product development teams leading to hard-wired unconscious bias in new products or services that are data-dependent.
  • Using incomplete data to incorrectly infer personal characteristics.
  • Allowing data to be improperly shared within organisations.
  • Improperly aggregating data from different sources to infer personal characteristics.
  • Incorrectly cleaning data.
  • Incorrectly restructuring data, resulting in the wrong data being associated with an individual.
  • Incorrectly merging different data pipelines from third parties.
  • Not conducting proper due diligence to ensure correct provenance of data through the supply chain (which may well be offshored and distributed across different national jurisdictions).
  • Using data analysis methodologies that are invalid in a particular context.
  • Applying poorly tested analytical models as part of decision-making processes (including, for example, inappropriate machine-learning-based neural networks).
  • Using invalid anonymisation techniques that do not provide enough protection against deanonymisation (a brief sketch of this failure mode follows the list).
  • Storing data insecurely so that it is at risk of being misappropriated.
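To make the anonymisation concern concrete, the sketch below is illustrative only, using made-up identifiers and data rather than any scheme named in the consultations. It shows one common failure: replacing an identifier with an unsalted hash looks like anonymisation, but anyone who can enumerate the plausible identifier space can reverse it.

```python
import hashlib

def pseudonymise(national_id: str) -> str:
    """Replace an ID with its unsalted SHA-256 digest -- a common but weak scheme."""
    return hashlib.sha256(national_id.encode()).hexdigest()

# An "anonymised" record as an organisation might release it (made-up data).
released = {"id_hash": pseudonymise("AB123456C"), "location": "Granary Square"}

def deanonymise(target_hash: str, candidates) -> str:
    """Reverse the hash by hashing every plausible identifier until one matches."""
    for candidate in candidates:
        if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return ""

# The identifier format is public knowledge, so the whole space is brute-forceable.
candidates = (f"AB{n:06d}C" for n in range(1_000_000))
print(deanonymise(released["id_hash"], candidates))  # prints: AB123456C
```

The point is not this particular scheme: any ‘anonymisation’ whose output can be recomputed from a small, guessable input space offers no real protection, which is exactly the deanonymisation risk the consultations flagged.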

Dr Mitchell said the feedback from these consultations has been quite clear: “Virtually every time, we hear the same alarming worries about data governance practices. This directly links to worries about the current cavalier attitude to facial recognition technology. For instance, misappropriated facial biometric data could lead to opportunities for virtual doppelgängers, and poorly captured biometric data can lead to cases of mistaken identity that can have dire consequences that are hard to correct. Much of the concern has been focused on the immaturity of the technology. An even bigger concern is what your biometric data is used for, or rather misused for, once it’s been captured and added to a database.” 

The concerns raised by the IT profession come after a series of recent revelations about the widespread use of facial recognition technology. This includes the release of a report by Big Brother Watch, a civil liberties and privacy campaigning organisation, that said there is a facial recognition ‘epidemic’ across privately owned sites in the UK. 

Big Brother Watch claimed it had found major property developers, shopping centres, museums, conference centres and casinos using the technology. Also, the Information Commissioner’s Office, the UK’s privacy watchdog, has opened an investigation into the use of facial recognition cameras in Granary Square, a busy part of central London close to King’s Cross station.

“All of this should mean we treat facial recognition technology with extreme caution,” said Dr Mitchell, pointing to a July 2019 report by the University of Essex that found ‘significant flaws’ in the way UK police forces have trialled AI-enabled facial recognition technology.

He concluded by asking, “If the police can’t get it to work properly, why should we assume that property developers, museums or music festival organisers can make it work?”