By Geoff White
May 13, 2019
Black and minority ethnic people could be falsely identified and face questioning because police have failed to test how well their systems deal with non-white faces, say campaigners.
At least three chances to assess how well the systems deal with ethnicity were missed over the past five years, the BBC found.
Campaigners said the tech had too many problems to be used widely.
“It must be dropped immediately,” said privacy rights group Big Brother Watch.
Lost lesson
Several UK police forces have been trialling controversial new facial recognition technology, including automated systems which attempt to identify the faces of people in real time as they pass a camera.
Documents from the police, Home Office and university researchers show that police are aware that ethnicity can have an impact on such systems, but have failed on several occasions to test this.
The Home Office said facial recognition can be an “invaluable tool” in fighting crime.