Information Commissioner Warns Firms Over ‘Emotional Analysis’ Technologies

By: The Guardian

October 25, 2022

The information commissioner has warned companies to steer clear of “emotional analysis” technologies or face fines, because of the “pseudoscientific” nature of the field.

It’s the first time the regulator has issued a blanket warning on the ineffectiveness of a new technology, said Stephen Bonner, the deputy commissioner, but one that is justified by the harm that could be caused if companies made meaningful decisions based on meaningless data.

“There’s a lot of investment and engagement around biometric attempts to detect emotion,” he said. Such technologies attempt to infer information about mental states using data such as the shininess of someone’s skin, or fleeting “micro expressions” on their faces.

“Unfortunately, these technologies don’t seem to be backed by science,” Bonner said. “That’s quite concerning, because we’re aware of quite a few organisations looking into these technologies as possible ways to make pretty important decisions: to identify whether people might be fraudsters, or whether job applicants are worthy of getting that role. And there doesn’t seem to be any sense that these work.”