By: Kristin Houser
July 29, 2019
If you use Apple’s AI-powered voice assistant Siri — or own a Siri-enabled device — there’s a chance a human on the other side of the world may be listening to you have sex right now.
Or they may be hearing that conversation you had with your boss about a new marketing strategy. Or that awkward exchange with your doctor about a really private medical problem.
That’s the takeaway from a troubling new Guardian story in which an Apple whistleblower details how the company lets contractors review audio of users’ Siri commands — as well as recordings never actually meant for Siri’s ears — to improve the digital assistant.
To hear Apple tell it, though, the whole review process is rather innocuous.