Siri listens to us. And it also lets companies paid by Apple listen in, so they can figure out what users are asking for. The data, it seems, are not anonymous.
Not only Amazon and Google but, predictably, Apple too has its own employees, and those of subcontracted companies, listen to audio clips captured and processed by its digital assistant, Siri. The Guardian discovered this by speaking with a source who wishes to remain anonymous.
In recent weeks and months, similar practices had emerged at Amazon and Google, in both cases justified by the need to test the accuracy of the speech-to-text algorithms and the quality of the answers the digital assistant gives users. In Apple's case, however, the anonymous source adds that the assistant produces a large number of accidental activations. As a result, Siri very often listens to us without our noticing. Even these unsolicited recordings, the source says, are sent to third-party companies and played back for their employees to hear.
Siri listens to you having sex
According to the source cited by the Guardian, the wake phrase "Hey Siri" sounds too much like too many other sounds, words, and expressions in everyday speech. The result is that employees at the firms contracted to check Siri's accuracy find themselves listening to a flood of audio clips containing everything but a command addressed to Siri.
Conversations in cars, complete with engine noise in the background; doctors talking to patients, complete with the patients' medical histories; even couples having sex. The "commands" that employees outside Apple end up hearing include just about anything. A large share of Siri's accidental activations, moreover, is reportedly caused by the Apple Watch: if Apple's smartwatch detects that it has been raised and then hears speech, Siri activates instantly.
What data Apple employees see
There is a second problem: the data that employees at Apple and its contracted companies can see alongside each recording. The source explains: "There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal activity, sexual encounters, and so on. These recordings are accompanied by user data showing location, contact details, and app data."
Apple, for its part, says that Siri data "is not linked to any other data Apple might derive from the use of other Apple services." According to the company, no specific name or identifier is attached to a recording, and no individual recording can easily be linked to other recordings.