https://www.macrumors.com/2019/07/26/siri-human-analysis-voice-recordings/

The contractor who spoke to The Guardian said that "the regularity of accidental triggers on the watch is incredibly high," and that some snippets were up to 30 seconds in length. Employees listening to Siri recordings are encouraged to report accidental activations as a technical problem, but aren't told to report anything about the content itself.

This is completely believable, and gives the entire story some credence. My stupid watch is always setting off Siri because I've bent my wrist back to pick something up, or similar. I can't imagine what sorts of curses I've sent their way.

Update: They stopped.

Question: Why did they think they needed to listen to "real" requests?

If I ran a company based on privacy, instead of listening in on 0.01% of users' calls to Siri, I'd get thousands of my employees to dogfood Siri on in-house iPhones and use those samples to make it better.

Believe me, Siri is still nascent enough that you'll find plenty of issues to fix with just that. Just get it to the point where everyone working at Apple is Siri-ing through their days happily and you'll have a heck of a product.
