Apple's voice-operated Siri virtual butler has been in hot water lately over a default setting that gives third-party contractors access to accidental voice recordings. Amazon and Google do the same with Alexa and Google Assistant on their smart speakers, but they let you uncheck that option during the initial setup.
The reasoning behind having real people listen in on recordings is, of course, to improve the voice assistants' performance and command recognition, nothing nefarious, but having access to personal and, what most people would assume, ...
Wednesday, 31 July 2019 | GSMArena.com