Siri: If anyone dares to tease me, I will send their private recordings to Apple contractors
This means the recordings Siri transmits may well contain state secrets, commercial information, or evidence of criminal dealings.
Text | Linghuo K
Teasing Siri has become an indispensable source of fun for Apple device users. But the much-abused assistant now seems to have found its chance to "retaliate."
Recently, The Guardian reported that Apple's voice assistant Siri sends recordings containing private user data to contractors around the world for manual analysis, and that some users' privacy has been accidentally exposed as a result.
The Guardian noted that Apple sends a small portion of Siri recordings to contractors worldwide to assess whether Siri's responses in each case were sufficiently human-like.
According to one contractor employee, Apple created a dedicated position for analyzing Siri recordings, evaluating how Siri responds across many scenarios and whether it actually solves users' problems.
"We were not informed of any privacy regulations, and the position had no privacy-related rules. That meant we could receive a recording and listen to the user's private information."
Some of the Siri recordings sent to contractors contained private information, including addresses, names, sexual activity, and more.
A Leifeng.com editor contacted Apple customer service, who said that Apple strictly abides by its privacy policies and does not disclose personal data, and that they could not vouch for the accuracy of information leaked by a third party.
That answer is exactly the one Siri itself would give: no such thing.
However, the privacy documentation Apple provides to consumers states only that these recordings are collected to improve Siri's listening and comprehension; it never mentions that they will be handed to humans for processing.
"What's more worrying is that Siri isn't woken only when you need it. It can be triggered by accidental touches, similar-sounding speech, and so on, and whatever conversation is happening at that moment may be recorded and sent to a contractor."
Security researchers noted that in 2018, then British Defence Secretary Gavin Williamson accidentally activated Siri when he said "Syria" during a speech. In other cases, even the sound of a zipper or signal noise can trigger Siri for no apparent reason.
This means the recordings Siri sends out may well contain state secrets, commercial information, or evidence of criminal dealings. Once that information reaches contractors, there is little guarantee it will not be resold.
In addition, Siri is built into other devices such as the Apple Watch and HomePod, allowing it to listen and record in many more situations.
According to market research cited by The Guardian, the Apple Watch has the highest rate of false activations among Apple devices. It can be woken by an accidental touch during almost any activity, and it records clips of up to 30 seconds, long enough for whoever hears the recording to know in detail what you were doing at the time.
Apple subsequently issued an official statement acknowledging that Siri collects some user recordings, saying that the recordings sent for review are not linked to a user's Apple ID and that all reviewers are bound by Apple's strict confidentiality requirements.
Apple also said the Siri recordings are kept strictly confidential and are used to improve the quality of the voice assistant. They are a small random subset, less than 1% of daily Siri activations, and most last only a few seconds.
Apple's privacy policy does state that it collects personal information such as names, email addresses, and contact details, but most people never read it carefully and simply tap "agree."
On this point, Apple maintains that the choice remains in users' hands: anyone who does not want Siri to record and upload audio can disable Siri under "Siri & Search" in Settings.
Security experts told Leifeng.com that "eavesdropping" by apps or voice assistants is a serious suspected violation of user privacy, and that regulators should accelerate the drafting of relevant rules and standards and strengthen oversight to protect user data at the root.
Reference source: Blackbird