Apple Stops Contractors From Listening To Conversations From Siri


Last week, The Guardian revealed that Apple has been listening in on people's conversations via Siri to help improve the system. Today, the company announced that it is temporarily suspending the practice and will look into the allegations.

The report detailed that Apple hires outside contractors to listen to snippets of conversations gathered from Apple devices with Siri—nearly all of them—to help develop Siri and dictation.

The employed contractors classify the quality of Siri’s response and grade how well it dealt with the request. Most of those recordings likely captured people trying to send a text message or sort out their calendar.

In simple terms, real people analyze a portion of Siri conversations so the voice assistant can form more natural responses and be better prepared for the random questions users throw at it.

Apple said that only a small random subset—less than 1% of daily Siri activations—is used, and the recordings involved are typically only a few seconds long.

Furthermore, the company claims that the gathered conversations are not linked to Apple IDs or any other information that could directly identify the person speaking. The company also holds these recordings for a maximum of two years, according to a security document.

However, it was revealed that Apple has also been collecting accidental conversations. This happens when Siri mistakenly hears its "wake word," "Hey Siri," and activates. Notably, this happens quite often.

In light of that information, a whistleblower at one of Apple's contractors decided to reveal the loophole in the process, warning that this information could prove harmful if malicious listeners chose to analyze the conversations.

Notably, it is not difficult to decipher who is speaking in the recording based on the information that may be shared within 30-second conversations.

According to the whistleblower, an accidental recording may contain medical information shared between a user and their doctor, apparent criminal dealings, and/or sexual encounters.

These recordings are mostly sourced from accidental activations on the Apple Watch and HomePod. Significantly, Apple has 35% of the smartwatch market, more than three times its nearest competitor, Samsung, and more than its next six biggest competitors combined.

If you’re putting together the numbers, that’s a lot of 30-second accidental recordings that random people have access to.

Although Apple shows no interest in this kind of information, the whistleblower is concerned about the people hearing it and the threat the practice poses to people's privacy.

Apple, in response to the allegations, has suspended all contractors from this practice and said: “We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”

Another issue cited with Apple's privacy practices was the lack of an option for users to opt out of letting the company use their recordings, among other complaints.

Ironically, Apple, a company that prides itself on user privacy, is caught in the middle of a heated conversation about privacy breaches. In the past, Apple has used privacy as a competitive advantage against other brands, but it appears they all operate the same way.

The company, however, maintains that it differentiates itself from its rivals because it does not continuously listen to conversations—only when it hears "Hey Siri," or when it accidentally thinks it has.

Recently, there have been various reports that Amazon devices can listen to and record conversations even when seemingly deactivated, which has resulted in multiple probes condemning the practice.

Google, on the other hand, was recently ordered to halt its Google Assistant human-review program in Germany, following a leak last month of scores of audio snippets collected through the software.

The Hamburg data protection authority told Google last month of its intention to use its powers under Article 66 of the General Data Protection Regulation (GDPR) to begin an "urgency procedure."

Article 66 allows a DPA to order data processing to stop if it believes there is “an urgent need to act to protect the rights and freedoms of data subjects.”
