Improving Siri’s privacy protections
At Apple, we believe privacy is a fundamental human right. We design our products to protect users’ personal data, and we are constantly working to strengthen those protections. This is true for our services as well. Our goal with Siri, the pioneering intelligent assistant, is to provide the best experience for our customers while vigilantly protecting their privacy.
We know that customers have been concerned by recent reports of people listening to audio from Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests, and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.
How Siri Protects Your Privacy
Siri has been engineered to protect user privacy from the beginning. We focus on doing as much on device as possible, minimizing the amount of data we collect with Siri. When we store Siri data on our servers, we don’t use it to build a marketing profile and we never sell it to anyone. We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private. 
Siri uses as little data as possible to deliver an accurate result. When you ask a question about a sporting event, for example, Siri uses your general location to provide suitable results. But if you ask for the nearest grocery store, more specific location data is used.
If you ask Siri to read your unread messages, Siri simply instructs your device to read aloud your unread messages. The contents of your messages aren’t transmitted to Siri’s servers, because that isn’t necessary to fulfill your request.
Siri uses a random identifier — a long string of letters and numbers associated with a single device — to keep track of data while it’s being processed, rather than tying it to your identity through your Apple ID or phone number — a process that we believe is unique among the digital assistants in use today. For further protection, after six months, the device’s data is disassociated from the random identifier.
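To make this identifier model concrete, here is a minimal Swift sketch of the idea described above: request data is keyed to a random per-device identifier rather than to an account identity, and that association is dropped after roughly six months. It is an illustration under stated assumptions, not Apple's implementation; every type and function name is hypothetical.

    import Foundation

    // Hypothetical sketch: request data is keyed by a random device identifier,
    // never by an account identity such as an Apple ID or phone number.
    struct PseudonymousRequestStore {
        // Random identifier: a long string of letters and numbers tied to one device.
        let deviceIdentifier = UUID().uuidString

        var requests: [(transcript: String, receivedAt: Date)] = []

        mutating func record(transcript: String, at date: Date = Date()) {
            requests.append((transcript: transcript, receivedAt: date))
        }

        // After roughly six months, drop the link between stored data and the
        // random identifier (disassociation, as described above).
        mutating func disassociateExpiredData(now: Date = Date()) -> [String] {
            let sixMonths: TimeInterval = 182 * 24 * 60 * 60
            let expired = requests.filter { now.timeIntervalSince($0.receivedAt) > sixMonths }
            requests.removeAll { now.timeIntervalSince($0.receivedAt) > sixMonths }
            return expired.map { $0.transcript }   // data retained without the identifier
        }
    }

    var store = PseudonymousRequestStore()
    store.record(transcript: "What's the weather tomorrow?")
    print("Keyed by random identifier:", store.deviceIdentifier)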
In iOS, we offer details on the data Siri accesses, and how we protect your information in the process, in Settings > Siri & Search > About Ask Siri & Privacy.
How Your Data Makes Siri Better
In order for Siri to more accurately complete personalized tasks, it collects and stores certain information from your device. For instance, when Siri encounters an uncommon name, it may use names from your Contacts to make sure it recognizes the name correctly.
Siri also relies on data from your interactions with it. This includes the audio of your request and a computer-generated transcription of it. Apple sometimes uses the audio recording of a request, as well as the transcript, in a machine learning process that “trains” Siri to improve.
Before we suspended grading, our process involved reviewing a small sample of audio from Siri requests — less than 0.2 percent — and their computer-generated transcripts, to measure how well Siri was responding and to improve its reliability. For example, did the user intend to wake Siri? Did Siri hear the request accurately? And did Siri respond appropriately to the request?
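As a rough illustration of the sampling rate mentioned above, the short Swift sketch below selects requests for grading at a rate under 0.2 percent. Only the rate comes from this post; the names and the simulation are hypothetical.

    // Illustrative only: sample a small fraction of requests for human grading.
    // The "less than 0.2 percent" figure comes from the post; the rest is assumed.
    let gradingSampleRate = 0.002

    func shouldSampleForGrading() -> Bool {
        Double.random(in: 0..<1) < gradingSampleRate
    }

    // Simulate 100,000 requests and count how many would be chosen for review.
    let sampledCount = (0..<100_000).filter { _ in shouldSampleForGrading() }.count
    print("Sampled \(sampledCount) of 100,000 simulated requests for grading")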
Changes We’re Making
As a result of our review, we realize we haven’t been fully living up to our high ideals, and for that we apologize. As we previously announced, we halted the Siri grading program. We plan to resume later this fall when software updates are released to our users — but only after making the following changes:

  • First, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve. 
  • Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time. 
  • Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording that is determined to be an inadvertent trigger of Siri.

Apple is committed to putting the customer at the center of everything we do, which includes protecting their privacy. We created Siri to help them get things done, faster and easier, without compromising their right to privacy. We are grateful to our users for their passion for Siri, and for pushing us to constantly improve.

For more information: Siri Privacy and Grading
Press Contacts
Cat Franklin, Apple, cfranklin3@apple.com, (669) 276-1209
Apple Media Helpline, media.help@apple.com, (408) 974-2042
