How Apple plans on making features smarter while balancing privacy

Terms like "magical," "incredible," "amazing" and even "chamfered edge" have long appeared in the lexicon of Apple keynote events. Here's a new one: "differential privacy."

The words were uttered at Apple's Worldwide Developer Conference (WWDC 2016) by Craig Federighi, senior vice president of software engineering, as he explained the way the company's new iOS 10 software will anticipate your needs and wants. More importantly, he said, the operating system will get smarter without violating your privacy.

"All of this great work in iOS 10 would be meaningless to us if it came at the expense of your privacy," Federighi said during the event, held in San Francisco.

With that, he distinguished between the philosophies of Apple and competitors like Google when it comes to handling your personal data.

The bottom line: Apple has chosen not to use its data to create "profiles," which means it doesn't need access to specific information about you to figure out what you need. Google, on the other hand, tries to better understand its users, and ultimately show them more-relevant ads, by relying on information like photos, email and favorite locations that they send to the company.

There are many reasons Apple prefers its approach. In the case of a cyberattack, hackers would have a harder time collecting data on Apple users since there's less data about individuals that's available to steal.

Apple isn't alone in experimenting with this technology. Google has introduced it into its Chrome browser for specific purposes, and various startups are using it as well. But for Apple, using differential privacy represents the latest step in its broader fight over our privacy.

In March, the company refused to help the FBI hack into an iPhone belonging to the gunman in the San Bernardino, California, terror attack. The episode pushed Apple to the forefront of the debate over our privacy, and it even inspired the presumptive Republican nominee for US president, Donald Trump, to call for a boycott of the company.

Toward the end of last Monday's keynote address, Federighi said Apple was doubling down on its privacy efforts, offering to encrypt communications for any app running on its products, jumbling the data so only the intended recipient can read it.

"We believe you should have great features and great privacy," he said.

Apple believes privacy can be different

"Differential privacy" isn't something Apple just dreamed up. It's a technology that's been around since the '60s. At its heart, this type of software adds "noise," or random information, as data is being accessed.

Take the onscreen keyboard and its QuickType feature, which promises to correct spellings and recommend words as you're typing. Using differential privacy, Apple will collect information from your device about what you're typing, but it will inject noise on the device before that information is transmitted, so the company never sees the exact words you used. When the information gets to Apple, it's mixed in with millions of other responses, which Apple monitors for popular new words, like "gymbership" (a gym membership). Once a new one is identified, Apple can add it to the QuickType dictionary and beam it to your device. So the next time you write "That gymbership costs way too much!" the keyboard will be able to help.
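The idea above can be illustrated with a classic randomized-response sketch. This is not Apple's actual implementation (which is unpublished); it's a minimal, assumed model in which each device sometimes lies about the word it typed, yet the server can still estimate how many people really used a word:

```python
import random

def randomized_response(true_word, vocabulary, p_truth=0.75):
    """Report the true word with probability p_truth; otherwise
    report a word drawn uniformly from the vocabulary (the 'noise').
    No single report can be trusted, which protects the individual."""
    if random.random() < p_truth:
        return true_word
    return random.choice(vocabulary)

def estimate_count(reports, word, vocabulary, p_truth=0.75):
    """Correct the raw noisy count to estimate how many users
    actually typed `word`.
    E[raw] = true_count * p_truth + n * (1 - p_truth) / k."""
    n = len(reports)
    k = len(vocabulary)
    raw = sum(1 for r in reports if r == word)
    return (raw - n * (1 - p_truth) / k) / p_truth

# Simulate 100,000 users; 10% actually typed the new word "gymbership".
random.seed(42)
vocab = ["gymbership", "hello", "taxes", "brunch", "podcast"]
truth = ["gymbership"] * 10_000 + ["hello"] * 90_000
reports = [randomized_response(w, vocab) for w in truth]

est = estimate_count(reports, "gymbership", vocab)
print(round(est))  # close to 10,000, even though every report is noisy
```

The parameter `p_truth` is the knob the quote below alludes to: lower values give individuals stronger deniability but require more users before the aggregate estimate becomes reliable.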

It's perhaps a small distinction to some people, but experts say that, done right, this type of data collection can feel the same as modern technology built using all sorts of information about you, your family and everything you do on the web.

"It is a tool for trying to learn statistical information about a population without learning specific information about any particular individual," said Professor Aaron Roth, who co-wrote the book "The Algorithmic Foundations of Differential Privacy" and discussed the technology with Apple.

Apple plans to use this new privacy feature in iOS 10's QuickType keyboard, its new emojis, the Spotlight search function, and its LookUp function for finding dictionary definitions, iTunes store items and Wikipedia entries. Other apps could potentially benefit later.

Rand Hindi, CEO of the artificial-intelligence startup Snips, says this system is more secure than Google's and Facebook's typical approach of storing data in the cloud.

"For most people it sounds counterintuitive, because for many years we have been told that giving away data was the only way to get an intelligent system or advanced data-based services," Hindi said.

Apple is not the only one trying a new approach to privacy and data. In 2014, Google released software called Randomized Aggregatable Privacy-Preserving Ordinal Response, or RAPPOR for short. The technology, used in the Chrome browser, gathers statistics such as which home pages users have set, helping Google spot malware and viruses that hijack those settings.
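RAPPOR's published design is more elaborate than the basic noise-adding idea: each value is hashed into a small Bloom-filter-style bit vector, and each bit is then randomized before it leaves the browser. The sketch below is a heavily simplified, assumed illustration of that encode-then-perturb step, not Google's production code:

```python
import hashlib
import random

def encode(value, num_bits=16, num_hashes=2):
    """Hash a value (e.g. a home-page URL) into a small
    Bloom-filter-style bit vector. Deterministic per value."""
    bits = [0] * num_bits
    for i in range(num_hashes):
        h = hashlib.sha256(f"{i}:{value}".encode()).digest()
        bits[int.from_bytes(h[:4], "big") % num_bits] = 1
    return bits

def permanent_randomized_response(bits, f=0.5):
    """RAPPOR-style perturbation: with probability f, replace each
    bit with a fair coin flip. Any individual report is deniable,
    but bit frequencies across millions of reports remain readable."""
    out = []
    for b in bits:
        if random.random() < f:
            out.append(random.randint(0, 1))
        else:
            out.append(b)
    return out

report = permanent_randomized_response(encode("example.com"))
print(report)  # a noisy 16-bit vector; the true bits are hidden
```

On the server side, Google aggregates millions of these noisy vectors and statistically decodes which values are common across the population, never learning any one user's setting.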

Hindi's startup, Snips, has also been trying to keep people's data under wraps. But companies need large pools of users in order for differential privacy to work. For now that means Apple, and Google with its Chrome function, are helping to prove that this technology is viable.

"What matters is that some people are doing it right," Hindi said.

This article originally appeared on CNET.com.

