Apple's new privacy effort worth watching

June 22, 2016 by Troy Wolverton, The Mercury News

Modern technology has presented us with something of a Faustian bargain when it comes to our privacy, but Apple thinks we should have another option.

The bargain we've long been offered is the use of all kinds of personalized services, often for free, in exchange for handing over some of the most intimate details of our lives - and running the risk that those details will end up in the wrong hands.

Apple thinks it's found a way around this dilemma. The company is promising that it can offer the kinds of useful information that come from collecting and analyzing large amounts of data - without compromising individual users' privacy - by using a set of techniques that goes under the banner of "differential privacy." If it works the way Apple and its developers say it will, the system could help rebalance the data collection-personal privacy equation.

"We believe you should have great features and great privacy," Craig Federighi, Apple's senior vice president of software engineering, told the audience at the company's Worldwide Developer Conference in San Francisco earlier this month. "You deserve it and we're dedicated to providing it."

Privacy is not a new point of emphasis for Apple, as the recent brouhaha over the iPhone used by one of the San Bernardino, Calif., attackers made clear. But in the next version of iOS, due out this fall, the company plans to add differential privacy as an additional privacy protection tool.

Long studied in academic circles but only deployed in a handful of commercial applications, the system allows people or companies to glean meaningful information from a set of data while at the same time preventing any particular data point from being connected to individual users.

The way the system does that is by introducing random bits of noise into the data in known amounts. The classic example is a survey in which users are asked a potentially embarrassing question, such as whether they've ever used drugs. Respondents would be instructed to flip a coin without telling the survey givers how it lands. If the coin comes up heads, they are instructed to answer truthfully. If it comes up tails, they are instructed to flip again. On the second flip, if the coin comes up heads, they are asked to answer yes; if it comes up tails, they are asked to answer no - regardless of what's actually true.

Such a system protects individuals' privacy, because no one can tell whether any individual is answering truthfully or not. But because the chance of a coin landing heads or tails is a known quantity, researchers can figure out the overall proportion of users who are answering truthfully and filter out the random noise. With a large enough data set, they can get a pretty good sense of what proportion of the population has used drugs without being able to tell if any individual has.
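The coin-flip survey described above, known in the research literature as "randomized response," can be simulated in a few lines of code. This is an illustrative sketch, not anything Apple ships; the 0.3 "true" drug-use rate is an invented figure for the simulation.

```python
import random

def randomized_response(truth: bool) -> bool:
    """One respondent's answer under the two-flip protocol described above."""
    if random.random() < 0.5:       # first flip: heads -> answer truthfully
        return truth
    return random.random() < 0.5    # tails -> second flip decides the answer

def estimate_true_rate(answers) -> float:
    """Filter out the known coin-flip noise.
    P(yes) = 0.5 * p_true + 0.25, so p_true = 2 * (P(yes) - 0.25)."""
    p_yes = sum(answers) / len(answers)
    return 2 * (p_yes - 0.25)

# Simulate 100,000 respondents, 30% of whom have truly used drugs.
random.seed(0)
population = [random.random() < 0.3 for _ in range(100_000)]
answers = [randomized_response(t) for t in population]
print(round(estimate_true_rate(answers), 2))  # close to 0.30
```

No single answer reveals anything about one respondent, yet with 100,000 answers the estimate lands very near the true 30 percent rate - exactly the trade the article describes.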

Of course, iPhone users won't literally be flipping coins or answering survey questions. But Apple plans to employ related techniques. For example, the company plans to inject noise into user data on their devices, before any data is sent up to Apple's servers for analysis.

One way it will do that is by "subsampling" user data; instead of collecting every word a user types, for instance, it will collect only a subset of them. It also plans to randomly change data, so characters in the words you type may be altered enough to make the words unrecognizable.
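Apple hasn't published the algorithm it will use, so the following is only a hedged illustration of the two ideas just described - subsampling and random character alteration - with made-up rates:

```python
import random

def locally_randomize(words, sample_rate=0.1, flip_rate=0.3):
    """Illustrative local noise (NOT Apple's actual algorithm):
    keep only a random subset of typed words, then randomly swap
    characters so no single report reliably reveals what was typed."""
    alphabet = "abcdefghijklmnopqrstuvwxyz"
    noisy = []
    for word in words:
        if random.random() > sample_rate:   # subsample: drop most words
            continue
        chars = [c if random.random() > flip_rate else random.choice(alphabet)
                 for c in word]
        noisy.append("".join(chars))
    return noisy
```

The server never sees most of what a user typed, and what it does see has been partially scrambled on the device; only aggregate patterns across many users survive the noise.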

Differential privacy theorists have recognized that the more data collected on a particular person, the more individualized that data becomes and the more likely it is that the person could be identified. To try to protect against that danger, the system requires a so-called privacy budget, a limit to the amount of data that can be collected on any one person. Apple is including just such a budget, capping the amount of data it collects on customers in a set time period.
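A privacy budget of the kind described is, at its simplest, a per-user cap that stops collection once it is exhausted. Apple hasn't disclosed its actual limits; this sketch uses an invented cap purely to show the mechanism:

```python
class PrivacyBudget:
    """Illustrative per-user budget: once the cap for the period is
    spent, further reports from that user are silently dropped."""

    def __init__(self, cap_per_period: int = 5):
        self.cap = cap_per_period
        self.spent = 0

    def try_collect(self, record, sink: list) -> bool:
        """Collect one noisy record if budget remains; return success."""
        if self.spent >= self.cap:
            return False        # budget exhausted: collect nothing more
        sink.append(record)
        self.spent += 1
        return True
```

Because each extra report about the same person weakens the noise's protection, capping the number of reports per period bounds how identifiable any one user can become.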

Aaron Roth, who co-wrote a book on differential privacy, got a sneak peek at how Apple was implementing its system and came away impressed.

"They employ engineers who understand the mathematics, and they've got a good algorithm," Roth, an associate professor of computer science at the University of Pennsylvania, said.

At least at first, Apple will only use its differential privacy algorithms in a limited way. For example, it will use them to try to identify new words that customers are typing on their devices so that it can add them to the dictionaries of all users. Similarly, it plans to use them to determine the most popular emoji that customers are using in place of words to better figure out which ones to suggest. It also has indicated that it hopes to use differential privacy for other features in the future.

As interesting and promising as differential privacy is, it's not a panacea. It works well with large data sets from large numbers of users and can be used well to determine trends among groups of people. But it fails in small sets of data; either there's too much noise to glean anything useful or there's so little that individuals can be identified.

Security officials might be able to use differential privacy to find patterns of behavior common to terrorists without identifying individuals, but they couldn't use such techniques to zero in on individual suspects. Likewise, Apple likely wouldn't be able to use it to make particularly personalized suggestions for users.

And it remains to be seen how the system will work and how well it will protect users' privacy. So far, Apple hasn't said much about the algorithms it will use or the parameters it will set.

"I'm happy ... that Apple has announced this and seems to be working on it in a way that has a lot of potential," said Nate Cardozo, a senior staff attorney with the Electronic Frontier Foundation, a digital rights advocacy group. "Apple has said the right words at this point, and now we're waiting for the details."

But it's an exciting development. Here's hoping for all of our sakes that Apple gets it right.
