For years, Apple has painstakingly built a reputation as a staunch privacy advocate among data-hungry, growth-driven tech companies.
In multi-platform ad campaigns, the company told consumers that “what happens on your iPhone stays on your iPhone,” and equated its products with security through slogans like “Privacy. That’s iPhone.”
But experts say that while Apple sets the bar high when it comes to hardware and in some cases software security, the company could do more to prevent user data from ending up in the hands of police and other authorities.
In recent years, US law enforcement agencies have increasingly made use of data collected and stored by technology companies in investigations and prosecutions. Experts and civil liberties advocates have expressed concern about authorities’ extensive access to consumer digital information, warning that it could violate the Fourth Amendment’s protections against unreasonable searches. Those fears have only grown as protected behaviors, such as access to abortion, have become criminalized in many states.
“The more a company like Apple can do to make sure it doesn’t get requests from law enforcement, or can say it can’t comply by using tools like end-to-end encryption, the better it will be for the company,” said Caitlin Seeley George, campaign director of the digital advocacy group Fight for the Future.
Apple gave law enforcement data 90% of the time
According to its own transparency reports, Apple receives thousands of user data requests per year from law enforcement and mostly cooperates with them.
In the first half of 2021, Apple received 7,122 US law enforcement requests for the account information of 22,427 people. According to the company’s most recent transparency report, Apple handed over some level of data in response to 90% of the requests. Of those 7,122 requests, the iPhone maker disputed or rejected 261.
The company’s compliance rate is broadly in line with, and sometimes slightly higher than, that of peers like Facebook and Google. However, both of those companies field far more requests from authorities than the iPhone maker does.
In the second half of 2021, Facebook received nearly 60,000 law enforcement requests from US authorities and produced data in 88% of cases, according to that company’s most recent transparency report. During that same period, Google received 46,828 law enforcement requests involving more than 100,000 accounts and handed over some level of data in response to more than 80% of the requests, according to the search giant’s transparency report. That’s more than six times the number of law enforcement requests Apple received in a comparable time frame.
That’s because the amount of data Apple collects about its users pales in comparison to other players in the space, said Jennifer Golbeck, a computer science professor at the University of Maryland. She noted that Apple’s business model relies less on marketing, advertising and user-data collection operations. “They just don’t have a reason to do analytics on people’s data in the same way that Google and many other places do,” she said.
Apple has drafted detailed guidelines that outline exactly what data authorities can obtain and how they can obtain it, a level of detail the company says aligns with industry best practices.
Despite ‘secure’ hardware, iCloud and other services pose risks
But big gaps remain, privacy advocates say.
While iMessages sent between Apple devices are encrypted end-to-end, preventing anyone but the sender and recipient from accessing them, not all information backed up to iCloud, Apple’s cloud storage service, enjoys the same level of encryption.
“iCloud content, as it exists in the customer’s account,” may be turned over to law enforcement in response to a search warrant, Apple’s law enforcement guidelines state. That includes everything from detailed logs of the time, date and recipient of emails sent in the last 25 days to “stored photos, documents, contacts, calendars, bookmarks, Safari browsing history, maps search history, messages and iOS device backups.” A device backup by itself can contain “photos and videos in the camera roll, device settings, app data, iMessage, Business Chat, SMS and MMS [multimedia messaging service] messages and voicemail,” Apple said.
Golbeck is an iPhone user but chooses not to use iCloud because she is concerned about the system’s vulnerability to hacks and law enforcement requests. “I’m one of those people who, if someone asks if they should get an Android or an iPhone, I think, well, the iPhone will be more protective than the Android, but the bar is just really low,” she said.
“[Apple’s] hardware is the most secure on the market,” said Albert Fox Cahn, the founder of the Surveillance Technology Oversight Project, a privacy rights organization. But the company’s policy on iCloud data also worries him: “I have to spend so much time opting out of things that they automatically try to use to improve my life, but really just endanger me.
“As long as Apple continues to limit privacy to a matter of hardware design rather than looking at the full data lifecycle and the full spectrum of government surveillance threats, Apple will fall short,” he argued.
It’s a double standard that was already evident in Apple’s handling of its most high-profile privacy case, which followed the 2015 mass shooting in San Bernardino, California, Cahn said.
At the time, Apple refused to comply with a request from the FBI to create a back door to access the gunman’s locked iPhone. The company argued that a security bypass could be exploited by both hackers and law enforcement in future cases.
But the company said in court filings that if the FBI had not changed the phone’s iCloud password, there would have been no need to create a backdoor, as all the data would have been backed up and therefore available via subpoena.
In those filings, Apple said it had already provided “all the data it possessed regarding the attackers’ accounts” up to that point.
“They were pretty clear that they weren’t willing to break into their own iPhones, but they were eager to actually break into the iCloud backup,” Cahn said.
Apple said in a statement that it believed privacy was a fundamental human right, arguing that users were always given the option to opt out when the company collects their data.
“Our products incorporate innovative privacy technologies and techniques designed to minimize how much of your data we — or anyone else — has access to,” said an Apple spokesperson, Trevor Kincaid, adding that the company prides itself on new privacy features such as App Tracking Transparency and Mail Privacy Protection, which give users more control over what information is shared with third parties.
“Where possible, data is processed on the device and in many cases we use end-to-end encryption. In cases where Apple collects personal information, we are clear and transparent about it and tell users how their data is being used and how to opt out at any time.”
Apple reviews all legal requests and is obligated to comply when they are valid, Kincaid added, but he stressed that the personal information Apple collects is limited in the first place. For example, the company encrypts all health data and does not collect device location data.
People ‘are vastly unaware of what happens to their data’
Meanwhile, privacy organizations such as the Electronic Frontier Foundation (EFF) are urging Apple to implement end-to-end encryption for iCloud backups.
“When we say they’re better than everyone else, it’s more of an indictment of what everyone else is doing, not necessarily that Apple is particularly good,” said EFF staff technologist Erica Portnoy.
Portnoy credits Apple for the default protection of some services like iMessage. “In some ways some default settings could be a little better [than other companies], which is not nothing,” she said. But, she said, messages are only safe if they are sent between iPhones.
“We know that unless messages are encrypted end-to-end, many people could have access to these communications,” said George, whose organization, Fight for the Future, has launched a campaign encouraging Apple and other companies to secure their messaging systems.
It’s a problem the company could solve by, for example, adopting the Google-backed messaging standard Rich Communication Services (RCS), George argued. The standard isn’t encrypted end-to-end by default, but it supports encryption, unlike SMS and MMS, and it would allow Apple to secure messages between iPhones and Android phones, she said.
At the Code 2022 tech conference, Apple CEO Tim Cook indicated that the company had no intention of supporting RCS, arguing that users have not said it is a priority. But they “don’t know what RCS is,” George said. “If Apple really doesn’t want to use RCS because it’s coming from Google, they can come up with other solutions to show good faith that they’re protecting people’s messages.”
Kincaid said consumers were not asking for another messaging service because many encrypted offerings, such as Signal, already exist. He also said Apple is concerned that RCS is not a modern standard and is not encrypted by default.
Golbeck, who runs a TikTok channel on privacy, says people are “vastly unaware of what happens to their data” and “think they have some privacy that they don’t.”
“We really don’t want our own devices turned into surveillance tools for the state,” Golbeck said.