Apple’s Privacy Changes

Eliza Fraser
6 min read · Feb 10, 2022

Why Apple Is About To Search Your Files — The Daily, August 20, 2021

This past August, I listened to this episode of The Daily and it hasn’t left my mind since. The episode opens with a disturbing announcement — “any platform that allows the sharing of files is going to be absolutely infested with child sexual abuse”.

I won’t summarize every aspect of the episode in detail, mainly because I hope whoever is reading this will listen to it themselves. But I will give a quick rundown:

A few years ago, a New York Times investigation found that millions upon millions of child abuse images were circulating through the most popular social media apps, and every major tech company was implicated. It was evident that the companies knew this was going on, but at that point most had not made major moves to combat the problem. The investigation sparked a massive reaction, drawing attention to a problem many had not considered. Companies responded differently, with some doing more about the issue than others: Facebook, for example, reported thousands and thousands of images per year, while Apple reported only a few hundred (a number far too low to be plausible).

The week before the podcast was released, Apple announced they were going to do a lot more about this problem. There were three main changes:

The Search + Siri Change: Users who ask Siri or search how to report child sexual abuse will be pointed to appropriate resources. Those who search for child abuse material will be pointed to resources to get help.

Image: the message iPhones will show if a user searches for how to report child abuse or searches for abuse material.
Photo from Apple.com

The iMessage Change: Parents can turn on a feature on their child’s phone (for children under 13) that will notify them if the child receives or sends a nude photo. For children aged 13–17, there will be a feature that asks the user if they are sure they’d like to view or send a photo that contains nudity.

Image: the messages iPhones will display if a user under 17 tries to view or send images containing nudity.
Photo from Apple.com

The Photo Change: New software will be introduced that scans photos to see if they match a database of known images of child abuse. If there are enough matches, Apple reports the photos to the authorities.

The photo change is the topic of the rest of the episode. If you are wondering how this is different from the scanning that already happens, the answer lies in the main qualm critics have with this change. Companies like Facebook and Google scan photos that have been uploaded to their platforms; users know those photos are no longer completely under their control. In this case, photos are scanned while stored on the user’s device, without being uploaded to a secondary location such as Facebook.

This sets a new and, to some, terrifying precedent. Apple’s approach started a “firestorm” in Silicon Valley. Critics, such as privacy advocates and cybersecurity experts, say this sets a precedent: a major tech company created technology that can go into a private device, look at a user’s data, and decide whether or not that data breaks the law. Many are concerned about the potential for abuse by governments and companies. The feeling is “now that it’s been created, it can’t be put away”.

Some extra notes on the photo scanning process:

  • Apple only does this with photos uploaded to iCloud
  • It does not exist on every device. It will be built into new iPhones and only turned on in the US to start.
  • How are photos compared to photos in the database?

Photos are reduced to hashes (numerical fingerprints) and compared against a large database of hashes of known abuse material. If an account has 30 or more matches, an employee will review the flagged photos and send them to the authorities. (A rough code sketch of this hash-and-threshold idea follows this list.)

  • Why 30 matches?

Apple says a higher threshold is important: this is to ensure employees aren’t looking at everyone’s photos.
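To make the hash-and-threshold idea above more concrete, here is a minimal Python sketch. It is only an illustration under simplifying assumptions: the byte-level hash, the exact-match lookup, and every function name here are my own inventions, and Apple’s actual system uses a perceptual hash (NeuralHash) plus cryptographic techniques so the device never reveals which individual photos matched. Only the counting-and-threshold logic mirrors what the episode describes.

```python
import hashlib
from pathlib import Path

# The threshold Apple described: an account is only flagged for human
# review once it accumulates at least this many matches.
MATCH_THRESHOLD = 30


def image_hash(path: Path) -> str:
    """Reduce an image file to a fixed-size fingerprint.

    Toy version: a SHA-256 of the raw bytes. Apple's real system uses a
    perceptual hash (NeuralHash) so that resized or re-encoded copies of
    the same image still match, which a byte-level hash would miss.
    """
    return hashlib.sha256(path.read_bytes()).hexdigest()


def count_matches(photo_paths: list[Path], known_hashes: set[str]) -> int:
    """Count how many of an account's photos match the known-hash database."""
    return sum(1 for p in photo_paths if image_hash(p) in known_hashes)


def should_flag_for_review(photo_paths: list[Path], known_hashes: set[str]) -> bool:
    """Flag an account only once matches cross the threshold, so a handful
    of accidental matches never triggers a human review."""
    return count_matches(photo_paths, known_hashes) >= MATCH_THRESHOLD
```

Framed this way, the 30-match threshold is the safety valve the episode mentions: no single match, and not even a couple of dozen, ever puts an account in front of a human reviewer.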

Apple and privacy (so far):

Up to this point, Apple had been committed to privacy. From both a marketing and a software perspective, privacy was a key aspect of the company. This move was surprising because of Apple’s longstanding commitment to protecting user data.

In the US, Apple has pushed back several times against the government’s attempts to search individuals’ data. After the 2015 San Bernardino shooting, the FBI wanted access to one of the gunman’s phones, and Apple refused. The company explained that its encryption is strong and that it cannot access the data without the password. Apple’s justification was that building a way around that protection would be dangerous, because it would no doubt be ordered to use it in many other cases. The argument that Apple is unable to search a user’s data is no longer true. The company can only argue that it won’t, as software capable of doing so will now be present on phones.

Apple’s response to the discourse about its actions was one of frustration: the company feels it isn’t getting the benefit of the doubt while dealing with an extremely difficult problem. A few days after the initial announcement, an additional safeguard was introduced in the form of a third-party auditor. This change and others like it will likely not appease critics, because the issue most of them raise is the precedent itself.

The episode ends with this question: what is the price that we are willing to pay to protect victims of child abuse material online?

My thoughts:

Sometimes I find that my opinion on user privacy goes down the path of “I have nothing to hide! Let them search everything I have!” But as I have been learning about the implications and realities of privacy, I am pushing myself to think critically about it.

I think the final question of this episode is really powerful. What is the price that we are willing to pay to protect victims of child abuse material online?

My answer is that I would pay a huge price. The disturbing reality of this investigation made me realize that whatever I keep on my phone is not that private. Reducing child abuse material online is something I would make compromises for. At first glance, I am okay with these changes on my phone.

However, it isn’t that simple. The reality is that this software could easily be abused and used to harm users, whether by government entities or private companies. It is too easily misused; the examples in the podcast are very realistic, such as authoritarian governments searching for those speaking against them. Bringing in a third-party auditor was a smart move by Apple: it gives users more confidence and acts as an extra layer of insurance.

In terms of aspects users can control, I think the scanning of photos uploaded to iCloud is something users can work around. If they please, users can upload their content to a different storage space and find alternatives to iCloud. Perhaps in the future, Apple and other companies will focus their efforts on scanning shared material for abuse, not data stored on a device that is never sent anywhere.

After considering these implications, my opinion no longer leans toward letting companies search everything I have just because there is no illegal content on my phone. I don’t want companies to be able to search through everything I have and potentially sell or share that information. There are too many opportunities for misuse.

I am interested to see where this leads, and the moves other big tech companies make. The precedent has been set, and I think there will be a major shift in user privacy and data protection.


Eliza Fraser

Computer science student passionate about topics of security and privacy in technology.