iSurveil

Update: September 8th, 2021 – Bruce Schneier, Fight for the Future, EFF, and OpenMedia delivered more than 59K petition signatures opposing Apple’s spyware plan. For more information, read the press release here.


Apple just announced they’re creating a permanent backdoor into every iOS device for law enforcement.1 

After years of rejecting government pressure to break encryption, and promising us that the data on our iOS devices is secure, Apple has caved. In iOS 15, Apple is introducing a mandatory backdoor for government agencies and Apple to access the images we store on our phones that are backed up to iCloud — once certain conditions are met. 

This is something Apple said they would never do. While Apple is making those conditions stringent to start, once the technical backdoor is in place, it isn’t a question of if the government will force them to loosen their conditions of access — it is a question of when.

We need to send a clear message to Apple that they must stick to their promise to us and never give access to the data on our phones to anyone. Sign the petition telling Apple to uphold their privacy promise by protecting the data on our phones!

To: Apple CEO Tim Cook

This petition asks Apple to keep their promise of protecting our privacy and security by not installing a permanent backdoor into iOS devices.

I’m writing to ask you to immediately remove planned iOS 15 features which circumvent encryption and conditionally release access to our private images on our phones. You’ve marketed yourself as a company that respects privacy and I’ve purchased your products on the belief that you’ll stand up for my rights in the face of government pressure.

You’re betraying that promise. Your decision to create a backdoor into my iPhone means that you’ve decided the demands of law enforcement are more important than my right to privacy and security. Despite your promise to use it in only narrow circumstances, the technical infrastructure you’re creating will inevitably be used for wider purposes. Future governments may obligate you to access my private images for purposes other than the ones you’ve described, and hackers or hostile governments may successfully game your system to get access or falsely incriminate people they don’t like.

As a customer and a citizen of a democracy, I’m asking you not to introduce your scheme to circumvent encryption, and to uphold your promise of protecting my privacy and security.

This campaign is hosted by OpenMedia. We will protect your privacy, and keep you informed about this campaign and others. Find OpenMedia's privacy policy here.

How has Apple changed their position?

Just last year, Apple was standing up to the United States government by rejecting their demand to create a backdoor into our encrypted phones.2 One year later, Apple is giving in to government demands, announcing that they’re creating a backdoor into our phones and reversing their position on the privacy and security of their customers. 

You don’t have to take our word for it. In Apple’s own words:3

  • “Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to us.”
  • “But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.” 
  • “Doing so would hurt the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data.”

Why is Apple doing this? 

Apple has designed a scheme to detect child sexual abuse material (CSAM) on iPhones and other iOS devices. The scheme works by comparing all of the images on your device against an inventory of about 200,000 distorted versions of known CSAM images that will soon be hidden on our iPhones. It will roll out in the US first, with other countries expected to follow.

If Apple detects 30 or more matches between the images on a phone and those in the secret database, your account will be disabled and the information will be shared with law enforcement. 
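Apple’s real system relies on a proprietary perceptual hash (“NeuralHash”) and cryptographic protocols that keep the database and match counts hidden; none of that is public code. Purely as an illustration of the threshold rule described above, here is a minimal Python sketch, assuming a generic perceptual hash compared by bit distance — the `hamming_distance` helper, the `max_distance` parameter, and all hash values are illustrative assumptions, not Apple’s implementation:

```python
# Illustrative sketch of threshold-based perceptual-hash matching.
# NOT Apple's actual NeuralHash / private-set-intersection protocol,
# which is proprietary; this only models the reporting rule:
# an account is flagged once matches reach a threshold (30 in Apple's plan).

MATCH_THRESHOLD = 30  # Apple's stated reporting threshold


def hamming_distance(a: int, b: int) -> int:
    """Number of differing bits between two hash values."""
    return bin(a ^ b).count("1")


def is_match(image_hash: int, db_hash: int, max_distance: int = 4) -> bool:
    """Perceptual hashes 'match' if they differ by only a few bits,
    so near-duplicates (resized or recompressed images) still match."""
    return hamming_distance(image_hash, db_hash) <= max_distance


def count_matches(device_hashes, database_hashes) -> int:
    """Count device images whose hash matches any database entry."""
    return sum(
        1 for h in device_hashes
        if any(is_match(h, db) for db in database_hashes)
    )


def should_report(device_hashes, database_hashes) -> bool:
    """Report only once the match count reaches the threshold."""
    return count_matches(device_hashes, database_hashes) >= MATCH_THRESHOLD
```

The threshold is the “stringent condition” Apple points to — but note that in this sketch it is a single constant, and loosening it is a one-line change. Critics argue the same is true of the real system’s policy parameters.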

How will this go wrong?

Child abuse is a serious issue that must be confronted — but Apple's scheme is ripe for abuse and problems.4 Governments are already interested in expanding the secret database to include content beyond CSAM. Apple has resisted past demands to unlock our private images and correspondence by pointing out that their technology simply didn’t allow it. Now they’re reversing the equation: they will hold the keys, and both democratic and autocratic governments will have the power to compel them to open the door.

We’re already seeing world governments successfully pressure leading tech companies into dramatically changing their policies. For example, India’s recently passed information technology laws require social media companies to regulate their users’ content as the government directs, or be held liable for their users’ speech.

Twitter initially resisted the new law, upholding their historical practice of defending freedom of speech and flagging misinformation according to their own internal guidelines. After local police raids of Twitter’s Indian offices and threats to imprison Twitter staff, the company changed their tune: they began censoring Tweets critical of the government’s handling of the coronavirus pandemic and suspended more than 500 accounts associated with the farmers’ protest.6

So it’s not hard to imagine how any number of governments could use existing or new laws to force Apple to change their policy and reveal encrypted content or disable iCloud accounts for political reasons, now that it is technically feasible to do so. All of these legal initiatives could be weaponized once Apple’s change is in place: 

  • The United States is considering repealing Section 230 of the Communications Decency Act, which provides protections for platforms when it comes to hosting and moderating their users’ speech;
  • Canada is considering new online harms legislation that will change the definition of what can and can’t be said online;
  • The UK has published a draft Online Safety Bill that critics say will chill freedom of expression and encourage platforms to over-police speech that they’re hosting.

Apple’s change would also allow this kind of invasive censorship and policing to reach not just content that’s being shared publicly, but everything that’s held “privately” on our phones, cloud storage, and other iOS devices.

There's more. Cryptography experts are already working out ways malicious actors could abuse the system to improperly disable accounts and falsely implicate individuals in CSAM distribution. And if the system malfunctions or is wrongly activated, we will lose access to most functions on our phones — an essential lifeline for many, and a repository for our photos, apps, and contacts: our whole digital lives.

What can we do to stop this? 

Apple needs to feel an overwhelming response from the public that this change is dangerous and unwanted. Other companies have already stepped up and said that they would never do what Apple has done because it threatens our privacy and security.7 

If enough people voice their concern about Apple’s complete reversal, the company will be forced to reconsider their choice to give into government pressure by creating a backdoor into our phones. 

The bottom line is that Apple has now unleashed the technical ability to surveil everything on our phones, in our iCloud accounts, and in the messages we receive through iMessage. With this new capability, it is only a matter of time before our privacy and security are completely gone.

Sources

  1. Expanded protections for children, CSAM detection - Apple
  2. Apple refuses government’s request to unlock Pensacola shooting suspect’s iPhones - CNBC
  3. A message to our customers - Apple
  4. Apple’s plan to “think different” about encryption opens a backdoor to your private life - Electronic Frontier Foundation
  5. Twitter reinstates accounts of India’s Rahul Gandhi, other opposition leaders - Reuters
  6. Twitter unblocks Indian politicians’ accounts after suspending for violating disclosure law - The Verge
  7. WhatsApp and privacy experts sound alarm about privacy implications of new Apple photo scanning feature - The Independent

Press: Matt Hatfield | Phone: +1 (888) 441-2640 ext. 1  | [email protected]