How has Apple changed their position?
Just last year, Apple was standing up to the United States government by rejecting their demand to create a backdoor into our encrypted phones.2 One year later, Apple is giving in to government demands by announcing that they’re creating a backdoor into our phones, reversing their position on the importance of their customers’ privacy and security.
You don’t have to take our word for it. In Apple’s own words:3
- “Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to us.”
- “But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”
- “Doing so would hurt the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data.”
Why is Apple doing this?
Apple has designed a scheme to detect child sexual abuse material (CSAM) on iPhones and other iOS devices. The scheme works by comparing all of the images on your device to a database of about 200,000 distorted versions of known CSAM images that will soon be hidden on your iPhone. It will be rolled out in the US first, with other countries expected to follow.
If Apple detects 30 or more matches between the images on a phone and those in the secret database, your account will be disabled and the information will be shared with law enforcement.
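The matching-and-threshold logic described above can be sketched in a few lines. This is a deliberately simplified illustration, not Apple’s implementation: the real system uses a perceptual hash (“NeuralHash”) and cryptographic threshold secret sharing so matches are revealed only past the threshold, whereas this sketch substitutes an ordinary digest and a plain counter. All function names here are hypothetical.

```python
import hashlib

# Assumption: 30 is the reporting threshold described in Apple's announcement.
REPORT_THRESHOLD = 30

def image_fingerprint(image_bytes: bytes) -> str:
    # Simplified stand-in: a cryptographic digest instead of Apple's
    # perceptual "NeuralHash", which also matches visually altered copies.
    return hashlib.sha256(image_bytes).hexdigest()

def count_matches(device_images, known_fingerprints) -> int:
    """Count how many on-device images match the hidden database."""
    return sum(
        1 for img in device_images
        if image_fingerprint(img) in known_fingerprints
    )

def should_flag_account(device_images, known_fingerprints) -> bool:
    # Below the threshold, nothing is reported; at or above it,
    # the account is flagged for human review and law enforcement.
    return count_matches(device_images, known_fingerprints) >= REPORT_THRESHOLD
```

The key design point, and the source of the concern discussed below, is that the database contents are opaque to the device owner: whoever controls `known_fingerprints` controls what gets flagged.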
How will this go wrong?
Child abuse is a serious issue that must be confronted — but Apple's scheme is ripe for abuse and problems.4 Governments are already interested in expanding the secret database to include content outside of CSAM. Apple has resisted past demands to unlock our private images and correspondence by pointing out that they can’t do it — their technology didn’t allow them to do it. Now they’re reversing the equation: they are going to be holding the keys, and both democratic and autocratic governments will have the power to compel them to open the door.
We’re already seeing world governments successfully pressure leading tech companies into dramatically changing their policies. For example, India’s recently passed information technology laws require social media companies to regulate the content of their users as desired by the government, or be held responsible for their speech.5
Twitter initially resisted the new law, upholding their historical practice of defending freedom of speech and flagging misinformation based on their own internal guidelines. After local police raids of Twitter’s Indian offices and threats to imprison Twitter staff, the company changed their tune, censoring Tweets that are critical of the government’s handling of the coronavirus pandemic and suspending more than 500 accounts associated with the farmers’ protest.6
So it’s not hard to imagine how any number of governments could use existing or new laws to force Apple to change their policy and reveal encrypted content or disable iCloud accounts for political reasons, now that it is technically feasible to do so. All of these legal initiatives could be weaponized by Apple’s change:
- The United States is considering repealing Section 230 of the Communications Decency Act, which provides protections for platforms when it comes to hosting and moderating their users’ speech;
- Canada is considering new online harms legislation that will change the definition of what can and can’t be said online;
- The UK has published a draft Online Safety Bill that critics say will chill freedom of expression and encourage platforms to over-police speech that they’re hosting.
Apple’s change would also allow this kind of invasive censorship and policing to occur not just for content that’s being shared publicly, but for everything that’s held “privately” on our phones, cloud storage, and other iOS devices.
There's more. Cryptography experts are already working out ways in which malicious actors could abuse the system to improperly disable accounts and falsely implicate individuals in CSAM distribution. And if the system malfunctions or is wrongly activated, we will lose access to most functions on our phones — an essential lifeline for many, and a repository for our photos, apps, and contacts: our whole digital lives.
What can we do to stop this?
Apple needs to feel an overwhelming response from the public that this change is dangerous and unwanted. Other companies have already stepped up and said that they would never do what Apple has done because it threatens our privacy and security.7
If enough people voice their concern about Apple’s complete reversal, the company will be forced to reconsider their choice to give in to government pressure by creating a backdoor into our phones.
The bottom line is that Apple has now unleashed the technical ability to surveil everything on our phones, everything in our iCloud accounts, and everything we receive through iMessage. With this new capability, it is only a matter of time before our privacy and security are completely gone.