
2021-08-10: Slippery Slopes and Bright Lines

Device-side CSAM scanning is a retreat to a strategically weaker position for defending end-user privacy against nonconsensual search by law enforcement.

Last week, Apple announced (archive link from 2021-08-10) (local copy of the archive link from that date, retrieved in 2024) new functionality coming to future versions of iOS, iPadOS, and macOS, designed to find CSAM before images are uploaded to iCloud. Some of the central functionality represents a significant strategic loss for privacy, regardless of the excellence of the technical implementation, and I want to explain why I oppose this well-intentioned change.

Details of iCloud Photos scanning

While I do have other concerns, this post is about the client-side scanning of photos before they are uploaded to iCloud. I will focus on these details of the system:

  • Images are scanned by the operating system on the user’s own device; they are not scanned on the server.
  • Although iPhones are encrypted, the user’s device itself performs the scans, making it possible to scan images which law enforcement could not directly access.
  • If enough images match known CSAM to cross a threshold, the matching images are decrypted and sent to Apple, which manually reviews them for false positives, sends confirmed CSAM to NCMEC1, and reports the incident to law enforcement. (A rough sketch of this threshold-gated matching follows the list.)
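
To make that last point concrete, here is a minimal sketch of what on-device, threshold-gated matching against a database of known hashes could look like. Everything in it (the names, the SHA-256 stand-in for a perceptual hash, the threshold value) is my own illustrative assumption, not Apple’s code; Apple’s actual design uses a perceptual hash (NeuralHash) and cryptographic threshold secret sharing, which this sketch does not attempt to reproduce.

```python
import hashlib

# Illustrative stand-ins only; none of this reflects Apple's implementation.
# Apple's system uses a perceptual hash (NeuralHash) and threshold secret
# sharing, not SHA-256 and a plain counter.
KNOWN_HASHES: set[str] = set()  # hashes of known CSAM, shipped to the device
MATCH_THRESHOLD = 30            # hypothetical value chosen for illustration

def image_hash(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash of an image."""
    return hashlib.sha256(image_bytes).hexdigest()

def scan_before_upload(images: list[bytes]) -> list[bytes]:
    """Scan images on the user's own device before they are uploaded.

    Nothing is surfaced below the threshold; once enough images match,
    the matching images are flagged for manual review.
    """
    matches = [img for img in images if image_hash(img) in KNOWN_HASHES]
    if len(matches) >= MATCH_THRESHOLD:
        return matches  # in the real system: decrypted and sent for human review
    return []           # below the threshold, nothing is revealed
```

The detail that matters for the rest of this post is visible even in this toy version: the scan runs on the user’s own device, against content that the server could not otherwise read.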

A strategic loss

Until this announcement, Apple’s stance was something like this:

  • We do not have the capability to provide encrypted data to law enforcement, ourselves, or anyone else.
  • What’s on a user’s device is private to that user.

This is a clear bright line that was easily understood by everyone, even nontechnical users (although some might have preferred a different stance). Its clarity and lack of ambiguity provided a unique protection from more intrusive surveillance of the public by law enforcement. It was seen as legitimate.

After the announcement, the stance is different:

  • We sometimes build the capability to provide law enforcement with access to encrypted data.
  • What’s on a user’s device is sometimes private to that user.

The new stance is a much weaker position from which to argue against other law enforcement demands, because it is much less of a bright line. It is easier to argue that a specific case should be an exception when there is precedent for some other exception. I expect law enforcement to demand capabilities to detect more kinds of content – like terrorist propaganda, hate speech, and copyrighted material – and to increase demands for nonconsensual access to encrypted data.

Apple claims (local archive) that it will not submit to government requests for other kinds of data:

“What happens when other governments ask Apple to use this for other purposes?” [cryptographer Matthew] Green asked. “What’s Apple going to say?”

[Apple privacy chief Erik] Neuenschwander dismissed those concerns, saying that safeguards are in place to prevent abuse of the system and that Apple would reject any such demands from a government.

“We will inform them that we did not build the thing they’re thinking of,” he said.

But that response is weak. Building the thing they’re thinking of is clearly possible, and scanning for one category of content has already been built. Law enforcement will argue that it’s not a burden to add just a bit more capability. And I fear that lawmakers or the public will side with law enforcement.

A wartime analogy

In The Strategy of Conflict, Thomas Schelling describes informal agreements between parties in war. For example:

Gas was not used in World War II. The agreement, though not without antecedents, was largely a tacit one. It is interesting to speculate on whether any alternative agreement concerning poison gas could have been arrived at without formal communication (or even, for that matter, with communication). “Some gas” raises complicated questions of how much, where, under what circumstances: “no gas” is simple and unambiguous. Gas only on military personnel; gas used only by defending forces; gas only when carried by vehicle or projectile; no gas without warning — a variety of limits is conceivable; some may make sense, and many might have been more impartial to the outcome of the war. But there is a simplicity to “no gas” that makes it almost uniquely a focus for agreement when each side can only conjecture at what rules the other side would propose and when failure at coordination on the first try may spoil the chances for acquiescence in any limits at all.

In a conflict, an opposing side will find an all-or-nothing limit more legitimate than a limit somewhere in the middle. The conflict between the public and law enforcement over encryption and private data is no different.

(Points of coordination in a conflict like this are sometimes called Schelling points after Schelling’s work.)

Other vendors

I happen to like Apple products, but I’m open to alternatives. However, Apple has historically been the most private mainstream option. Google and Microsoft have always given users less control than Apple over what data they can choose not to share, and even with Apple’s recent announcement I think that’s still true – at least for now. (All three companies have simply terrible cloud privacy guarantees; each of them holds the decryption keys for the bulk of the data you store on their services.) Other options like a Librem 5, a PinePhone, or an Android device running CalyxOS or GrapheneOS are interesting, but have significant drawbacks.

My biggest fear, though, is that Apple has weakened the strategic position of any future privacy effort with this compromise. If the new norm allows nonconsensual law enforcement access to some encrypted data on end-user devices, and lawmakers and/or the public come to see protection from law enforcement as illegitimate, Apple has harmed not only my trust in them today, but my privacy permanently.

Historical law enforcement demands

A few short anecdotes. Nothing here is comprehensive or proof beyond a shadow of a doubt, but these events inform my thinking.

  • Daphne Keller, former associate general counsel at Google, recently tweeted
  • “Inside Apple’s Compromises in China: A Times Investigation” (local archive). China does not permit Apple to keep data end-to-end encrypted, as Apple does for some iCloud data in most countries. Apple built special legal and technical access in China by allowing a third party controlled by the Chinese state to host the data and even hold the encryption keys – keys that not even Apple holds for iCloud accounts in other countries.
  • The FBI-Apple encryption dispute, sparked by the San Bernardino terrorism case. The FBI demanded that Apple not only give it access to existing tools (some or all of which was granted), but also develop new tools to defeat its own security protections. The FBI withdrew this demand after it found the capability commercially, so the demand has not been tested to completion in the courts. The Inspector General later investigated claims that the FBI already had tools that could access the device but demanded work from Apple anyway – presumably using a hot public issue in a widely covered terrorism case as a legal and social wedge to expand its authority to demand access to encrypted data – but the OIG’s report found no evidence of this.
  • “Exclusive: Apple dropped plan for encrypting backups after FBI complained” (local archive). Reuters reports that Apple caved to pressure from the FBI and left sensitive user data unencrypted and thereby accessible to the FBI. There is no smoking gun in the article, and it’s possible the plan to encrypt backups was dropped for another reason.
  • “International Statement: End-To-End Encryption And Public Safety” (local archive), in which the heads of law enforcement of seven countries (including the United States) state their belief that “tech companies should include mechanisms in the design of their encrypted products and services whereby governments, acting with appropriate legal authority, can gain access to data in a readable and usable format”. The request is not limited to CSAM, mentioning such material in the same breath as violent crime and terrorist propaganda.

Conclusion

I think this change is a mistake, and I hope Apple not only reverses course, but also commits to a strong stance against law enforcement access to encrypted data and personal devices.

When standing in opposition to the power of the state, we need a clear, bright line.


  1. The National Center for Missing and Exploited Children, a government-adjacent nonprofit and the only entity in the US which can legally possess CSAM. ↩︎
