Apple’s Child Abuse Privacy Inflection Point

Mack DeGeurin
Aug 16, 2021

In a sense, Apple’s recent privacy debacle over the company’s decision to add a new child sexual abuse material (CSAM, for short) scanner to its products began a long time ago. Though the news made a splash last week, the features had reportedly been in development for months. These new features, which, as we’ll discuss below, may fundamentally move the goalposts on privacy writ large, might have slipped by largely unnoticed had a Johns Hopkins professor of cryptography not revealed the details. This practice of rolling out first and asking for forgiveness later is par for the course for Apple, according to coworkers and experts I’ve spoken to over the past week.

Explaining Apple’s new CSAM features

Let’s back up for a moment and explain exactly what’s caused the current privacy uproar. Last week, Matthew Green, the aforementioned professor, took to Twitter to sound the alarm over a new image-scanning tool that Apple was preparing to roll out, reportedly called neuralMatch.
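At a high level, Apple’s announced design compares hashes of a user’s photos against a database of known CSAM hashes, and only escalates an account for human review after a threshold number of matches. The sketch below is a deliberately simplified illustration of that match-and-threshold idea; the hash values, the countMatches helper, and the reportingThreshold figure are hypothetical stand-ins for illustration, not Apple’s actual NeuralHash algorithm or its cryptographic matching machinery.

```swift
// Illustrative only: a stand-in for a perceptual hash of an image.
typealias ImageHash = String

// Hypothetical database of known CSAM hashes (in the real system these are
// supplied by child-safety organizations, never hard-coded like this).
let knownHashes: Set<ImageHash> = ["a1b2c3", "d4e5f6"]

// Hypothetical threshold: an account is only flagged after this many matches.
let reportingThreshold = 3

// Count how many of a user's photo hashes appear in the known-hash set.
func countMatches(photoHashes: [ImageHash], against known: Set<ImageHash>) -> Int {
    photoHashes.filter { known.contains($0) }.count
}

let userPhotoHashes: [ImageHash] = ["zz99", "a1b2c3", "d4e5f6", "a1b2c3"]
let matches = countMatches(photoHashes: userPhotoHashes, against: knownHashes)

if matches >= reportingThreshold {
    print("Threshold reached (\(matches) matches): account flagged for human review")
} else {
    print("Below threshold (\(matches) matches): nothing is reported")
}
```

The key design point, and the source of much of the controversy, is that this matching happens on the user’s own device rather than on a server, which is what critics like Green argue fundamentally changes the privacy equation.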
