Apple Walks Back CSAM Detection System…for Now

Mack DeGeurin
7 min read · Sep 16, 2021

Well, this is a change of pace. In a techno-futurist world often seemingly hellbent on forward movement in spite of the collateral damage, it’s rare to see resistance turn back the surveillance tide. Yet that’s exactly what happened last week, though, as we’ll get into, the change may represent more of an ephemeral reprieve than a codified decree.

As discussed at length in a previous issue of this newsletter, Apple recently revealed plans to release an image detection tool called NeuralHash, which works by scanning (though Apple disagrees with that framing) images on a user’s device and cross-referencing them against a database of known child sexual abuse material (CSAM) image hashes maintained by the National Center for Missing & Exploited Children.

If enough hash matches are flagged and a threshold of 30 images is met, the flagged content is reviewed by an Apple employee and, if necessary, passed along to law enforcement. This scanning applies to images being uploaded to iCloud, but the matching itself takes place locally on a user’s device, a marked…
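To make that threshold logic concrete, here’s a minimal sketch in Swift (chosen only because the system runs on Apple devices). Everything in it is an illustrative stand-in: the `CSAMMatcher` type, the plain string hashes, and the exact-match set lookup are assumptions for clarity, not Apple’s implementation, which uses the NeuralHash perceptual hash and a cryptographic private set intersection protocol rather than anything this simple.

```swift
import Foundation

// Illustrative sketch only: threshold-based matching of on-device image
// hashes against a database of known hashes. The names and the exact-match
// Set lookup are hypothetical stand-ins for Apple's actual system.
struct CSAMMatcher {
    let knownHashes: Set<String>   // hypothetical database of known CSAM hashes
    let reportThreshold = 30       // matches required before human review

    // Count how many of the user's image hashes appear in the known database.
    func matchCount(for imageHashes: [String]) -> Int {
        imageHashes.filter { knownHashes.contains($0) }.count
    }

    // Nothing is surfaced for review until the threshold is met.
    func shouldEscalate(imageHashes: [String]) -> Bool {
        matchCount(for: imageHashes) >= reportThreshold
    }
}

// Example: one match out of two images stays well below the 30-image threshold.
let matcher = CSAMMatcher(knownHashes: ["a1b2", "c3d4"])
print(matcher.shouldEscalate(imageHashes: ["a1b2", "ffff"]))  // false
```

The point the sketch mirrors is the one described above: below the 30-match threshold, nothing is flagged for an Apple employee to review.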

