West Virginia Attorney General JB McCuskey is framing his lawsuit against Apple as a defense of children. His press release and the legal complaint itself open with stark accusations, including the claim that Apple internally described itself as “the greatest platform for distributing child porn.” The suit highlights the disparity in reporting of child sexual abuse material (CSAM) to the National Center for Missing and Exploited Children (NCMEC): Google filed 1.47 million reports in 2023, while Apple submitted only 267. A closer examination, however, reveals a legal strategy that, if successful, could inadvertently shield child predators by undermining the legal foundation for prosecuting them.
The core issue isn’t Apple’s current practices, but the legal precedent McCuskey’s lawsuit seeks to establish. The complaint alleges strict liability for design defect, negligence, public nuisance, and violations of the West Virginia Consumer Credit and Protection Act, demanding Apple implement “effective CSAM detection measures,” specifically referencing systems like PhotoDNA used by Google and Meta, and Apple’s previously abandoned NeuralHash project. The problem is that a court order compelling Apple to implement such scans would likely render any evidence discovered through those scans inadmissible in court, due to Fourth Amendment protections against unreasonable search and seizure.
This isn’t a novel legal argument. As Stanford’s Riana Pfefferkorn explained in detail over a year ago in the context of a similar private lawsuit, the Fourth Amendment applies not just to government actors, but also to private entities acting as agents of the government. If Apple is compelled to scan iCloud for CSAM, it effectively becomes a government agent performing a search without a warrant. Any evidence obtained through that search would then be subject to the exclusionary rule, meaning it would be thrown out in court.
The lawsuit is described as a “first-of-its-kind government lawsuit,” targeting Apple’s “failure to detect and report CSAM on iCloud.” The argument rests on the idea that Apple’s design choices – specifically, its implementation of end-to-end encryption and its decision not to build comprehensive surveillance capabilities into iCloud – constitute a defect. This framing is particularly dangerous: it suggests that offering strong encryption is itself a tortious act, and that any company offering it must also build in backdoors for government access. Such a precedent would threaten the security of Signal, ProtonMail, and any other messaging app that prioritizes user privacy.
The complaint also takes issue with Apple’s short-lived NeuralHash project, announced in August 2021 and abandoned by December 2022. While McCuskey frames Apple’s abandonment of NeuralHash as evidence of prioritizing brand value over child safety, the security community’s strong reaction at the time stemmed from the inherent risks of building client-side scanning infrastructure. As security researchers pointed out, a system designed to scan for CSAM could easily be repurposed to scan for other types of content, creating a powerful surveillance tool susceptible to abuse.
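To see why researchers worried about repurposing, it helps to look at the shape of a hash-matching pipeline. The sketch below is illustrative only: the names are made up, and a cryptographic hash stands in for the perceptual hashes (which tolerate re-encoding and minor edits) that systems like PhotoDNA and NeuralHash actually use.

```python
# Minimal, hypothetical sketch of hash-based content scanning.
# Real systems use perceptual hashes rather than SHA-256 so that
# resized or re-encoded copies of an image still match.

import hashlib

# The provider holds only opaque hash values supplied by a clearinghouse;
# it never sees the underlying images. This value is a placeholder.
BLOCKLIST: set[str] = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def scan_upload(image_bytes: bytes) -> bool:
    """Return True if the uploaded image matches a blocklisted hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in BLOCKLIST

# The repurposing concern: nothing in this pipeline knows *what* the
# hashes represent. Substitute a list of hashes for political imagery
# or religious material, and the identical code becomes a general
# surveillance tool.
```

Nothing in the matching logic is specific to CSAM; the only thing constraining what gets flagged is whoever controls the hash list.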
Apple’s own director of user privacy and child safety, Erik Neuenschwander, articulated these concerns in a letter, warning that such a system would “create new threat vectors for data thieves” and open the door to “a slippery slope of unintended consequences,” including potential surveillance of political activity or religious persecution. He also highlighted the risk of false positives, where innocent users could be wrongly flagged as possessing illegal content.
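The false-positive risk follows from how perceptual matching works: hashes are compared by similarity rather than exact equality. A minimal sketch, assuming a hypothetical 64-bit hash and an arbitrary threshold:

```python
# Illustrative only: the threshold value is invented for this example.

def hamming_distance(a: int, b: int) -> int:
    """Count the differing bits between two 64-bit perceptual hashes."""
    return bin(a ^ b).count("1")

MATCH_THRESHOLD = 8  # lower = fewer false positives, but more missed matches

def is_match(upload_hash: int, known_hash: int) -> bool:
    return hamming_distance(upload_hash, known_hash) <= MATCH_THRESHOLD

# Two unrelated images whose hashes land within the threshold get flagged
# as the same content. Researchers demonstrated this against NeuralHash
# shortly after its 2021 announcement by constructing collisions:
# visually distinct images that produced identical hashes.
```

The threshold embodies an unavoidable tradeoff: tighten it and the system misses altered copies of known material; loosen it and innocent users get flagged.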
The complaint further argues that Apple’s privacy policy constitutes a material misrepresentation, claiming it misled West Virginia consumers. However, a review of Apple’s current privacy policy reveals that it doesn’t promise comprehensive CSAM scanning. It states that Apple “may” access personal data in certain circumstances, including to “prescreen or scan uploaded content for potentially illegal content, including child sexual exploitation material.” This language suggests a capability, not a commitment, and provides a degree of legal cover for Apple.
The current legal framework surrounding CSAM reporting is carefully constructed. Federal law (18 U.S.C. § 2258A) requires providers to report CSAM when they become aware of it, but expressly disclaims any duty to monitor users or affirmatively search for it. This distinction is crucial: it preserves the voluntariness of detection and avoids the constitutional pitfalls of compelled searches. The existing system relies on platforms voluntarily detecting and reporting material in a manner that maintains prosecutorial viability. McCuskey’s lawsuit threatens to dismantle this framework.
The involvement of outside private counsel, appointed as “Special Assistant Attorneys General,” raises questions about the motivations behind the lawsuit. It’s possible that the legal theory was prioritized for its settlement potential rather than its constitutional soundness. A quick payout for West Virginia, and a commitment from Apple to voluntary changes, might be the ultimate goal, regardless of the broader legal implications.
As Pfefferkorn noted, the reason this type of lawsuit hadn’t been filed before wasn’t a lack of complaints about Apple’s CSAM detection efforts, but a thorough understanding of the legal risks. The lawsuit risks creating a situation where the very evidence needed to prosecute child predators becomes inadmissible in court, effectively shielding them from justice. While McCuskey may have secured a headline, he has not advanced the cause of child protection. In fact, his actions could ultimately make it harder to hold offenders accountable.
