LAUREN O’NEIL—After announcing new child safety features designed to detect Child Sexual Abuse Material (“CSAM”) on Apple devices, the world’s largest technology company by revenue faced significant backlash over user privacy. Apple planned to use the new technology to aid the National Center for Missing and Exploited Children (“NCMEC”) and law enforcement in the fight against online child exploitation and the distribution of CSAM. The primary concern was not the legitimate purpose of detecting exploitative material but rather the potential for misuse and overuse of the technology if it fell into the wrong hands.
So, what are these child safety features? First, Apple would scan photos uploaded to iCloud and cross-reference their image hashes against hashes of known illegal child pornography in NCMEC’s database. If the technology spots a match, the account is reported to NCMEC. Second, and more controversially, Apple would scan images sent and received in the Messages app for nudity and notify a parent if iCloud Family Sharing is enabled and the user is under the age of 12. Apple has since stated it would “take additional time over the coming months… before releasing these critically important child safety features,” but it has not announced a concrete plan to implement or pull back these features.
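To make the first feature concrete, the sketch below shows how perceptual-hash matching of this general kind can work. It is only an illustration: it uses the open-source imagehash library rather than Apple’s proprietary NeuralHash and private set intersection system, and the hash values, threshold, and function names are hypothetical.

```python
# Illustrative sketch of perceptual-hash matching (NOT Apple's actual system).
# Assumptions: the "imagehash" library is installed, the database entries are
# made-up hex strings, and MATCH_THRESHOLD is an arbitrary example value.
import imagehash
from PIL import Image

# Hypothetical stand-in for a database of known-CSAM perceptual hashes,
# stored as hex strings (in practice this list is curated by NCMEC).
KNOWN_HASHES = [imagehash.hex_to_hash(h) for h in ["d1d1b4b4f0e0c8c8"]]

MATCH_THRESHOLD = 4  # max Hamming distance treated as a match (assumed value)

def matches_known_hash(path: str) -> bool:
    """Return True if the image's perceptual hash is close to a known hash."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= MATCH_THRESHOLD for known in KNOWN_HASHES)

# A flagged upload would then go to human review before any report is filed.
if matches_known_hash("upload.jpg"):
    print("Flag account for review")
```

The key design point is that such a system compares compact fingerprints of images against a curated list of known material rather than interpreting the content of every photo, which is the limitation Apple emphasized in responding to privacy concerns.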
Many interest groups support Apple’s implementation of the features due to their widespread impact and “lifesaving potential for children.” On the other hand, some stakeholders believe Apple’s proposed program is an overstep into protected privacy interests because, until this technology was announced, Apple protected the “right to communicate privately without … censorship.” Additionally, they worry about the draconian implications if these features were repurposed. Apple characterizes these concerns as a misunderstanding of the features’ limits: photos are scanned only when stored in iCloud, and parental notifications apply only when iCloud Family Sharing is enabled.
Two partners at the boutique firm Brown Rudnick represent 34 women who allege that MindGeek, a dominant online pornography company, profited from non-consensual videos taken in the course of human trafficking. Many of the plaintiffs were underage at the time the videos were taken. Prior to this lawsuit, Mastercard, Visa, and Discover blocked credit card purchases on one of MindGeek’s websites amid allegations that the site depicted child exploitative material and non-consensual conduct. Although MindGeek updated its policies, its websites still do not require consent or age verification for those featured in their videos. Therefore, MindGeek may be complicit in the dissemination of CSAM.
While individual uploaders may face criminal charges, such as non-consensual dissemination of sexual images, the lawsuit demonstrates how private entities can use civil liability to hold human traffickers accountable. On one hand, consumers must balance privacy interests against the legitimate need for CSAM detection on smartphones. On the other hand, few privacy violations are greater than MindGeek’s for-profit dissemination of non-consensual, explicit content that will “haunt” the victims forever.
One solution to this problem might be found in private, non-profit organizations leading the charge and serving as intermediaries between large companies and law enforcement. For example, Thorn, an international anti-human-trafficking organization, partners with tech companies, law enforcement, and non-governmental organizations to eliminate CSAM from the internet. Like Apple, Thorn works closely with NCMEC and has partnered with companies such as Google, Facebook, and Snapchat. In its statement addressing Apple’s decision to pause its child safety feature rollout, Thorn expressed frustration, noting that Amazon, Google, Facebook, and Snapchat have already joined the fight against child exploitation.
One question remains: is the privatization of the fight against human trafficking and child exploitation the best way forward? As technology advances, the distribution of CSAM follows. In 2020, NCMEC received reports of 33.6 million images and 31.6 million videos of CSAM circulating online. Over 98% of those reports came from private entities, many of which utilize image hashing and matching technology. These reports are paramount to helping law enforcement agents find and rescue victims of child exploitation and human trafficking. Julie Cordua, CEO of Thorn, framed the issue best when she said, “[Abusers] win when we look at one piece of the puzzle at a time,” because companies’ complicit responses to CSAM on their platforms create an impenetrable barrier for law enforcement. Companies like Apple and MindGeek cannot continue to be complicit if consumers value the eradication of CSAM over abstract privacy interests. Collaboration among private entities, law enforcement, and intermediaries like NCMEC and Thorn offers the best chance to eliminate the distribution of CSAM while safeguarding our fundamental privacy rights.