Child Sexual Exploitation #
This chapter contains frank discussion of criminal activity involving minors that many people will find challenging or upsetting. While we will not discuss details or show activities of child abuse, we will discuss real incidents and victims, including victims who ultimately committed suicide. The events described are based in reality, but some names and details have been adapted to protect victims’ identities.
***
Bzzt.
Sara’s heart dropped. It happened every time she heard her phone vibrate for as long as she could remember. She knew it could be him, asking for more. Her waking nightmare had started four years ago when she received an unsolicited message from a stranger who called himself Brian Kil or “BKil”.
*BKil: Hi Sara, I have to ask you something. Kinda important.*
*Sara3000: Who are you? What is your question?*
*BKil: How many guys have you sent dirty pics to ‘cause I have some of you?[^1]*
She wanted to ignore whoever this creep was, but then she got scared. What about that photo she had sent to her crush a few weeks back? Middle school was tough enough as it was. If her best friend Tania saw that embarrassing photo of her blowing a kiss, it would be social suicide.
*Sara3000: What do you want from me? Who are you???*
*BKil: Send me some nudes or I’ll show the others to your mom. And to Tania. You wouldn’t want that, would you?*
*Sara3000: How do you know my mom and Tania? Seriously, who are you and how did you get my pictures? I’m only 14. Leave me alone.*
*BKil: It doesn’t matter who I am if I could ruin you with one click. Send me the pictures and I’ll leave you alone. You disappear, I hit send.*
Sara felt she had no choice. She reluctantly took off her shirt. She put her camera on a timer so she could cover her breasts with both hands and hide her face, which was streaked with tears.
*BKil: Fine. Next time show more. And lose the tears.*
*Sara3000: You said you would leave me alone if I sent the pic!*
*BKil: lol dumb slut. Next time you’ll show more or I’ll tell your mom.*
For the next four years, BKil coerced Sara into sharing increasingly explicit images and videos by sending her threatening messages. He’d infested every part of her life: Facebook, Twitter, texts. Every time she thought he had disappeared, she’d receive a new message from an unknown user saying it was BKil and demanding she upload another video to Dropbox. She thought about killing herself to make it stop - BKil had even recommended it - but every time she considered death, she thought of her younger sister, Amina.
The abuse eventually tapered off after Sara turned 18. However, unbeknownst to Sara, BKil had recently sent Amina a message similar to the initial message he had sent to her. But when BKil threatened to text Amina’s mom about the explicit photos, Amina did not comply with his request. Instead, she decided to tell her mom about it.1 That’s when Amina’s mom called the police. That’s also when she learned that Sara had been targeted by BKil as well. “We didn’t understand how this could happen to our older child. Why didn’t she come forward? Why didn’t she tell us? Why? There were so many things going through my head. But this person master-manipulated her and brainwashed her to think that if she didn’t do what he told her, he would hurt her or our family.”2
The police had been tracking BKil for 21 months. They had received numerous reports of BKil coercing minors into sending him sexually explicit videos and photos by “threatening to murder, rape, or kidnap them if they didn’t comply.” In one case, BKil publicly posted naked photos of one of his victims along with a violent threat to her school that led to its temporary closure. Facebook had shared records related to BKil’s account with police, but BKil had used an anonymous email account and the Tor network to obscure his IP address. He frequently exchanged text messages with his victims after luring them on Facebook but never used a phone number that could be traced to him or reused the same device to log in to both his anonymous and real Facebook accounts.
The FBI told Amina and Sara’s mom that they had an idea about how they could catch BKil. But to execute the plan, they would need Sara’s help.3 Based on evidence that the FBI had collected, a judge authorized the FBI to place malware on BKil’s device to find his true IP address and, through it, his location. The next time Sara received a message from BKil, she worked with the FBI to upload a video to Dropbox that contained malware code. When BKil viewed the video containing the malware, it disclosed the IP address associated with his computer. After receiving the video, he continued as usual with violent messages to Sara, saying he would murder her family.
Figure 1: Exchange between the FBI, pretending to be Sara, and BKil. (Source: Criminal complaint4)
Figure 2: BKil threatening to kill Sara’s family. (Source: Criminal complaint5)
Due to Sara’s courage, law enforcement tracked down BKil, otherwise known as Buster Hernandez, aged 28, and arrested him in his home in Bakersfield, California. Further investigation showed that he had targeted hundreds of minors between 2012 and 2017. In February 2020, Hernandez pled guilty to 41 charges, including child pornography, threats and extortion, and witness tampering.6
Ignored but Ubiquitous #
Many issues have come and gone in the history of the internet, but the abuse of children has been consistent — and consistently under-addressed. No matter how innocuous and child-friendly technologies may seem, pedophiles will find ways to take advantage of them to exploit children. This phenomenon is particularly pronounced on social platforms.
The unfortunate truth is that any platform that allows people to move files around or allows them to interact with each other—especially through free, lightly authenticated accounts—will encounter the problem of child sexual exploitation.7 This goes well beyond social media photo-sharing platforms. Every platform from Lego to Facebook to Roblox to Google Drive to Fortnite has had to deal with the harsh reality of child harm.8 This problem has only become more serious in the era of Zoom and Google Meet, which allow high-quality streams to be simultaneously transmitted around the world to large audiences. As a result, abusers are now able to create live streams of child sexual abuse in which remote offenders can watch and interact in real time.
In the years that I have worked in Trust and Safety, online child sexual exploitation, more than any other area of online abuse, has demonstrated to me how far the priorities of the technology industry have drifted from a desire to build products that protect the most vulnerable. Parts of this chapter may be difficult to read, and you should feel free to skip sections that you find disturbing. However, since every internet platform will have to deal with the harsh reality of child harm, it is better to confront that reality head on than to ignore it and hope it will go away (it won’t). By proactively engaging with issues of child safety, companies can have a direct positive impact on people’s lives.
Taxonomy of Digital Child Sexual Exploitation #
Child Sexual Exploitation (CSE) is the umbrella term for all of the horrible sexually-related things that happen to children, both online and offline.9 The many ways this abuse manifests itself online can be placed in two broad, non-exclusive categories: CSE offenses and Child Sexual Abuse Material (CSAM).10 CSE, used in this narrower sense, is a catch-all phrase for the ways platforms and apps are misused to commit offenses against children, while CSAM refers to what one might usually think of as child pornography: the storage or circulation of any visual depiction of sexually explicit conduct involving a minor.11 CSAM ranges from photos of naked children running through sprinklers to photos depicting unimaginable abuse. While many laws governing CSAM still refer to it as ‘child pornography’, that term has been roundly rejected by a number of international advocacy groups in order to avoid any conflation with consensual adult pornography.12 In this chapter, we use terminology from the Luxembourg Guidelines, which attempt to standardize terms across various legal jurisdictions.13
While it is critical for companies to crack down on the creation and distribution of CSAM on their platforms, CSAM is a small piece of a complex ecosystem of technologically-facilitated child abuse. Vic Baines, a colleague of mine at Facebook who previously worked for Europol and the National Crime Agency in the UK, has written extensively on the topic. In one of her papers, she offers a taxonomy of the different ways that child exploitation manifests itself online and describes the interrelated activities facilitated by the internet that harm children: online grooming; travelling child sex offending (sex tourism); live streaming; sexual extortion; self-generated sexual content; contact abuse; and CSAM production, distribution, and possession.14
Figure 3: The many interrelated ways that CSE manifests itself online.15
Vic also distinguishes between CSAM production and CSAM distribution. CSAM production occurs in two primary ways: abuse is either recorded offline and then distributed online, or content is solicited or coerced online from children. Users distribute CSAM via social platforms, peer-to-peer file sharing, and the darknet. Distribution encompasses both commercial and non-commercial networks, and the business models of CSAM distributors continually change.
Many of these forms of abuse can feed into one another. For example, live streaming is by definition always connected to a contact offense, where a child is physically abused. Online grooming and solicitation, where adults reach out to children and build a relationship, is often used as a gateway to sextortion, in which the child creates sexual abuse material for the adult. Sometimes, online relationships evolve to the point where offenders meet their victims in real life, transferring the abuse to the non-virtual world. This can lead to kidnapping and trafficking, where live streaming then serves as a way to advertise children to potential abusers. For instance, when I worked at Yahoo, Sean Zadig and David Oxley led an investigation into a child sex-trafficking ring that was headquartered in Manila, Philippines, but that involved over a thousand individuals worldwide. While live streaming was one component, these offenders made most of their money from live sex tourism in Manila, where people would travel to assault the children who were advertised.
Sean and David tracked hundreds of Yahoo mail and Yahoo messenger accounts that were involved in the marketing, buying, and selling of child sexual abuse material. In the end, their investigation led to the rescue of dozens of children in the Philippines and the identification of over 1000 perpetrators worldwide, including 250 individuals who were traveling to the Philippines to abuse children in person.
More than “Just Images”: Impacts of CSAM on Victims #
In a statement that would end his career, Tom Flanagan, a former senior advisor to the Canadian Prime Minister, once said, “I certainly have no sympathy for child molesters, but I do have some grave doubts about putting people in jail because of their taste in pictures.”16 The sentiment underlying Flanagan’s statement is that the act of looking at a photograph or video, in contrast to the performance of an act of molestation, does not actually harm anyone. That sentiment, as many Canadian citizens recognized, is completely misguided because it ignores the horrific impact of CSAM on victims.17
If you remember one thing from this chapter, remember this: Every piece of CSAM can be linked to the exploitation of a child who was coerced or victimized. Far from innocuous bad taste in images, saving and sharing CSAM extends the worst moment of victims’ lives. Furthermore, viewing and sharing CSAM creates demand for more CSAM, which leads to further exploitation of children.
When Matthew Fanning, a retired New York City police officer, was sentenced to 10 years in prison for downloading two videos of young girls being raped by adult males, he claimed he “would never do anything to hurt anyone.”18 But, of course, his victims felt the pain. Fanning had downloaded a widely circulated CSAM video of “Vicky” (a pseudonym), whose father had repeatedly raped her starting at age 10, videotaped the abuse, and distributed it online. While in her 20s, Vicky described her experience of revictimization in a victim-impact statement:
“I wonder if the men I pass in the grocery store have seen them. Because the most intimate parts of me are being viewed by thousands of strangers, and traded around, I feel out of control. They are trading my trauma around like treats at a party, but it is far from innocent. It feels like I am being raped by each and every one of them.”19
She further described “enduring flashbacks, nightmares, and paranoia,”20 and explained that her recurring panic attacks had led her to withdraw from college. Like many victims re-victimized by their own CSAM, Vicky is notified every time her images appear at the center of a new case. Due to legal developments in the 1990s, these notifications allow Vicky to seek damages for counseling, lost wages, educational costs, and evidence gathering; they also afford her an opportunity to submit a victim impact statement. While this can be a beneficial process for some, others prefer not to be reminded and re-victimized as their images are shared forever. Vicky ultimately requested that she no longer receive updates that her images were still being circulated; the barrage of notifications was too much to bear.
The Lowest Common Denominator of Badness Online #
One of the biggest differences between CSE and other types of abuse we’ve talked about in the book is that nearly everyone agrees about what constitutes CSE and everyone agrees that it is bad. This difference is one reason that, as we will outline later in the chapter, the technology industry has produced significantly more advanced systems for dealing with CSE than for dealing with any other type of abuse.21 CSAM is illegal in virtually every country in the world, and while the definitions still vary somewhat between jurisdictions, they have slowly converged over time.
Due to heroic efforts by children’s rights activists, recent years have witnessed incredible growth in awareness about CSAM, as well as increased legislation protecting children globally. Since 2006, the International Center for Missing and Exploited Children (ICMEC) has tracked the presence — and absence — of anti-CSAM laws in 196 countries. When ICMEC started tracking, only 27 countries had adequate legislation to combat CSAM. ICMEC’s latest report in 2023 found that number had grown to 138 countries; only 10 countries lacked legislation related to CSAM.22 This uniformity allows companies to avoid, to a large extent, the kind of complexities that result, for instance, from trying to determine whether a certain piece of content qualifies as hate speech. Companies can create rules and enforce laws around the world that are reasonably consistent.
Nevertheless, despite the widespread agreement that CSAM ought not exist, it remains a persistent problem. Even in countries with well-functioning legal systems, governments struggle to enforce existing laws due to the massive scale of the problem. In 2023, the national clearinghouse that receives reports of CSAM from electronic service providers received 36.2 million reports, most of which involved the circulation of CSAM; 92 percent related to international users.23 Many of these reports were sent to international law enforcement in countries with varying government capacity, rule of law, and access to social services. And even as legislation globally has evolved to protect children, technology has continued to outpace it, creating new outlets and tools for offenders to abuse children.
A History of Digital Child Sexual Exploitation #
The exchange of CSAM long predates the internet. But the ease with which people can now create, edit, and trade it has revolutionized and massively increased the problem. People no longer need to go to dark rooms to develop their own film (they can simply use their iPhone); no longer need to physically move material across borders (they can transfer it electronically); no longer need to pay in cash (they can use Bitcoin); no longer need to engage in risky meet-ups with like-minded perpetrators (they can connect with other offenders in anonymous chat rooms); and no longer even need access to existing CSAM (they can make it themselves using some generative-AI models). All of the technologies that have made parts of our lives easier online have also made participating in the longstanding CSAM ecosystem easier.
CSAM trading first arrived online in the early bulletin board system (BBS) era of the 1980s. These bulletin boards allowed many users (such as myself) to connect with users across the country who shared their interests. Unfortunately, this included users looking for people with whom to share less wholesome interests. In the same way that I could connect with people in chats and forums and download files (slowly) related to cyberspace and hacking, pedophiles used BBS chats to exchange illicit images of minors.
BBS also resulted in the first criminal prosecution involving computer pornography, United States v. Thomas. In 1993, David Dirmeyer, a Memphis postal inspector who led child pornography investigations, received a tip from hacker Earl Crawley regarding an electronic pornography store on the Amateur Action Bulletin Board System (AABBS). Dirmeyer went undercover to investigate, downloading $55 worth of images that included bestiality, BDSM, and CSAM. Robert Thomas and his wife Carleen, who were operating the store out of California, were charged with interstate transport of obscene materials.24 This was the first time that transporting CSAM electronically across state lines was treated as legally equivalent to physically mailing illegal material.25
Even though BBS and Usenet provided forums for exchange and a platform for pedophiles to normalize their behavior, technical limitations reduced the impact of BBS-enabled CSE. Collections could only be created from low-quality images rather than videos, downloading images was incredibly expensive, and even if one could afford the price of downloading imagery, storage space was limited to floppy disks. Moreover, while the ability to create anonymous handles gave many users the perception of privacy, they were still using their home phones for transactions and, in the event of a purchase, shared their credit card information to pay. Likewise, providers were required to register with a phone number. In short, the technical limitations made it both more difficult to share CSAM and much easier for law enforcement to crack down on activity.
Around this time, notable child abductions of the early 1980s impelled Congress to pass the Missing Children’s Assistance Act.26 Shortly afterward, U.S. President Ronald Reagan joined Adam Walsh’s parents Revé and John Walsh and other children’s advocates to establish the National Center for Missing and Exploited Children (NCMEC). NCMEC collaborates with various stakeholders, ranging from families to private industry to law enforcement, to provide services to deter and combat the sexual exploitation of children. It also serves as a national clearinghouse for CSAM, which is otherwise illegal to hold.27 Though NCMEC is a private non-profit institution, it enjoys quasi-governmental privileges, as laid out in two authorizing statutes that mandate its collaboration with federal, state, and local law enforcement.28
The internet and the World Wide Web globalized CSAM and CSE. In 1993, with the advent of the Mosaic web browser (the first widespread graphical portal to the Web), anyone could create a picture gallery, forum, or file sharing site, including those who wanted to share CSAM. Meanwhile, hard drives evolved to enable bigger image collections. In addition to opening up additional gateways for finding imagery, this era saw the first substantial appearance of video CSAM.
The other important technological revolution that happened during this era was the rise of digital photography. If a child abuser wanted to create CSAM in the 1960s using an eight- or sixteen-millimeter camera, he would either need to develop the film himself or run the risk of being caught by outsourcing development to a third party. Photo companies during this era would often find CSAM while developing people’s photos and call the cops, who would be there to arrest the person when they showed up to pick up their film. With the explosion of digital filmmaking devices in the late 90s and early 2000s–from big, expensive digital single-lens reflex cameras (DSLRs) to the eventual integration of cameras into phones–the task of creating CSAM became much cheaper and less risky for perpetrators.
During this same period, peer-to-peer platforms, which allowed users to share files directly rather than download them from a central server, made it much easier for users to distribute CSAM. Platforms like Scour and Gnutella allowed users to directly share CSAM with one another; however, those early peer-to-peer technologies provided no privacy at all, as people advertised their files to everybody on the network under their real IP addresses. As soon as law enforcement realized that CSAM was being shared on these networks, they had little difficulty identifying and locating the distributors.
This started to change in the later 2000s with the invention of Tor hidden services and other darknet technologies that allowed people to trade images in a cryptographically secure way.29 This development was compounded by the advent of Bitcoin and other cryptocurrencies, which facilitated anonymous financial transactions, resulting in an increase in CSE-related commercial transactions. In some darknet spaces, new CSAM became a currency in and of itself.
Richard Huckle, a man sometimes called “Britain’s worst ever pedophile,”30 ushered in a new era of sexual abuse with his use of Bitcoin and Tor. Between 2006 and 2014, Huckle posed as an aid worker in Kuala Lumpur, Malaysia, grooming and abusing over 200 children, ages 6 months to 12 years old. He uploaded photos and videos of his exploits on the Dark Web for a mutually-reinforcing pedophile network and exchanged access to gruesome content for Bitcoin.31 Worse, he prepared a pedophile manual, “Paedophiles and Poverty: Child Lover Guide,” that encouraged child molesters to target impoverished kids, who were “much easier to seduce than middle-class kids” and provided tips about how to avoid detection.32 Exploiting power dynamics in the Global South is a growing trend among sex offenders, especially as millions gain access to mobile phones.
All of a sudden, nearly everyone in the developed world is carrying an entire digital darkroom in their pocket, as well as the means to share that content over end-to-end encrypted channels. And that is where we are now, a place that would have been totally unrecognizable to people even in the early-to-mid 2000s. The technologies that we have built up, which offer significant advances in security and safety, have also made CSAM trading much safer for perpetrators, because every single step can occur on a single device.
Table 1: Number of CyberTipline Reports received by NCMEC between 1998 and 2023. (Source: NCMEC33)
Unhindered by geographic boundaries and empowered by encryption and fast internet, offenders trade and share CSAM at alarmingly increasing rates. Reports of CSAM and CSE have skyrocketed over the past decade. In 2023, there were more than 36 million reports to the National Center for Missing and Exploited Children (NCMEC)’s CyberTipline, compared to one million just six years earlier. Part of this increase in reporting is due to the increased capacity of online service providers to detect and report CSAM–between 2017 and 2018, companies began extensively using hashing technology such as PhotoDNA (see below)–but there is little doubt that the problem has grown worse in recent years.34
Platforms face additional challenges as both adults and children use generative AI models to create CSAM – often based on children they know or non-AI CSAM that has been circulating for decades – and then share this content on social media. There are also recent trends of children using mainstream social media platforms to facilitate the sale of self-generated CSAM.35
Social Media, Live Stream Abuse, and Video Games #
Modern-day perpetrators leverage multiple technologies and platforms to take advantage of children, commonly using social or gaming platforms to make initial contact with minors before encouraging them to move to more secure messaging platforms. Since MySpace reached one million active users in 2004, social media use globally has skyrocketed, providing pedophiles direct access to children at scale. Facebook’s user base, for example, increased from 1.5% to 30% of the global population between 2004 and 2018.36 Children may be groomed into sharing sexual images or videos they have taken themselves, which can rapidly develop into a sextortion or blackmail scenario. In 2016, the Justice Department identified sextortion as “by far the most significantly growing threat to children.”37 In the early 2020s, financial sextortion originating in West Africa became increasingly common, and it has since been transformed by AI. Criminals no longer need to invest time and energy in developing fake romantic relationships to trick children into sending nude images or videos. Instead, they can send an initial message that includes an AI-generated nude image of the target and threaten to share it with friends and family unless the child sends money. Even when the child knows the image is fake, they may fear that the people they know will not be able to tell the difference.
Figure 4: Social media users over time. (Source: Our World in Data38)
More recently, live-streaming, in which offenders stream sexual abuse of children using webcams so that others can watch in realtime, has become an established reality.39 Apple’s FaceTime, Facebook, Omegle, Skype, and Zoom have all dealt with CSE offenses,40 and in 2023 NCMEC received 11.2 million unique videos through the CyberTipline from electronic service providers.41
Table 2: In 2021 electronic service providers submitted about 29 million reports. In 2023 that number was almost 36 million. (Source: NCMEC42)
Livestream abuse is especially difficult to stop, since the content is gone as soon as the abuser shuts off the camera. To catch live stream abuse, officers often have to go undercover, as Janelle Blackadar, a detective with the Toronto police, did in 2015 as part of a 3-year international investigation called “Project Mercury.”43 Posing as a consumer of CSE, Blackadar joined a Zoom call in which a Philadelphia man violently sexually assaulted a 6-year-old boy while viewers watched and encouraged him. Authorities rescued the boy the next day and arrested 14 men across the United States.
As video games become increasingly social, involving both in-game communication and communication on adjacent platforms, predators have begun using them as access points to young people. Mary Anne Franks, a law professor and president of the Cyber Civil Rights Initiative, calls these virtual spaces “hunting grounds.” Some sites acknowledge the risk of exploitation explicitly: Omegle, a chat app with the headline “Talk to Strangers!” has a warning that reads “Predators have been known to use Omegle, so be careful.”44 Predators spark conversations with children to build trust, sometimes pretending to be children themselves. Once they convince children to share one photo, they blackmail them for even more imagery. When law enforcement in New Jersey decided to crack down on the issue by going undercover as children to find predators in 2018, they arrested 24 people in less than a week. The offenders had used platforms such as Fortnite, Minecraft, and Roblox to connect with undercover police posing as children before migrating to secure chat apps and often arranging to meet in person.45
Modern Responses to Known CSAM #
Of all the abuse types discussed in this book, the exchange of CSAM has the most standardized and coordinated automated detection mechanisms across the large platforms. Hundreds of companies scan for child sexual abuse material in concert with the National Center for Missing and Exploited Children (NCMEC).
U.S. Legal Framework #
Unlike some of the forms of abuse we’ve discussed so far in this book, CSE content is unambiguously illegal. In the United States, the relevant statute effectively states that electronic service providers that are aware of child sexual abuse material are required not only to remove it, but also to report it to NCMEC.46 What is missing, however, is a requirement to look for it. For good legal reasons related to the Fourth Amendment, there is no requirement to detect or scan,47 a fact which the statute states explicitly. So there is no affirmative responsibility to look for CSAM, but once a company is aware of it, the company is required to report it.
What does it mean to “become aware” of this sort of content? Companies can become aware either because content is reported or due to proactive scanning and monitoring activities. The level of confidence necessary to meet the standard in both situations can be legally ambiguous and a source of risk.
Importantly, these removal and reporting requirements override any kind of immunity, including that granted under Section 230. While Section 230 provides intermediaries with immunity from civil liability (lawsuits initiated by private persons) for third-party content, it does not provide immunity from criminal liability.
Where Do Companies Report? #
In the United States, companies are required to report CSAM to NCMEC through a reporting mechanism called the CyberTipline. The tipline, which was created in 1998 in response to rising reports about the exchange of CSAM on the internet, is an electronic reporting system that can be accessed directly through NCMEC’s website. While members of the public can fill out a form to report an individual incident, the vast majority of reporting comes from mandatory reporting by Electronic Service Providers (ESPs).
In addressing CSAM, large platforms must balance proactive detection against the privacy rights of their users. Most companies implement automated detection using a tool called PhotoDNA, a technology that recognizes photos and compares them with a known database of illegal images. NCMEC converts known images of CSAM into long strings of letters and numbers — hashes — giving each image a unique signature. Once the images are hashed, companies can use the hashes to detect and remove copies of the images without having to maintain their own databases of CSAM, which would be illegal. If the companies detect any illegal images, they report attempted uploads to NCMEC, which reviews the reports in search of information to assess risk to victims and determine the location of the incident. NCMEC then refers the case to the appropriate law enforcement agency.48
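To make this workflow concrete, here is a minimal sketch of how a platform might wire hash matching and report queuing into its upload path. Everything here is an illustrative assumption: the function names and report fields are hypothetical, and an exact cryptographic hash stands in for a licensed perceptual matcher such as PhotoDNA (which, unlike SHA-256, also catches slightly altered copies).

```python
# Illustrative upload-screening sketch; not a real PhotoDNA or NCMEC integration.
import hashlib
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class MatchReport:
    content_id: str
    uploader_id: str
    matched_hash: str
    detected_at: str

# Fingerprints of previously identified CSAM, obtained through a hash-sharing
# program. Storing hashes (not images) is what keeps this lawful for a platform.
KNOWN_HASHES: set[str] = set()

def compute_fingerprint(image_bytes: bytes) -> str:
    # Stand-in: an exact hash. A real deployment would use a perceptual hash
    # (e.g., the licensed PhotoDNA SDK) so that resized or re-compressed copies
    # of known images still match.
    return hashlib.sha256(image_bytes).hexdigest()

def screen_upload(content_id: str, uploader_id: str, image_bytes: bytes,
                  report_queue: list) -> bool:
    """Return True if the upload may proceed, False if it was blocked."""
    fingerprint = compute_fingerprint(image_bytes)
    if fingerprint in KNOWN_HASHES:
        # Block the upload, preserve evidence per counsel's guidance, and
        # queue a CyberTipline report for the child-safety team to submit.
        report_queue.append(MatchReport(
            content_id=content_id,
            uploader_id=uploader_id,
            matched_hash=fingerprint,
            detected_at=datetime.now(timezone.utc).isoformat(),
        ))
        return False
    return True
```

In practice, the matching, evidence-preservation, and reporting steps each carry legal requirements, which is one more reason to involve counsel and NCMEC early.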
Grading CSAM #
All CSAM content is illegal, but there are gradations in the horribleness of imagery. There is obviously an important difference between a video showing the live sexual abuse of a baby and a self-taken naked photograph that a 17-year-old sends to their partner. Companies use a general standard to label content in a way that is consistent, which they then communicate to NCMEC. This standard divides cases by the age of the child (visibly pre-pubescent versus pubescent), as well as whether the child is involved in a sexual act or in “lascivious exhibition.” Companies vary in how they apply these definitions, but “lascivious exhibition” generally describes images that are meant to create some sort of sexual desire (e.g., a child sitting nude on a seat with the camera angled to highlight their genitals). Images in which children are simply nude (e.g., a child running on the beach whose trunks have fallen down as he comes out of the water) are not necessarily lascivious. The image must be intended to arouse. Pretty much every major company forbids material that falls into any of these four categories, but the category will sometimes inform how aggressively companies scan for and report such material, and it certainly affects the aggressiveness with which law enforcement pursues the issues that are reported.
| | A (Prepubescent) | B (Pubescent) |
---|---|---|
1 (sexual act) | Prepubescent minor, engaged in a sexual act | Pubescent minor, engaged in a sexual act |
2 (lascivious exhibition) | Prepubescent minor, engaged in lascivious exhibition | Pubescent minor, engaged in lascivious exhibition |
Table 3: Standard content labeling procedure. Source: NCMEC
People often ask whether companies are legally required to report artistic depictions of CSAM. The definition of child pornography (in 18 U.S.C. § 2256) includes “Any visual depiction of sexually explicit conduct involving a minor,” which covers photos, video, and computer-generated images. In the past, this also included more artistic representations, such as cartoon porn (e.g., creating CSAM out of Bart Simpson images). However, the requirement to report such imagery was written out of the CyberTipline statute (18 U.S.C. § 2258A) in 2018. The severity and volume of actual live abuse images are so high that law enforcement would simply not have the bandwidth to prosecute lower-impact cartoon cases.
What Happens When You Report to NCMEC? #
Figure 5: CyberTipline Workflow (Source: NCMEC)
NCMEC receives millions of CyberTipline reports. When a report arrives, NCMEC first evaluates the urgency of the report. Is there imminent danger to a child? Is this an enticement case? Is someone about to travel to meet a child for sex? Is someone extorting a child to send more images or perhaps abusing a younger child in the house? Such high priority cases are processed and sent to the appropriate law enforcement agency (whether in Idaho, Ireland, or South Africa) within hours.
At its best, the system works as it did in the Dabbs Postma case. In 2017, Facebook alerted NCMEC of a Facebook Messenger exchange between two users containing hundreds of photos and videos of child sexual abuse. NCMEC directed the report to law enforcement, who opened a criminal investigation and issued a subpoena to track the IP address of the perpetrator. Police arrested 44-year-old Dabbs Postma in Tampa and found CSAM in his home, including at least one video of him performing sex acts on an adolescent girl.49 Postma admitted to producing child pornography and having a sexual relationship with the girl.50
While cases such as Postma’s reveal the current system’s merits, there are several limits to the system as it stands. First, the massive volume of material makes it impossible to follow through on enforcement for all cases. NCMEC has been working to advance its case management tools and prioritize the most dangerous cases, but the scale is enormous and increasing. In 2023, the CyberTipline received 36.2 million reports related to suspected child sexual exploitation, including CSAM, child sex trafficking, and online enticements.51 Reports included more than 105 million videos, images, and files. There is not enough law enforcement in the world to tackle all the reported cases.
Colleagues and I spent a year investigating frustrations that platforms and law enforcement have with the entire CyberTipline pipeline.52 While respondents uniformly felt that the system is important and worth nurturing, we heard that law enforcement officers are overwhelmed with the volume of CyberTipline reports, and feel unequipped to accurately triage the reports to focus on those most likely to lead to the rescue of a child. Triage challenges are due to: 1) online platforms submitting low-quality reports that lack sufficient information for an investigation, 2) challenges NCMEC faces in rapidly improving their technology, and 3) legal constraints on NCMEC and U.S. law enforcement that make it difficult for them to quickly view files attached to CyberTipline reports. Law enforcement are often required to get a warrant before opening a file attached to a report unless a person at the platform viewed the file. Platforms may not always choose to expose moderators to this content, particularly if it’s a hash match to known CSAM. NCMEC similarly cannot open U.S. files unless the platform viewed them first.
Furthermore, the user bases of U.S.-based companies are truly global. In 2023, NCMEC referred 92 percent of the cases to international law enforcement.53 This makes the CyberTipline a global clearinghouse in addition to a domestic one. (See the table below for reports by country.) Importantly, American companies are obligated to report to NCMEC even if they also report the content to authorities in the country where it originated: even if they report CSAM to Parisian law enforcement, they are still required to report it to NCMEC. Our research suggests that while U.S. law enforcement officers struggle to handle the volume of CyberTipline reports, other countries – and particularly developing countries – face even more obstacles. Law enforcement outside the U.S. are often unable to quickly obtain additional information from U.S. platforms that would be needed to pursue investigations.54
Country / Territory | Reports |
---|---|
India | 8,923,738 |
Philippines | 2,740,905 |
Bangladesh | 2,491,368 |
Indonesia | 1,925,549 |
Pakistan | 1,924,739 |
United States | 1,132,270 |
Saudi Arabia | 833,909 |
Turkey | 817,503 |
Algeria | 762,754 |
Iraq | 749,746 |
Mexico | 717,468 |
Egypt | 627,802 |
Colombia | 602,660 |
Brazil | 567,985 |
Morocco | 543,638 |
Vietnam | 533,236 |
Thailand | 432,468 |
Nigeria | 339,660 |
Yemen Arab Republic | 320,910 |
France | 310,519 |
Table 4: The 20 countries with the most CyberTipline reports in 2023. While NCMEC is a little vague on what these numbers mean, they likely refer to the number of CyberTipline reports where the CSAM uploader was located in the specified country. These numbers do not accurately indicate the level of child sexual abuse in a particular country, just the volume of reports. Additionally, country-specific numbers can be impacted by the use of proxies and anonymizers. (Source: NCMEC55)
In 2023 more than 80% of reports to NCMEC were made by Facebook and Instagram.56 This is natural, given the scale of companies like Meta that provide everything from chat to email to messaging to photo-sharing. Meta and other large companies not only have huge global user bases, but they also undertake massive voluntary, often automated, initiatives to screen and report content. Facebook, for example, scans every single image that transits its network, including those sent over private messenger (with the exception of pictures sent over WhatsApp, which is end-to-end encrypted). While Facebook has the largest number of users globally, it also devotes the most resources–both technology and personnel–to the task of detection. If every company performed the same level of detection, NCMEC would probably receive hundreds of millions of reports. Many platforms, however, don’t possess the necessary resources, or they choose not to spend them to address this issue.
For example, Dropbox and Google scan data only when users attempt to share it. Perpetrators have easily evaded such methods by sharing login information for Dropbox accounts storing CSAM rather than sharing the images directly. In general, the benefits of proactive detection are not always obvious to companies, and aggressively screening content can negatively impact companies’ bottom lines–requiring companies not only to shut down accounts that harbor illicit material, but also to spend money to hire teams and develop technical tools for detection.
Until recently, NCMEC did not disclose reporting details from individual companies because they did not want to disincentivize companies from proactively looking for CSAM and evidence of CSE on their platforms. However, in 2019, NCMEC began sharing reporting numbers from various companies. In 2021, for instance, NCMEC received more than 22 million reports from Facebook, 512,000 from Snapchat, and 86,000 reports from Twitter.57 When considering these numbers, remember that there is a big difference between top-line detection of adversarial behavior and actual prevalence. Every company has to strike its own balance between privacy and security.
Platform Responses #
Fighting sexual exploitation online requires a multidisciplinary, multi-tiered approach. While companies can develop technical tools to detect illegal material, removing material is only half the battle. Ultimately, the goal of every company should be to remove the victims from immediate harm, which is only possible through collaboration with diverse stakeholders from NCMEC to global law enforcement and NGOs. Scalable big data approaches are important but must be combined with manual investigations, as it is nearly impossible to automate expertise and contextual clues. Industry partnerships are key, and direct engagement with law enforcement and NGOs drastically increases the chances of successful outcomes.
Policy Responses #
Remember, CSE is an inevitability. Any platform that allows people to share files will eventually end up with a CSAM case. It is far better for a company to adopt proactive policies than to scramble to address an incident in the 48 hours after it is reported.
Reach Out to NCMEC to Have a Conversation #
Early in a company’s development, it should reach out to the National Center for Missing and Exploited Children (NCMEC) to have a conversation about steps to keep children safe on the company’s platform.58 NCMEC has dedicated points of contact who serve as representatives for ESPs. These representatives will gather whatever resources might be helpful to a new company and can help a company establish a process for reporting CSAM. They can also help established companies expand their detection and reporting practices. Furthermore, NCMEC can connect the company with appropriate resources that might be relevant to its particular country or jurisdiction, since NCMEC collaborates with hundreds of organizations around the world.
ESPs can opt into NCMEC’s hash-sharing initiatives, which comprise hashes from previous reports, ESPs, and NGOs. Beyond known child sexual exploitation hashes, NCMEC also compiles hashes of content that is sexually exploitative but may not meet the legal definition of CSAM. These millions of hash entries can be used with PhotoDNA (discussed below) or similar technology to screen for content on servers and platforms.
Beyond hash-sharing initiatives, NCMEC hosts bi-annual roundtables for ESPs and domestic and international law enforcement to learn about CSE trends, the CyberTipline process, and outcomes of law-enforcement investigations. Even for relatively small and young companies, NCMEC can provide insights on how to design your product safely from the start.
Yiota Souras, General Counsel at NCMEC, says that the biggest mistake that people starting companies make related to CSE is that they simply don’t give any thought to these issues until it is too late. Founders are focused on building the product, getting their finances in order, and putting the product into users’ hands. As a result, companies that experience CSE-related issues (nearly every company) will often find themselves in a situation in which it is too late to go back and ask “how do I build in some safety mechanisms?”
Create Detailed Policies for the Relevant Types of CSE #
Companies should explicitly ban child nudity and exploitation in their terms of service. They should also make users aware that cases will be referred to law enforcement. Almost all U.S.-based platforms hosting user-generated content explicitly disallow any content that is illegal in the United States, including CSAM, and most maintain their own content policies that go well above and beyond what is required by US law.
However, such rules can sometimes be hard to operationalize. Take one high-profile example: the Pulitzer Prize-winning “napalm girl” photo from the Vietnam War, featuring the naked nine-year-old Kim Phuc running away from a napalm attack. The writer Tom Egeland shared this photo on Facebook as part of a series of seven that he said “changed the history of warfare.” Facebook deleted the post and suspended Egeland’s account. At the time, Facebook had a zero-tolerance policy for naked photos of children, even if the photos did not match the legal definition of child pornography. This rule both prevented people from pushing the boundaries of permissibility and was much easier for the company to enforce.
Figure 6: The “napalm girl” photo by Nick Ut. (Source: AP/Fotografiska59)
Facebook’s decision to delete the post was criticized by many, including Norwegian Prime Minister Erna Solberg. Solberg herself shared the image, noting that she appreciated “the work Facebook and other media do to stop content and pictures showing abuse and violence … But Facebook is wrong when they censor such [important, historical] images.”60 This post was deleted by Facebook, to which she responded, “It is highly regrettable that Facebook has removed a post from my Facebook page. What they achieve by removing such images, good as the intentions may be, is to edit our common history. I wish today’s children will also have the opportunity to see and learn from historical mistakes and events.”
A number of Europeans balked at the company’s American puritanism and effectively pressured Facebook to make an exception for the photo.61 Any platform that allows the sharing of images will need a policy to address such cases, as well as a range of related cases. For example, should a company permit a mother to share a photo of her naked baby with the child’s grandmother? How should it handle inappropriate comments on innocent images or videos that sexualize minors? All of these possibilities should be considered ahead of time, and definitions and limits should be laid out as clearly as possible in the Terms of Service. Addressing these issues ahead of time is important, as it allows a company to be consistent with its users. Of course, ultimately, the terms will need to be enforced with a mix of automated and human engagement.
Hire Policy Teams with Legal Expertise #
Companies should hire lawyers and policy teams who understand the specific legal carve-outs for companies that handle CSAM. For small startups, it is worth investing in outside counsel with experience related to Section 2258A and NCMEC reporting. The average lawyer will not understand the complexities around CSAM, and even an initial consultation with an experienced attorney in this area can cover considerations that a company will need to address regarding data privacy, technical tools, and workflow.
Engage with Coordinating Bodies #
CSE is a global phenomenon that requires a global response. Content and illegal activity do not respect national borders. Governments, companies, NGOs, helplines, and tiplines all need to work together.
Figure 7: Multi-stakeholder engagement is essential to addressing online child sexual exploitation. (Source: Victoria Baines62)
Many of the actors in this admittedly complicated ecosystem gather together each summer in Dallas, Texas for the Crimes Against Children Conference. This conference, which predates the Internet, was once primarily about in-person abuse and kidnapping of children. Now, however, it is dominated by the Internet. While the conference is difficult emotionally–it is the only conference at which I have cried during the keynote–it also affords companies an excellent chance to connect with people from the wide variety of institutions that work to protect children.
Companies should also seek out and engage with coordinating organizations. For instance, since 2006, the global Tech Coalition has brought together leaders in technology to collectively work on child safety issues and provide best practices for younger companies. The Tech Coalition recently launched Lantern, an initiative to facilitate signal sharing about child sexual exploitation among platforms.63 Beyond NCMEC there are several other major organizations working on child safety of which companies should be aware and with whom they should build relationships, including Canada’s Project Arachnid, Thorn in southern California, and several in Europe.
Investigative and Emergency Teams #
Currently, technology companies vary greatly in the personnel they devote to investigating child harm. All companies that house public interactions, especially those that could include children, should have investigative teams trained to evaluate child harm. These teams should have direct connections to organizations such as NCMEC and its counterparts in other countries. Teams should investigate reports and clear them as quickly as possible to increase the probability that law enforcement is able to remove the child from harm’s way. At a mature, large company, these teams will include people with diverse professional experience, including people who have previously worked in law enforcement related to CSE.
Publish Transparency Reports on Child Harm #
Presently, companies are sometimes reluctant to engage in active detection because doing so may lead to a large number of reports of CSAM, which they fear will make the platform look like a den of iniquity. Detection, of course, does nothing to change the underlying facts of the matter, and, indeed, companies that fail to publish transparency reports about child harm on their platforms should be viewed with suspicion. As an industry, we need to set a norm that it is a sign of strength and active enforcement to report high numbers to NCMEC. Public pressure has led to an increase in reporting over the last few years from Google to Pornhub, a promising trend.64
Create Awareness Programs for Children and Parents #
The public tends to lack awareness of CSE on platforms, especially those that they associate with seemingly innocent games or educational tools. One way that companies can help combat this lack of awareness is to develop educational material for children and adolescents and perform outreach in their communities. While many schools teach sex-ed, few properly inform their students about how to protect themselves from online exploitation.
In most cases, there is no realistic way for a child victim to take any steps to prevent their sexual abuse at the hands of an adult. However, in a minority of cases, especially ones involving strangers and virtual contact, properly educating potential victims could reduce the horrible impact of these crimes. Sextortion and grooming are often seeded by an initial contact from a stranger who will use lies, inducement, or threats to create a twisted relationship in which a child participates in their own abuse. Educating children, and especially teenagers, about the existence of these abuses and the options available to them can help reduce the effectiveness of these predators’ initial approaches.
I have, for instance, visited my own kids’ schools to give talks about the potential dangers of social media. While many of the squirming 13-year-olds in the auditorium find such talks to be quite awkward–perhaps none more so than my own child–the talks have an obvious impact. Even in a one-hour session, there is often a marked change in the attitude of the students by the end of the talk.
While individuals have long used Photoshop to add nude bodies to faces of people they know, AI has made this process more accessible. Today, hundreds of “nudify” websites and apps exist, and many permit users to make nude images of children. In 2024 there were a handful of high-profile incidents where students used these services to make nude images of their peers.65 Some advocate for more school-based education about the consequences – including legal consequences – of this behavior. Others worry about the risk of inadvertently educating students about the existence of these apps. A few schools focus more broadly on teaching digital consent, but many have yet to address this issue at all.
The most important thing we can teach our kids is that there are adults who love them no matter what they have done, and that they can always reach out for help. The most tragic CSE situations with which I have been involved, in which the victims ultimately took their own lives, are those in which the victim did not feel that there was an adult in their life to whom they could reach out for help. Letting children know that adults will respond with concern, rather than anger, when they are truly in trouble is crucial. If children feel safe confiding in adults when they are in trouble, we can help extricate them from abusive situations.
Create Resiliency Programs for Employees #
It is crucial that companies recognize the mental toll such investigations can have on employees. If a company is dealing with CSE on a regular basis, it needs to consider creating resiliency programs for the employees directly responsible for handling the investigations. At Facebook I had a full-time team dedicated to child safety investigations, many of whom experienced significant emotional distress. Some people started hurting themselves. Others experienced suicidal ideation. One employee became violent in his personal life.
It’s harder than you might think to put in place support structures for your employees, since companies cannot force employees to talk to psychologists.66 Rather, companies must create a framework in which it’s both accepted and welcome that employees will reach out for extra help when needed, and in which access to help is readily available.
Nobody understands the toll of this work better than NCMEC analysts, who spend hours each day looking at CyberTipline reports to try to identify victims. This is why, from the beginning, NCMEC designed a safeguarding program that includes staff and external therapists who work with the team in both individual and group sessions. Some of their programs for employees are mandatory, while others are voluntary therapeutic sessions. NCMEC also offers assistance to smaller companies starting resiliency programs (another reason that companies should get in touch with NCMEC early in their lifecycle).
Product and Technical Responses #
Since CSE content is illegal in the United States and most other jurisdictions, image-based offenses can be addressed with some degree of automation. However, other activities, such as solicitation and grooming, are more contextual and require different, more active forms of response.
Provide User-Reporting Mechanisms #
At the very least, companies should provide reporting mechanisms for users. So far, the reporting mechanisms we’ve talked about in the book have been for the direct recipients of harmful material. Another, more complicated, kind of reporting allows people who may be depicted in content that has been traded to report it themselves.
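As a rough sketch of what such a mechanism might capture, the structure below distinguishes who is filing the report; the field names are hypothetical, not a standard schema.

```python
# Illustrative report structure; field names are hypothetical.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class ReporterRelationship(Enum):
    RECIPIENT = "recipient"        # was sent the material directly
    DEPICTED_PERSON = "depicted"   # believes they appear in the material
    GUARDIAN = "guardian"          # parent or guardian of a depicted child
    BYSTANDER = "bystander"        # encountered the material on the platform

@dataclass
class ChildSafetyReport:
    content_id: str
    relationship: ReporterRelationship
    description: str
    reporter_id: Optional[str] = None   # allow anonymous reports
    escalate: bool = True               # child-safety reports skip normal queues
```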
Image and Video Fingerprinting #
Beyond merely processing individual reports of CSAM from users, companies can also use technical tools to proactively detect CSAM. As noted above, hundreds of companies proactively scan for child sexual abuse material in concert with NCMEC, most commonly with PhotoDNA, a technology that recognizes photos and compares them against a database of known illegal images.
One of the main advantages of PhotoDNA, which Hany Farid (now a professor at UC Berkeley) developed in partnership with Microsoft, is that it can detect previously identified illegal imagery even if the image has been slightly altered.67 When an image is uploaded, it is turned into a square and the color is removed. An algorithm then finds the edges, which signify identifying features. These results are split into a grid, and each square is assigned a value based on its visual features. The values of the squares are then used to generate the image’s unique hash, or “fingerprint,” which is compared against fingerprints of known illegal images. If the fingerprint is sufficiently similar to one in the database, the system reports a match. PhotoDNA can perform millions of comparisons per second, even accounting for “subtle differences between images such as color changes, resizing, and compression.”68 In 2015, Farid updated PhotoDNA so that it could detect known instances of video CSAM in addition to imagery. Social media companies currently use this technology to detect millions of photos and videos that circulate on the internet.
Figure 8: Processing steps for PhotoDNA. (Source: Hany Farid69)
Figure 9: How PhotoDNA works. (Source: Microsoft70)
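PhotoDNA itself is proprietary and licensed only to vetted organizations, so the sketch below is a deliberately simplified toy in the spirit of the pipeline described above, not the real algorithm. The grid size, image normalization, edge detector, and match threshold are all arbitrary assumptions chosen for illustration.

```python
# Toy perceptual-hash sketch: square the image, drop color, find edges,
# summarize each grid cell, and compare fingerprints by distance.
# This is NOT PhotoDNA; all constants are illustrative assumptions.
import numpy as np
from PIL import Image

SIDE = 64                # normalize every image to a 64x64 square (assumption)
GRID = 8                 # split the image into an 8x8 grid of cells (assumption)
MATCH_THRESHOLD = 40.0   # arbitrary cutoff for "sufficiently similar"

def fingerprint(path: str) -> np.ndarray:
    """Return a 64-value vector summarizing edge energy in each grid cell."""
    img = Image.open(path).convert("L").resize((SIDE, SIDE))
    pixels = np.asarray(img, dtype=np.float32)
    gy, gx = np.gradient(pixels)          # crude edge detection
    edges = np.hypot(gx, gy)
    cell = SIDE // GRID
    values = [
        edges[r * cell:(r + 1) * cell, c * cell:(c + 1) * cell].mean()
        for r in range(GRID) for c in range(GRID)
    ]
    return np.array(values)

def is_match(candidate: np.ndarray, known: np.ndarray) -> bool:
    """Report a match when two fingerprints are sufficiently close."""
    return float(np.abs(candidate - known).mean()) < MATCH_THRESHOLD

# Usage sketch: compare an upload against a database of known fingerprints.
# known_db = [fingerprint(p) for p in known_paths]
# hit = any(is_match(fingerprint("upload.jpg"), k) for k in known_db)
```

Because the fingerprint summarizes coarse structure rather than exact pixels, small edits such as recompression or minor color changes move it only slightly, which is why a distance threshold, rather than exact equality, decides a match.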
There are, however, many limitations to PhotoDNA. First, there is no global list of illegal material against which to compare digital fingerprints. NCMEC compiles digital fingerprints of CSAM as technology companies and the public report them, ever expanding its database of known CSAM. Other countries have their own databases, but the effectiveness of these databases depends on each country's technological and legal capabilities. Even within the United States, tech companies vary in how effectively they can counter illegal imagery at scale on their platforms.
Additionally, it is critical to tackle content across platforms; otherwise, once one platform implements measures to block CSAM, abusers will simply move to other online locations. Some companies have taken proactive steps to encourage collaboration in CSAM detection. For example, Facebook recently released its own PhotoDNA-inspired technologies for matching identical and nearly identical photos and videos. Following Microsoft's lead, Facebook open-sourced these tools so that industry partners, smaller developers, and non-profits could use them to identify abusive content and share hashes.
Furthermore, training AI on this material raises complicated legal questions. Since the passage of the REPORT Act in 2024, companies are required to hold onto known CSAM for one year after reporting it to NCMEC.71 Previously, companies were only legally obligated to retain CSAM for 90 days, and many interpreted this to mean that they had only 90 days to do whatever they needed with an image: create its unique digital fingerprint and train machine learning algorithms on it. In theory, NCMEC should be the one training models to detect such images, since it is legally allowed to hold them; in reality, the organization lacks the technical capacity to build effective identification algorithms. Collaboration with tech companies over the past ten years has led to significant advances in this area, and such partnerships should continue.
Beyond known-CSAM hash lists, NCMEC also maintains a list of "sexually exploitative" images. These fall short of the legal definition of CSAM but depict children in ways that are often precursors to abusive imagery. Some companies choose to proactively prevent such images from being traded on their platforms because they understand this is part of a broader ecosystem of abuse.
For companies that are unable to implement PhotoDNA with in-house teams, a number of vendors offer scanning services. Beyond scanning for CSAM, these vendors can also assist with more complex types of moderation.
New Imagery and Videos #
PhotoDNA allows companies to find an image only when at least one provider has already obtained and hashed the original. But what about the new CSAM that is being created on these platforms?
There has been progress in developing both automated and manual solutions to this challenge. A number of companies and researchers are experimenting with machine learning algorithms to detect new CSAM. Part of the challenge is that the false positive rate (the rate at which benign images are mistakenly flagged as CSAM) will be much higher than that of well-implemented PhotoDNA matching against known images. A company that can build a high-quality, high-precision classifier able to find new CSAM and run on live video will be able to stop a huge amount of the abuse that happens online.
A machine learning system for new CSAM needs to do two things: (1) detect sexual content using a classifier that estimates the likelihood that an image contains nudity, and (2) estimate the age of the person depicted using a second classifier. The second task, while relatively easy for young children, becomes quite difficult for adolescents and teens.
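As a hedged illustration of how those two stages might be combined, consider the sketch below. The model objects, thresholds, and review routing are hypothetical placeholders, not a description of Facebook's or anyone else's production system.

```python
# Hypothetical two-stage check: a nudity classifier followed by an age
# estimator. Models, thresholds, and weighting are illustrative assumptions.
from dataclasses import dataclass

NUDITY_THRESHOLD = 0.85   # assumption: tuned for high precision
AGE_THRESHOLD = 18.0

@dataclass
class Verdict:
    nudity_score: float    # estimated probability of sexual content
    estimated_age: float   # estimated age of the youngest person depicted
    flag_for_review: bool

def assess_image(image_bytes: bytes, nudity_model, age_model) -> Verdict:
    """Flag an image only when both stages agree it is likely new CSAM."""
    nudity_score = nudity_model.predict(image_bytes)   # assumed to return 0.0-1.0
    estimated_age = age_model.predict(image_bytes)     # assumed to return years
    # Age estimation is far less reliable for adolescents than for young
    # children, so a real system would widen the margin near the boundary
    # and route borderline cases to trained human reviewers.
    flag = nudity_score >= NUDITY_THRESHOLD and estimated_age < AGE_THRESHOLD
    return Verdict(nudity_score, estimated_age, flag)
```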
Developing these technologies is challenging for a number of reasons. First, to train the classifier, a company needs a decent amount of content, but as discussed above, there are limits to how long companies can hold onto CSAM for such purposes.72 Second, when holding onto content to train a classifier, companies must handle the CSAM like radioactive material. Engineers cannot look at the imagery, as doing so would be both illegal and horrible for their mental health. At Facebook, when new CSAM content is found, it is taken off of the normal servers and put into a computer cluster that handles all content relevant to law enforcement. These law enforcement clusters have heightened security, as they are a prime target for hackers.73
The proliferation of AI-generated CSAM has underscored the importance of technology for detecting new CSAM. AI CSAM is harmful, and it is important to detect, report, and remove it. It may be based on a real photo of a child to which an adult has access. It may be derived from old CSAM, revictimizing survivors. And some experts believe that viewing AI CSAM may normalize the depicted behavior and increase the risk of contact abuse.
Flag Potential Grooming Behaviors Using Known Indicators #
In addition to detecting the existence or sharing of CSAM, companies can also intervene in situations, such as the "chat" phase, in which CSE is likely to result. Automated technologies can flag risky scenarios, such as when a child shares a photo with a newly added adult friend. Microsoft and Meta are leading the way in developing tools to flag potential grooming behaviors on their platforms, and both have committed to making these tools available to smaller companies.
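As a purely illustrative example, one of the simplest rules such a system might start from could look like the sketch below; the field names and the seven-day window are invented for this example and are not drawn from any company's actual detection logic.

```python
# Illustrative rule: flag a photo sent by a minor to an adult contact who
# was added only recently. Field names and the window are assumptions.
from datetime import datetime, timedelta

NEW_CONTACT_WINDOW = timedelta(days=7)   # assumed definition of a "new" contact

def flag_risky_photo_share(sender_age: int, recipient_age: int,
                           friendship_created: datetime,
                           sent_at: datetime) -> bool:
    """Return True when a minor shares a photo with a recently added adult."""
    minor_to_adult = sender_age < 18 and recipient_age >= 18
    new_contact = (sent_at - friendship_created) <= NEW_CONTACT_WINDOW
    return minor_to_adult and new_contact
```

In practice such a rule would be one signal among many, combined with account history and conversation metadata before anything is surfaced to reviewers.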
Enforce Identity Indicators Around Age #
Many child predators disguise themselves as characters (a young girl, a gay friend, a supportive figure on an anorexia-recovery community website) in order to gain the trust of young victims. If children are using a company's platform, it is critical that the company integrate the lessons about identity we discussed in chapter 2.
Everybody lies about their age online. There are a surprising number of 107-year-olds to be found on Facebook at any moment. One way to detect this misrepresentation is to infer a user's age from their social network and the content they create. At Facebook, we had a classifier that tried to determine the apparent age of a user: if somebody said they were 27 but were really 14, we would try to detect that. That inference could then feed into a model estimating whether a particular interaction might be grooming.
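A minimal sketch of how an inferred-age signal might feed such a calculation follows. The weights, thresholds, and function names are assumptions for illustration only, not Facebook's actual formula.

```python
# Hypothetical combination of an inferred-age discrepancy with other simple
# signals into a rough grooming-risk score for human review.
def age_discrepancy_signal(stated_age: int, inferred_age: float) -> float:
    """Return a 0-1 signal that grows with the gap between stated and inferred age."""
    gap = abs(stated_age - inferred_age)
    return min(gap / 20.0, 1.0)           # arbitrary normalization

def grooming_risk(stated_age: int, inferred_age: float,
                  counterpart_is_minor: bool, initiated_contact: bool) -> float:
    """Blend signals with assumed weights; high scores go to investigators."""
    score = 0.6 * age_discrepancy_signal(stated_age, inferred_age)
    if counterpart_is_minor:
        score += 0.3
    if initiated_contact:
        score += 0.1
    return score   # e.g. route conversations scoring above ~0.7 for review
```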
One way to minimize the possibility of grooming is to require users to verify their age and isolate them into communities with only others in that age group. While this does not itself prevent people from lying about their age, this kind of gating makes it less likely that there will be significant interactions between age groups and makes it more challenging for adults to discover children.
Restrict Discovery of Children by Unfamiliar Adults #
Restricting the ability of strangers to discover and then reach out to users is a key part of preventing CSE. If groomers are able to discover and reach out to children on a platform, they will befriend one child and then use that child's mutual friends to find and approach new victims.
Moderation on End-to-End Encrypted Platforms #
While there has been much discussion of the negative impact that encryption will have on the ability to protect children online, there are still ways of proactively detecting such abuse. Even with end-to-end encryption, there will still be indicators available to help determine that an account is trading CSAM. For instance, CSAM traders need to somehow signal to outsiders that they are a person with whom other traders should interact; some do so by making their profile pictures CSAM. Additionally, CSAM trading often occurs in private groups, and law enforcement can infiltrate these spaces covertly to collect evidence of the crime. Of course, companies cannot catch everything, but there are still actions that can be taken.
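As a hedged sketch of what that might look like, a service could combine only metadata and unencrypted surfaces into a review score. The signal names and weights below are assumptions for illustration, not any company's real scoring system.

```python
# Illustrative metadata-only scoring for an end-to-end encrypted service.
# No message content is inspected; all weights are assumptions.
def account_risk_score(profile_photo_matches_hashlist: bool,
                       reported_group_memberships: int,
                       contacts_reported_for_cse: int) -> float:
    """Score an account using only signals available outside message content."""
    score = 0.0
    if profile_photo_matches_hashlist:    # profile photos are typically not E2E encrypted
        score += 0.7
    score += 0.1 * min(reported_group_memberships, 3)
    score += 0.05 * min(contacts_reported_for_cse, 3)
    return score   # accounts above a review threshold go to human investigators
```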
Epilogue #
Behind the capture of Buster Hernandez at the beginning of this chapter was the indefatigable work of several dedicated Facebook employees. In this brief epilogue, I provide a bit more detail about our efforts. For Muriel Tronc, this case had started in September of 2014 with a referral from law enforcement. At the time, Muriel had just transferred to Facebook's child safety investigations team from the team responsible for responding to legal requests, becoming only the second investigator dedicated to the crime of sextortion. There was no sign of what was to come in that initial case, whose pattern matched many other sextortion cases: a newly created account reaching out to a teenage girl, a lie about possessing nude photographs of the victim, and a threat to release them unless she sent additional photographs, which the perpetrator then leveraged into more content and sadistic control over her life.
What Muriel would quickly learn, however, was that this was no normal sextortionist. For starters, this person was much more careful to cover his tracks, using an anonymous email account and the Tor network to obscure his IP address. More disturbingly, it would become clear that this individual was far more dedicated and successful than other offenders the Facebook team had tracked, spending 8-12 hours a day sending his initial lure to teenage girls around North America. The file on this offender, who called himself Brian Kil to more than one victim, only grew as more victims were discovered and the team matched his tradecraft and messaging patterns across different accounts and victims.
By the time I joined as CSO in June of 2015, the file on Kil had grown to several dozen victims, and he was already the #1 target of our child safety team. My time at Yahoo had exposed me to the evils of CSAM trading rings, as well as the large child sex slavery ring our team helped bust, but the lack of a Yahoo messaging product popular with teenagers meant that sextortion was not a crime with which I was familiar. That changed almost immediately after I took my new position, as Facebook Messenger, with its ability to message random strangers who could be found via searches across the entire user graph, seemed custom-built to enable this type of abuse.
By early 2017, Facebook’s victim list had grown to more than 100 teenage girls. The FBI had received authorization to try to uncover Brian Kil’s real identity with what they euphemistically call a Network Investigation Technique (NIT), but what security professionals would generally call malware. This attempt failed, and when Kil noticed the backdoored file, he explicitly taunted the FBI with a graphic showing him towering above the FBI seal. The Facebook team had gone further than ever before to track Kil, even writing a custom detection routine trained on his initial approach, which it ran against every message sent on Facebook Messenger. While this was often successful in finding new aliases before victims sent their first content, some messages fell through the cracks and the victim list continued to grow.
By this point, we had assigned a security engineer to support Muriel, who had long since cleared her desk of all other cases to focus solely on Kil. That spring, the two of them approached me with a radical idea: what if we helped the FBI to ensure that their next attempt was more likely to succeed? Kil had finally made a mistake. His ubiquitous use of Tor had raised suspicions that he was using an operating system built with Tor deeply integrated. As Muriel dove back through the images she had attached to dozens of NCMEC reports looking for evidence, she noticed a screenshot of a video demonstrating that Kil was running the Tails operating system.
Unlike other operating systems that could support Tor usage via the Tor Browser or a Tor proxy, Tails was designed to force all network traffic through Tor and never send data directly over the public internet, preventing the kinds of mistakes that had landed hundreds of online child abusers in jail in the past. Now that we knew exactly which OS he was running, my colleagues wanted to try to create an NIT, even if that meant going up against what was advertised as an intentionally secure OS.
I approved this initial foray, not expecting much to come of it. I was wrong to underestimate the skill of our security engineer, who eventually figured out how to wind our way through the protections of Tails. Like all Linux distributions, Tails was not a singular product but a collection of hundreds of open-source programs that were packaged together. Our engineer had discovered that a serious flaw had been patched in the default Tails video player “upstream” of the distribution, and that the patched version had yet to be included in the OS. While he would have to figure out ways to defeat the anti-exploit protections of Tails and to disable Tor temporarily, this was a tantalizing opportunity to catch a perpetrator who opened new video files from his victims every day.
Developing such an exploit was a heavy lift, and we engaged one of our existing red team vendors to help build out the exploit chain. We also notified the FBI of the possibility of a new NIT. For their part, the FBI and the US Attorney's office sought another warrant that would authorize them to place the malware on the computer. Just as the exploit was finished and tests against Tails virtual machines started coming back positive, a massive wrinkle appeared: the vulnerability had been patched in the beta release of Tails. While we expected that Kil was running the release version of the operating system, this meant we might have only a matter of days before the standard auto-update made him immune to the exploit chain we had assembled.
It was decision time. Providing this exploit to the FBI would put us well outside the normal bounds of how tech companies cooperate with law enforcement. It would also go against the loose tenets of the larger security community, which often prioritized privacy over almost all other equities. There were, however, several factors that made me feel comfortable moving forward. First, we were not making any changes to a Facebook product, and this technique posed no risk to other users. This was not a backdoor; in fact, the exploit would be delivered by the FBI via another platform, Dropbox, which had become Kil's favorite service for receiving files from his victims. Second, the flaw we were exploiting had already been fixed in the original source and was slated to be fixed in Tails at any moment. Third, we had taken steps to prevent the FBI from reusing the exploit: we only provided them with a virtual machine under our control where they could upload the file and download it with the exploit attached. While they could still reverse-engineer the exploit out of the resultant file, we believed the risk that it would be used in other cases (where we might be less sure of the ethical issues) before it was patched in Tails to be minimal.
In the end, I couldn't put Kil's crimes out of my head. It was clear to me that they would have been much more difficult to pull off without Facebook, and that the company had a responsibility to take extreme steps to stop such extreme harm facilitated by our products. So, backed by our legal team and general counsel, we provided the FBI with the capability. That capability would have been useless, however, without the bravery of one young woman.
As described at the outset of this chapter, “Sara” (described as “Victim 2” in court documents) had been put through hell for years, as Kil sextorted her secretly while making threats against her sisters, little brother, and parents. While we put the finishing touches on our Tails exploit, the FBI agents, who had spent years tracking Kil, approached her with her parents and offered an opportunity that did not exist for any other victim: the ability to strike back.
And so this young woman, who had newly come into adulthood after years of continuous, sadistic punishment by Kil, made him one final video. This time she faced the camera fully clothed, a look of determination on her face, and told him that she was free of his grasp. As the man watched the video and heard her words, he had no idea how true they were: the exploit code inserted at the head of the file ran silently in the background and sent a single special packet over his computer's normal network connection instead of Tor. In response, he sent the girl a picture of a knife, not knowing that within hours the FBI would begin surveillance of his dank Bakersfield home, having finally identified him as Buster Hernandez.
-
https://www.wxyz.com/news/sextortion-terror-michigan-woman-helps-fbi-bust-man-accused-of-extorting-teens ↩︎
-
Ibid. ↩︎
-
Since Sara was no longer a minor, she was legally allowed to help the FBI engage in their sting. ↩︎
-
https://mediaassets.turnto23.com/document/2017/08/07/Hernandez.Buster.CO_Redacted_63830898_ver1.0.pdf ↩︎
-
https://mediaassets.turnto23.com/document/2017/08/07/Hernandez.Buster.CO_Redacted_63830898_ver1.0.pdf ↩︎
-
https://www.indystar.com/story/news/crime/2020/02/06/brian-kil-submits-guilty-plea-sexual-extortion-school-bomb-threats/4683227002/ ↩︎
-
On platforms that require authentication of users' identities, Child Sexual Abuse Material (CSAM) is less of a problem. However, since most companies have some free or trial tier that can be used with little authentication, most companies will face these challenges. ↩︎
-
Some researchers colloquially call this the "Lego penis problem," referring to Lego Universe, a Massively Multiplayer Online Game (MMOG) launched in 2010 that allowed users to create and share Lego structures. Users started a trend of building penises from Legos and sharing them with other users. Workers at the company lamented their inability to "detect dongs" at a rate fast enough to prevent the sullying of their child-friendly brand. They shut down the game less than a year and a half after launch. Source: https://web.archive.org/web/20220810200105/https://twitter.com/glassbottommeg/status/604407061380640768 ↩︎
-
You may also hear people talk about online child sexual exploitation (OCSE). ↩︎
-
Section 2256 of Title 18, United States Code, defines child pornography as any visual depiction of sexually explicit conduct involving a minor (someone under 18 years of age). ↩︎
-
The WePROTECT Global Alliance (WPGA), an international movement dedicated to national and global action to end online child exploitation, decided in 2016 that the phrase 'child sexual abuse material' (CSAM) more "accurately captures the heinous nature of sexual violence and exploitation of children while protecting the dignity of victims." https://static1.squarespace.com/static/5630f48de4b00a75476ecf0a/t/5deecb0fc4c5ef23016423cf/1575930642519/FINAL+-+Global+Threat+Assessment.pdf ↩︎
-
https://www.researchgate.net/publication/327306300_Online_Child_Sexual_Exploitation_Towards_an_Optimal_International_Response ↩︎
-
https://www.researchgate.net/publication/327306300_Online_Child_Sexual_Exploitation_Towards_an_Optimal_International_Response ↩︎
-
https://www.cbc.ca/news/canada/calgary/ex-pm-adviser-tom-flanagan-sorry-for-child-porn-comments-1.1342217 ↩︎
-
Flanagan ultimately apologized for his statement. ↩︎
-
https://www.nytimes.com/2011/06/10/nyregion/pornography-victim-makes-voice-heard-in-queens-case.html ↩︎
-
Ibid. ↩︎
-
Ibid. ↩︎
-
The only other forms of abuse for which there are extensive legal frameworks are incitement and terrorism. However, the definitions for the latter vary widely across borders, and dealing with them is complicated by the fact that governments often incite violence against their own citizens. ↩︎
-
https://cdn.icmec.org/wp-content/uploads/2023/10/CSAM-Model-Legislation_10th-Ed-Oct-2023.pdf ↩︎
-
https://www.missingkids.org/content/dam/missingkids/pdfs/OJJDP-NCMEC-Transparency-CY-2023-Report.pdf and https://www.missingkids.org/blog/2021/rise-in-online-enticement-and-other-trends--ncmec-releases-2020- ↩︎
-
https://law.justia.com/cases/federal/appellate-courts/F3/74/701/594242/ ↩︎
-
https://www.washingtonpost.com/wp-srv/style/longterm/books/chap1/sexlaws.htm ↩︎
-
https://www.ncjrs.gov/App/Publications/abstract.aspx?ID=161366 ↩︎
-
Digital Child Pornography: A Practical Guide for Investigators https://books.google.com/books?id=FOyLAwAAQBAJ&pg=PA8&lpg=PA8&dq=bbs+child+porn&source=bl&ots=vSN2EcnYM7&sig=ACfU3U1KCsQMipSBnzuucRoyjDazIcaOJg&hl=en&sa=X&ved=2ahUKEwjxzaGei6HpAhXQl54KHe0SB9YQ6AEwAHoECAcQAQ#v=onepage&q=bbs%20child%20porn&f=false ↩︎
-
For extensive discussion of NCMEC’s role as a governmental agency see Ackerman vs. US — 18 U.S.C. § 2258A and 42 U.S.C. § 5773(b). We also summarize these issues in a report on the online child safety ecosystem: https://cyber.fsi.stanford.edu/news/cybertipline-report ↩︎
-
“An analysis of one particularly egregious Tor-based website conducted by the U.S. Federal Bureau of Investigation (FBI) found that it hosted approximately 1.3 million images depicting children subjected to violent sexual abuse.” Further, an “FBI investigation of a single website hosted on Tor had approximately 200,000 registered users and 100,000 individuals had accessed the site during a 12 day period.” The National Strategy for Child Exploitation Prevention and Interdiction, U.S. Department of Justice, Apr. 2016, at https://www.justice.gov/psc/file/842411/download (last visited Nov. 16, 2018) (on file with the International Centre for Missing & Exploited Children). https://static1.squarespace.com/static/5630f48de4b00a75476ecf0a/t/5deecb0fc4c5ef23016423cf/1575930642519/FINAL+-+Global+Threat+Assessment.pdf ↩︎
-
https://www.9news.com.au/world/richard-huckle-britains-worst-pedophile-facing-life-in-jail/d4c1881e-d31e-47d0-80a5-f3a35ed17126 ↩︎
-
https://www.telegraph.co.uk/news/2016/06/01/britains-worst-paedophile-who-targeted-poverty-stricken-children/ ↩︎
- ↩︎
-
https://www.missingkids.org/content/dam/missingkids/pdfs/OJJDP-NCMEC-Transparency-CY-2023-Report.pdf ↩︎
-
For information on challenges related to the CyberTipline, see https://cyber.fsi.stanford.edu/news/cybertipline-report ↩︎
-
https://stacks.stanford.edu/file/jd797tp7663/20230606-sio-sg-csam-report.pdf ↩︎
-
Virtual Global Taskforce (VGT) and European Cybercrime Centre (EC3), Virtual Global Taskforce Child Sexual Exploitation Environmental Assessment Scan 2015, Oct. 2015, at https://www.europol.europa.eu/sites/default/files/publications/vgt_cse_public_version_final.pdf ↩︎
-
https://www.nytimes.com/interactive/2019/11/09/us/internet-child-sex-abuse.html ↩︎
-
https://web.archive.org/web/2/https://www.missingkids.org/cybertiplinedata ↩︎
-
https://www.missingkids.org/content/dam/missingkids/pdfs/OJJDP-NCMEC-Transparency-CY-2023-Report.pdf ↩︎
-
https://www.cp24.com/news/doctor-teachers-among-those-arrested-in-3-year-international-child-sex-abuse-investigation-police-1.3882674 ↩︎
-
https://www.nytimes.com/interactive/2019/12/07/us/video-games-child-sex-abuse.html ↩︎
-
https://www.justice.gov/criminal-ceos/citizens-guide-us-federal-law-child-pornography#:~:text=Section%202256%20of%20Title%2018,under%2018%20years%20of%20age).&text=Notably%2C%20the%20legal%20definition%20of,child%20engaging%20in%20sexual%20activity. ↩︎
-
For a great discussion, see Jeff Kosseff https://www.lawfareblog.com/online-service-providers-and-fight-against-child-exploitation-fourth-amendment-agency-dilemma ↩︎
-
https://www.tampabay.com/news/publicsafety/crime/Tampa-man-charged-with-producing-sending-child-pornography-through-Facebook_162295889/ ↩︎
-
https://www.tampabay.com/news/publicsafety/crime/Tampa-man-charged-with-producing-sending-child-pornography-through-Facebook_162295889/ ↩︎
-
https://www.missingkids.org/content/dam/missingkids/pdfs/OJJDP-NCMEC-Transparency-CY-2023-Report.pdf ↩︎
-
https://www.missingkids.org/content/dam/missingkids/pdfs/OJJDP-NCMEC-Transparency-CY-2023-Report.pdf ↩︎
-
https://www.missingkids.org/content/dam/missingkids/pdfs/2023-reports-by-country.pdf ↩︎
-
https://www.missingkids.org/content/dam/missingkids/pdfs/2023-reports-by-esp.pdf ↩︎
-
https://www.missingkids.org/content/dam/missingkids/pdfs/2021-reports-by-esp.pdf ↩︎
-
You can find even more information at https://www.missingkids.org/gethelpnow/cybertipline and https://report.cybertip.org/ ↩︎
-
https://www.theguardian.com/technology/2016/sep/09/facebook-deletes-norway-pms-post-napalm-girl-post-row?CMP=gu_com ↩︎
-
The same people now complain about Facebook not having strong enough policies in a number of cases, but forcing European politicians to be consistent on content moderation is an impossible task. ↩︎
-
https://www.researchgate.net/figure/Select-stakeholder-groups-in-the-fight-against-online-child-sexual-exploitation-Note_fig1_327306300 ↩︎
-
https://www.technologycoalition.org/newsroom/announcing-lantern ↩︎
-
CyberTipline report quality is also important, as higher quality reports are more likely to be investigated by law enforcement. Google has a reputation for submitting very high quality CyberTipline reports. ↩︎
-
https://lancasteronline.com/news/local/2-juvenile-males-charged-with-creating-ai-generated-nude-photos-of-lancaster-country-day-school/article_30ab81c0-b360-11ef-a447-8373134c55f7.html ↩︎
-
Facebook had an internal coordinator, but we determined that actual psychological services had to be provided by external professionals due to restrictions under HIPAA and ERISA. ↩︎
-
This is a crude summary of the way in which PhotoDNA works. For a detailed approach, see Hany Farid’s “Reining in online abuses,” Technology & Innovation 19.3 (2018): 593-599. ↩︎
-
https://farid.berkeley.edu/downloads/publications/nai18.pdf ↩︎
-
https://www.congress.gov/bill/118th-congress/senate-bill/474 ↩︎
-
There is legislation now pending that would extend the period of time that companies can legally hold onto images specifically so they can train and create new tools. This would be a big win for companies and for children if it were passed. ↩︎
-
Famously, when the People’s Liberation Army broke into Google in 2009, they targeted the law enforcement cluster in hopes of discovering wiretaps on Chinese spies in the United States. ↩︎