Landmark case against Twitter could help Peel Police’s ICE unit fight onslaught of online child exploitation 
Feature image from the Canadian Centre for Child Protection



Content warning: This story describes instances of graphic child sexual abuse that some readers may find disturbing. 

 

Businesses that operate in the digital world are facing a reckoning over how they handle the startling amount of child sexual abuse material posted on their platforms.

The rise in child exploitation on the internet — driven by changes in technology like cloud storage, the mass popularity of video-sharing services, and tools that allow anonymous communication and browsing — has led to an onslaught of child sexual abuse material (CSAM) polluting the cyberworld, and a surge in abuse as perpetrators seek to meet the demand for more violent content featuring younger and younger child victims.

The rapid spread of this material online has been labelled a “social epidemic” by the Canadian Centre for Child Protection (C3P), one that requires an updated, more stringent approach to holding internet service providers and other companies that enable these heinous acts accountable for allowing them to proliferate. 

“The overwhelming pace of technological progression, along with the significant online offender population, has resulted in a lack of cohesiveness in responses to child sexual abuse imagery around the globe,” the C3P wrote in 2019.

The disconnected response from lawmakers, and a lack of funding for municipal police forces to adequately address the issue, have allowed the amount of CSAM circulating online to increase at an alarming rate.

Local police forces, unlike their larger provincial, national and international counterparts, often lack the resources and expertise to address highly complex organized digital crime, whose tentacles spread across a vast and sophisticated global network of cyber criminality.

In just four years, the National Center for Missing and Exploited Children (NCMEC) in the United States — responsible for monitoring one of the world’s busiest Cybertip hotlines for reporting this type of content — saw the number of reports of CSAM increase more than 1,500 percent, from 1.1 million in 2014 to over 18 million in 2018. The organization received more than 21 million reports last year, an increase it says is partially a result of the COVID-19 pandemic keeping children indoors, increasing their screen time and their chances of running into a perpetrator on the many social media apps where predators hunt.

In 2018, Canada’s Cybertip.ca assessed twice as much potential CSAM as it had in the previous 15 years combined. C3P has seen its workload balloon from approximately 4,000 reports per month only a few years ago to over 100,000 per month, driven by its artificial intelligence tool, Project Arachnid, which trawls the internet searching for potential CSAM. Because these images need to be assessed by a human moderator before being sent to law enforcement for further investigation, there is a backlog of over 30 million images, according to a June 2021 report.

“The rate at which Project Arachnid detects suspect media far outpaces the human resources available to assess the content,” the C3P wrote. 

The real tragedy is lost in these numbers. These millions of reports represent potentially millions of children who have not only been stripped of their clothes, but of their innocence and their mental stability by the criminal desires of adults, sometimes in person, sometimes with a paying customer directing the abuse over a webcam. The Virtual Global Taskforce (VGT) observed in 2019 that the live-streaming of child sexual abuse was increasing in popularity among offenders, and the organization estimated it would become 15 times more prevalent online by 2021 and account for 13 percent of all internet video traffic.

“Child sex offenders are now able to pay for and direct the live sexual abuse of children while hidden in private homes and Internet cafes,” the VGT wrote. “As such, overseas perpetrators can request certain sexual acts to take place in advance of the abuse, or while it’s underway.”

 

The live-streaming of child sexual abuse is growing in popularity among pedophiles online.

(Image from Waldemar Brandt via Unsplash)


 

In 2018, 40 percent of children identified in reports from the Internet Watch Foundation were under the age of 10. 

Government leaders should have known this was coming. The writing has been on the wall for over two decades, yet laws relating to the control, removal and handling of this material online have remained grossly inadequate. As CSAM reports have exploded, so has government awareness, but no significant progress has been made to combat these horrific crimes.

The approval of a Canada-wide strategy to deal with CSAM in 2004, along with changes to the Criminal Code — including the addition of sections in 2012 creating the offence of arranging sexual services from a child over the internet, and the offence of sharing someone’s explicit images without their consent — has given police more tools to hold perpetrators accountable. But law enforcement is only one part of any potential solution that could see CSAM finally fade from the internet. 

“I hesitate to use the cliche that we’re behind the 8-ball, but it’s kind of true,” says Detective Andrew Ullock, head of the Peel Regional Police Internet Child Exploitation (ICE) Unit. “It’s not enough for political and legal leaders to say ‘let the police handle it, let them do their investigations and arrest these guys’. Well, I have an office of seven people, and the population of Peel is [1.6] million, we can’t get them all.”

Police arrive on scene after a crime has happened. If society wants to protect children from online sexual exploitation, police can’t be expected to both respond to these crimes after the fact and prevent them from happening in the first place, without help from all the entities that have ignored their role in the rapid rise of online CSAM. 

The companies that host this material — whether it’s a chat application, social media site, or any other internet service provider — need to take proactive steps to effectively identify, remove and report this material to police, Ullock says. Right now, the majority of them are not doing that, leaning on legislative crutches like Section 230 of the Communications Decency Act (CDA) in the United States, which stipulates internet service providers are not responsible for the content posted by their users. It may be an American law, but due to Canada’s signatory status on trade agreements like the 2018 United States-Mexico-Canada Agreement (USMCA), which includes regulations relating to digital trade, Canada is obligated to follow it as well. 

“No Party shall adopt or maintain measures that treat a supplier or user of an interactive computer service as an information content provider in determining liability for harms related to information stored, processed, transmitted, distributed, or made available by the service,” Article 19.17 of the USMCA states. 

As a result, social media sites and other service providers have been allowed to remain wilfully blind and hands-off when it comes to dealing with potential CSAM on their platforms, Detective Ullock says.

Companies might still report potentially illegal material, as many of them do, but there remains a lack of heavily punitive laws to force all private players to comply. In 2020, the overwhelming majority of the 21 million reports NCMEC received, over 20 million, came from Facebook; 546,000 came from Google, nearly 21,000 from Dropbox, 15,000 from Discord, almost 145,000 from Snapchat, 22,000 from TikTok, and 65,000 from Twitter. 

The issue rests in the particular processes followed by each of these apps or websites. For example, not all companies report the material within the same timeframes; many have delays in removing the harmful content, and every day it is available on the platform is another day of trauma and victimization for the child exploited in the image or video. Many platforms do not require a person to provide real identification to use the app, simply an email address and a name, making it extremely difficult to track down those who post this material — a task made even harder as more people use virtual private network (VPN) services to remain anonymous online. 

“Unless they (tech companies) are willing to get onboard in a more proactive way, we’re never going to get ahead of it, they are the keepers of the medium,” Detective Ullock says. “We don’t have the authority to police that medium, we are only reacting.”

 


 

Peel’s ICE Unit resembles David staring down Goliath. 

Detective Ullock’s team of seven officers works tirelessly to identify victims, triage hundreds of reports sent from NCMEC about potential crimes in Peel, and comb through millions of images and videos every year — any single one of which could leave investigators with lasting mental scars — and, of course, arrest offenders who are exploiting children online. 

In 2020, the unit laid 109 charges related to the online exploitation of children in Brampton and Mississauga, arresting 37 individuals.

Thanks to grant funding from the provincial government, Ullock was able to bring in an additional officer to help with the workload, which, as a result of the pandemic, has been increasing and will potentially only get worse, as it generally takes time for children to come forward and talk to their parents about abuse. 

Despite the additional officer, Peel’s 37 arrests were a drop from the 47 the unit tallied in 2019. With cases increasing rapidly and getting more complex, partly due to the constantly evolving technology criminals use, it is becoming difficult for such a small team to work through the mountains of material. 

The explosion in the number of new technologies, chat applications and social media sites has investigators constantly working to decode the networks of pedophiles who connect on these apps and use them in different ways to share their collections. It is a vastly different landscape of exploitation than it was just two decades ago, says Ullock, who spent time with Peel’s ICE unit as an investigator between 2010 and 2017 before taking over. 

“Years ago, most of the people we arrested were lone wolves, a lone player by themselves searching for child pornography on the Internet and obtaining it and creating these massive collections that we discover, then come along and arrest them,” Ullock explains. “Not to say that person still doesn’t exist, they do, but the way that people obtain child pornography has evolved and changed over time and a couple of things have allowed that to happen.”

Chat applications are one of them. They have allowed offenders to meet in these digital spaces, communicate and consume CSAM together, like a spontaneous organized crime ring. Except in these cases, everyone remains completely anonymous.

“In traditional organized crime, I know who you are, you know who I am and I know where you live, your history and background and vice versa and we actively plot together. Imagine a parking lot of a plaza where a whole bunch of gangsters who don’t know each other show up together on the same day, say Monday morning at 10 a.m., and 50 of them show up at this parking lot and they say, ‘okay guys, we don’t know each other, but let’s all rob banks together, or let’s all sell drugs together.’ That would never happen,” Ullock says. “But that’s what happens with chat applications and social media — a whole bunch of people who have a sexual interest in children are able to meet spontaneously in cyberspace and say, ‘hey, let’s commit child sex offences together, and let’s trade child porn, and let’s teach each other how to obtain it and not get caught.’”

This ever-shifting network of individuals creates an incredibly difficult task for investigators, and like a game of whack-a-mole, whenever one is disrupted, another quickly emerges elsewhere, or users simply create a new profile with a different name. 

“They're able to organize themselves very, very quickly and even if the chat service discovers what’s going on and shuts it down...they just create a new profile and a new chat room and find each other again,” Ullock says. “Without question, they are more organized now than they were 10 to 12 years ago and social media has allowed that to happen.”

This ease of access and perceived low risk of penalty has allowed the amount of CSAM online to proliferate at an alarming rate, forcing the law enforcement officers tasked with gathering, analyzing and preparing evidence for trial to sift through massive amounts of data. 

Last year, Peel’s ICE unit processed 22 terabytes of potentially illegal material, including 5.6 million images and 181,000 videos. This is an 83 percent increase from the 12 TB of data reviewed in 2019. 

These cases are also becoming much more time-consuming to litigate through the courts. The PRP ICE unit recently won a Supreme Court of Canada appeal in which an offender was attempting to have his case thrown out, alleging he was entrapped by a Peel Police officer. 

“When you add it all up together an officer can spend up to a month just to litigate a single case, that takes away from their investigative efforts,” Ullock says. 

With few avenues for relief from the growing workload, police units across the country — all of which are grappling with many of the same struggles as Peel — will need lawmakers to find a way to hold companies accountable and institute requirements for more stringent detection and reporting methods, or the courts will need to begin holding these companies accountable for allowing this harmful content to live on their platforms in the first place. Harsh penalties could act as an incentive for companies to ensure they have the proper policies in place. Already, some of the world’s largest tech companies are finding the courts are no longer willing to accept they have no responsibility for the content on their platforms. 
 


 

The C3P has found that for every internet service provider or social media company, there is a varying degree of emphasis placed on rooting out CSAM on their platforms, and very few of the most popular websites have reporting functions for users to notify them about people sharing this harmful material.

In 2020, sites like Facebook and Twitter had no tools within posts for reporting users who were sharing CSAM. Apps like Instagram and TikTok had functions to report users, but no place to specifically flag that they were posting CSAM. The lack of adequate reporting functions not only makes it difficult for people to report CSAM when it’s found, but also prevents the companies from getting an accurate picture of just how much could be filtering through their websites undetected. Even the world’s largest pornographic websites, like Pornhub and XVideos, were found to have inadequate mechanisms for alerting moderators to harmful content. 

 

 

This has left many victims with no other option but to carry out the traumatic process of policing the internet for their own child sexual abuse material, as websites refuse to take full responsibility to monitor their own content. 

“These survivors have been instrumental in identifying serious deficiencies in reporting tools for the public on a number of platforms,” the C3P states. “Survivors surveyed by C3P have generally characterized their own experience reporting CSAM online as disheartening; exceedingly long delays in responding to their complaints, moderators challenging victims on the veracity of their report or, as is often the case, no response at all.”

Many survivors are now pushing back. 

Pornhub has faced significant backlash for failing to address the presence of CSAM and other non-consensual material posted to its platform. A lawsuit filed in June by 34 women against MindGeek — Pornhub’s parent company, which is based in Canada — alleges the company is a criminal enterprise run “just like the Sopranos”, and that a small group of male executives inside the company, self-titled the “Bro Club”, ran a number of schemes in which they purchased bulk pornographic content from areas of the world known to be hotbeds for human trafficking and posted this material on the website using a number of anonymous accounts. 

As the lawsuit explains: “Insiders familiar with this elaborate scheme left no doubt that the Bro Club understood this was trafficked content: ‘100% they knowingly paid real pimps. They would discuss how this cheap content was coming from old school pimps. They found it exciting. They would explain, we don’t need to pay studios in the US, low paid pimps come to us.’” MindGeek has denied all of the allegations in the lawsuit. 

A Canadian parliamentary committee earlier this year heard from a number of victims who shared their stories of trying to get sexual content that depicted them as minors, or was made without their consent, removed from Pornhub. The victims were either ignored or repeatedly asked for proof that it was them in the video, only to have the video remain on the website for months afterward; and if a video was taken down, it would reappear on the website days later, despite Pornhub’s claims that it has sophisticated “fingerprinting” software that can detect videos that have been flagged as harmful. 

Last week, a court in the United States ruled that a lawsuit against Twitter, filed by two male victims who allege the social media site monetized their CSAM and delayed removing it, can proceed. 

Typically, internet companies are shielded from legal liability by Section 230 of the Communications Decency Act, which says they are not responsible for the content posted by their users. However, the Fight Online Sex Trafficking Act (FOSTA), signed into law by former president Donald Trump, created an exemption related to sex trafficking laws. 

The two plaintiffs in the lawsuit were 13-years-old when they were tricked into sending explicit videos of themselves to a sex trafficker posing as a 16-year-old girl over Snapchat. When the two teens were in high school, the videos surfaced on Twitter, where they were viewed and retweeted thousands of times. 

It took nine days and a request from the Department of Homeland Security to have the videos removed, despite repeated requests and the videos being posted from an account that had previously been flagged to Twitter by another user as posting “obvious child porn”. 

Judge Joseph Spero described Twitter’s behaviour in the situation as “an ongoing pattern of conduct amounting to a tacit agreement with the perpetrators.”

The ruling has significant potential to establish precedent for the way future cases involving internet service providers and user-generated content, particularly CSAM and other trafficked material, will be handled, experts say.

“This historic ruling is the first breakthrough for an online trafficking survivor in any court where Twitter has alleged CDA immunity,” Peter Gentala, senior legal counsel for the National Center on Sexual Exploitation, stated in a press release.

Increased accountability and stiffer penalties would force large companies to invest adequately in CSAM detection and removal, and in mechanisms to prevent this material from being shared. 

Until that happens, police forces across the globe will need to continue to deal with an overwhelming workload.
 


 

Detective Ullock has introduced a number of new initiatives to make the work of Peel’s ICE unit more efficient and to better help survivors in the region. 

With millions of images seized annually, all of which need to be analyzed by investigators to determine whether they meet the threshold of “child pornography” under the Canadian Criminal Code, the sheer numbers have made it nearly impossible for investigators to do it all. 

To assist in the categorization of this material, Detective Ullock has invested in a new artificial intelligence program that helps investigators by scanning images and flagging those that could potentially be CSAM. 

“This will ultimately help reduce the strain on organizational resources in decreasing the amount of time it takes officers to categorize the massive collections. Additionally it will mitigate the risk of vicarious trauma to investigators,” a recent report reads. 

Detective Ullock has also created the role of a Victim Identification Officer, whose sole task is leading the unit’s hunt for victims in Peel and ensuring they get the proper support. 

However, until the tech industry begins to take the issue of CSAM seriously, Peel’s ICE unit, and other units across the globe, will only be chipping away at a problem that grows larger by the day. 

“I am proud of the work that this office does and admire the work that all of our contemporaries around the globe do, but the real answer is we can’t do it by ourselves. I think we are, without question, a very necessary part of the solution and I want us to keep doing what we’re doing, but we can’t do it alone,” Ullock says. “It kind of feels like we are doing it alone in a lot of ways.”

 

 


Email: [email protected]

Twitter: @JoeljWittnebel 




