‘This is a case about rape, not pornography’: Pornhub nightmare exposes internet’s blindspot on child sexual exploitation
Feature image from Sergey Zolkin via Unsplash

Content warning: This story describes instances of sexual abuse and child exploitation that some readers may find disturbing.


 

A young woman whose life was torn apart by a personal video shared incessantly on Pornhub is fighting back against the website that did little to stop her ongoing trauma. 

Serena Fleites, now 19, shared her story with the Canadian Parliamentary Standing Committee on Access to Information, Privacy and Ethics (ETHI) in February as politicians looked for possible solutions to the problem of rapidly proliferating non-consensual sexual content and child sexual abuse material (CSAM) — child pornography — on websites like Pornhub, which is owned by Canadian parent company MindGeek.

The majority of the 11-member committee was shocked by what they heard from Fleites and a number of other victims who shared their stories of videotaped abuse that was then shared online without their consent. Their desperate efforts to have the traumatic material taken down were ineffectual, even when the content qualified as child pornography under the Criminal Code definition. 

 

Serena Fleites told MPs in February about how an intimate video of her posted on Pornhub while she was underage destroyed her life.

(Screengrab from ParlVu)

 

“I have never seen a situation where there was so much disregard and indifference to what was obviously child pornography, rape, trafficking content, illegal content on this site,” said Fleites’s lawyer Michael Bowe during his testimony earlier this year. Bowe works with the Manhattan-based firm Brown Rudnick. “I have no question under American law there are criminal violations here.”

Bowe is now representing Fleites and 33 other women — identified only as Jane Does — in a civil lawsuit that levels numerous disturbing allegations against Pornhub and its parent company MindGeek, including claims that the company “frequently purchased in bulk trafficked content from known trafficking areas such as Eastern Europe, Asia, and South America” and “used its byzantine international corporate structure of hundreds of sham shell corporations to mask the process and launder the payments.”

“This is a case about rape, not pornography. It is a case about the rape and sexual exploitation of children. It is a case about the rape and sexual exploitation of men and women. And it is a case about each of these defendants knowingly and intentionally electing to capitalize and profit from the horrendous exploitation and abuse of tens of thousands of other human beings so they could make more than the enormous sums of money they would have otherwise made anyway,” the lawsuit states.

It describes how MindGeek’s corporate operations were controlled by a select group of male executives, self-titled “the Bro Club”. Insiders allege in the lawsuit that this group was threatening and hostile toward those they viewed as potential “snitches”, intimidated and doxed members of the press and advocates who spoke out against the company, and purchased vast stores of content from “100% real pimps” across the globe while visiting “production sites”.

“In one such visit, MindGeek executives witnessed a football-field size warehouse in which women were crammed into adjoining studio stalls like livestock to perform on camera,” the lawsuit alleges. “Many of the women appeared young and were engaged in scenes depicting underage girls.”

The company is a classic criminal enterprise, run “just like The Sopranos”, the lawsuit alleges.

 

Fleites’s lawyer Michael Bowe.

(Screengrab from ParlVu)

 

Also named in the lawsuit, filed in June in the U.S. District Court for the Central District of California, is credit card giant Visa, which the suit claims continued to provide services, and profit from the alleged trafficked content appearing on MindGeek platforms, even after repeated public scandals made it increasingly obvious that much of the website’s content was illegal. The lawsuit states that with all of the evidence in the public eye, even the “densest inquisitor” would have been able to conclude that something was wrong with the platform. 

“The credit card companies and their members providing merchant banking to MindGeek were not uniquely incapable of understanding all of this. To the contrary, they were uniquely capable and in the best position to understand this. And they did understand this. They simply chose to do business with MindGeek and benefit from its trafficking venture nevertheless,” the lawsuit states.

In a statement released hours after the lawsuit was filed, MindGeek strongly denied the claims.

“The allegations in today’s complaint that Pornhub is a criminal enterprise that traffics women and is run like 'The Sopranos' are utterly absurd, completely reckless and categorically false,” the statement reads. “Pornhub has zero tolerance for illegal content and investigates any complaint or allegation made about content on our platforms.” 

Testimony from experts and evidence provided to the ETHI committee have raised serious doubts about MindGeek’s claims surrounding content moderation and its CSAM reporting practices. 

The controversy surrounding Pornhub has also reached other online platforms that provide a space to share adult content. XVideos, one of Pornhub’s biggest competitors, is now facing a class action lawsuit filed by the National Center on Sexual Exploitation. The saga has exposed just how hard it has become to control harmful material online, and the failure of governments and the private sector to do anything at all about it. 

Between 2015 and 2018, the RCMP’s National Child Exploitation Crime Centre (NCECC) saw a 566 percent increase in the number of CSAM reports. In the United States, the National Center for Missing and Exploited Children (NCMEC) saw reports of CSAM increase from 16.9 million in 2019 to 21.7 million in 2020. While reports of this deplorable material have increased every year, the regulations and reporting mechanisms meant to prevent it from being produced and shared have remained the same, allowing CSAM to proliferate and infest many of the world’s most popular websites. According to NCMEC, 95 percent of the 21.7 million reports of harmful content it received in 2020, mostly CSAM, came from Facebook. 

After a number of meetings filled with emotionally gutting testimony, Canada’s ETHI committee has finalized a list of recommendations that could place more power in the hands of law enforcement and increase legal obligations of private sector internet service providers to detect, locate and report this material on their platforms, but also prevent it from getting there in the first place.

Similar recommendations have been circling for years. Will the government finally listen?

 


 

Prior to Pornhub’s top executives appearing before Canada’s ETHI committee earlier this year, the stories swirling around their flagship website had outraged victims and advocates across the globe. 

In October 2019, a 15-year-old girl in Florida was found after being missing for nearly a year when her mother made the horrifying discovery of videos of her daughter posted to Pornhub. 

In January 2020, the company behind GirlsDoPorn, a partner channel on Pornhub, was ordered to pay $13 million to 22 women who claimed the company tricked them into doing pornography. The GirlsDoPorn scheme saw women offered jobs in modelling or similar work, but when they arrived on location, they were forced into contracts that required them to perform sexual acts. Many of the victims were told the videos would never be made public, only to discover them published on the most popular porn website in the world. In December 2020, one of the male performers and producers of the videos, Ruben Andre Garcia, pleaded guilty to sex trafficking charges, admitting he conspired with the owners of the company to trick women into appearing in the videos. Garcia was sentenced to 20 years in prison in June of this year. The FBI is currently offering a $10,000-reward to find the owner of the website, who fled when the charges were introduced in late 2019. Forty victims of the GirlsDoPorn channel are also suing Pornhub. 

In addition to the 34 women involved in the most recent lawsuit, widespread reports have emerged detailing allegations from women who claim non-consensual videos, or videos showing them as minors, were nearly impossible to remove from the website, while MindGeek staffers routinely ignored emails or said the videos would be removed, only for them to remain on the website for months afterward. Often, if eventually taken down, they would reappear shortly after. The problem is not isolated to MindGeek, according to reporting by the Canadian Centre for Child Protection (C3P), which has identified many of the internet’s largest service providers as being part of these increasingly predatory online practices.

Amidst the pandemic last year, advocate Laila Mickelwait launched the Trafficking Hub campaign. It not only created further awareness of the disturbing content on Pornhub, but advocated to get the website shut down for good. It also aims to hold the company’s executives accountable for allowing the harm to continue for years. 

“Attention is being brought to this problem and that is a positive step in the right direction because this has been allowed to go unchecked for too long,” Mickelwait told The Pointer in June 2020. “It’s time to get things back in order and to hold the company accountable.”

 

Founder of the Trafficking Hub campaign Laila Mickelwait.

(Screengrab from ParlVu)

 

In December 2020, a New York Times article featuring Fleites’s tragic story increased the scrutiny on Pornhub exponentially, with many asking why the federal government would allow a Canadian company — MindGeek’s headquarters are in Montreal — to operate in such a manner. 

The ETHI committee began receiving testimony a few months later, hearing first from Fleites, then from MindGeek’s top executives, marking one of their first public appearances to speak on behalf of the company and its role in the ongoing horror of child sexual abuse and the trafficking of women to generate online profits. 

Company CEO Feras Antoon and COO David Tassillo appeared calm and offered reassurances to the politicians that the company was doing all it could to prevent harmful content from appearing on its platforms. They claimed their security mechanisms are more technologically advanced than those of many other large service providers. 

“While we have remained steadfast in our commitment to protect our users and the public, we recognize that we could have done more in the past, and have to do more in the future,” Antoon said on February 5. “Even a single unlawful or non-consensual image on MindGeek’s platforms is one too many. Full stop.”

Evidence provided to the ETHI committee and testimony from leaders of North America’s largest child protection agencies cast considerable doubt on these claims, showing some to be outright falsehoods. Canada’s privacy commissioner Daniel Therrien told MPs in May that his office was investigating Pornhub and potential privacy violations after hearing the reports of women who allege the website refused to take down illegal content. A spokesperson told The Pointer in July that the investigation was still ongoing. 

MindGeek executives testified under oath that all instances of CSAM found on Pornhub are reported to the proper authorities. However, both the C3P in Canada and the NCMEC in the United States told the committee they only began receiving reports of suspected CSAM from Pornhub in late 2020 — when the New York Times exposé increased scrutiny of the company. 

The pair also testified under oath that every video that is published on the platform is reviewed by human moderators to ensure it abides by the website’s Terms of Service. 

“It is a lie,” a whistleblower quoted in the recent lawsuit states.

 

Pornhub's top executives (from left to right): Vice-President Corey Urman, CEO Feras Antoon, and COO David Tassillo.

(Screengrabs from ParlVu)

 

The ongoing discussion about Pornhub has become a microcosm of the larger issue of how to control illegal content on the internet. 

Currently, the onus falls on MindGeek and other large tech companies to monitor and report illegal content on their platforms to law enforcement. Under the Mandatory Reporting Act in Canada these companies have a legal obligation to report any suspected CSAM to police. The largest conundrum for many of these tech companies is the vast number of users on their platforms, which can make monitoring the sheer amount of posted content next to impossible without the proper technological assistance. 

During his testimony in February, Antoon explained there are different levels of responsibility when it comes to controlling illegal content, the first of which lies with the uploader.

“We have been leading this fight by being more vigilant in our moderation than almost any other platform, both within and outside the adult space,” Antoon told the committee. 

The recent lawsuit alleges that the majority of the illegal content posted on the website was in fact put there by MindGeek, not random users, through a system that involved various schemes, including “ripping” pornographic content from DVDs and uploading it to the site through fake profiles, and purchasing material in bulk — including CSAM, trafficked women and non-consensual acts — from hotbeds of human trafficking in Eastern Europe and Asia. 

As the lawsuit explains: “Insiders familiar with this elaborate scheme left no doubt that the Bro Club understood this was trafficked content: ‘100% they knowingly paid real pimps. They would discuss how this cheap content was coming from old school pimps. They found it exciting. They would explain, ‘we don’t need to pay studios in the US, low paid pimps come to us.’”

The whistleblower explained that while visiting one of these video farms, a Pornhub executive asked the producer where the women came from and where they lived. 

“The producer unapologetically explained that his company had agents that scoured Eastern Europe for women who they recruited with promises of lucrative modelling jobs that would allow them to go to college and otherwise have a better life. When those women agreed they were transported to dormitory style housing or apartments and matched with a ‘boyfriend’ who would groom them for porn.”

When the women would try to leave, they were told the videos would be sent to their families and released publicly if they did not continue performing. 

“Were we planning any efforts to stop that? Absolutely not. Because of views,” states one insider quoted in the lawsuit. “Every time you put an extra layer of control on who watches, you lose content. And it[’]s the same thing, in this case, if you put an extra layer of control on what content goes up, you lose content. And content in this case is more pages, and more pages is more results, more results is more paid views.”

During his initial testimony, before the lawsuit was filed, Bowe explained this idea that “content is king” lies at the heart of the MindGeek business model. It helped its executives and owners create the largest, and perhaps wealthiest, pornographic website in the world. 

The Ontario Court of Justice has repeatedly recognized the damage done to the individual, especially children, when this type of content is shared widely online. The most insidious impact is the repeated downloading and sharing of this material, making it almost impossible to completely erase from the internet. 

“The degradation of these children becomes both permanent and global,” the OCJ has stated. 

Following the New York Times exposé, Pornhub moved to adjust its platform restrictions: it eliminated the ability for anyone to download content, began allowing only verified users to share content on the site, and removed millions of videos posted by unverified accounts. These steps came only after years of negative attention generated by advocacy and media reporting; despite the desperate pleas of victims, Pornhub declined to act until the publicity threatened its profits.

During his testimony, Tassillo explained that any illegal content found on the platform is not deleted, in order to assist with any future law enforcement investigation, but it is also “fingerprinted” so that it can be flagged if a user attempts to upload the content a subsequent time.

However, according to the lawsuit, MindGeek is responsible for much of the reappearance of harmful videos on its website.

One eyewitness to this process explained that caches of disabled content “would be provided to employees on disks and they would be instructed to reupload those videos from non-MindGeek computers using specific email addresses that would allow the uploads to bypass MindGeek’s purported ‘fingerprinting’ of removed videos. . . . They say they kept the stuff on the servers to cooperate with authorities but it was really so they could reupload.”

The 34 women suing the company each have their own stomach-turning story of abuse. Jane Doe #1 was 7 years old when the abuse started. She was raped, trafficked and exploited by a “ring of Hollywood men and New York financiers,” including Jeffrey Epstein. The abuse lasted almost 15 years. Jane Doe #16 responded to an ad for a modelling job, but was then coerced into sex and raped. Jane Doe #17 was raped by two men who videotaped it and put it on Pornhub. When she explicitly told the website the video was of her rape, Pornhub said it would be removed. It remained on the website for another 648 days. Jane Doe #20 was trafficked and coerced into performing sexual acts on camera, usually while she was drugged. In one instance she was gang-raped by nearly 20 men. The videos were monetized on MindGeek platforms. 

“The abuse and pain from the assaults and sexual exploitation was so extreme that Jane Doe No. 20 ended up in the emergency room from a nervous breakdown. Years later, she continues to suffer from anxiety, panic attacks, and contemplated suicide,” the lawsuit states. Similar outcomes are explained for many of the victims in the lawsuit. 

Many of the Jane Does were underage when they were abused, or the material was created without their consent. Several were the victims of high-profile traffickers and abusers, including Epstein in New York; Derek Hay in California; Abdul Hasib Elahi in the United Kingdom; and Kamonsak Chanthasing in Thailand — all of whom are known to have preyed on multiple victims. 

The stories differ, but the abusive material’s appearance on Pornhub and the mostly ineffectual fight to get it removed is very much the same. 

 

John Clark, the president of NCMEC and Lianna McDonald, the executive director of C3P, testified that MindGeek only began reporting suspected CSAM in 2020, despite a legal obligation to do so existing since 2011.

(Screengrabs from ParlVu)

 

The testimony highlights the clear need for a system that better controls this material when a user attempts to publish it online, and improved methods for holding companies responsible when illegal content is allowed to live on their platforms. 

The need for tougher restrictions and enforcement mechanisms is something the C3P has been recommending for years. 

“We’ve been screaming from the rooftops that we are long overdue for regulation,” said Lianna McDonald, the executive director of C3P, during her testimony earlier this year. “Canada needs to assume a leadership role in cleaning up the nightmare that has resulted from an online world that is lacking any regulatory and legal oversight.”

 


 

“The internet is not a place that is governed. People talk about it being the wild west, it’s not the wild west. The wild west had rules. It may be questionable who was enforcing the rules, but make no mistake there were rules in the wild west. This is like the Walking Dead. It’s chaos,” explains Det. Andrew Ullock, the head of Peel Regional Police’s Internet Child Exploitation Unit. “There are no rules anymore and there’s like these zombies walking around and someone who you think is your friend is really your enemy and just all that chaos. That’s what the internet is and people don’t understand that.”

It is this lack of regulation that has allowed platforms like MindGeek to become infested — whether by their own doing or not — with illegal content. 

The legal onus falls on these companies to report illegal material when they become aware of it, meaning the crime has already taken place. There is no proactive approach. There is very little in place right now that can prevent this material from appearing online. 

“This has created this whole culture of social media and service providers of being able to stay hands-off, allowing them to be wilfully blind,” Ullock explains. “Unless they are willing to get onboard in a more proactive way, we’re never going to get ahead of it, they are the keepers of the medium. We don’t have the authority to police that medium, we are only reacting.”

Ullock equates the current model of online policing to a town with no locks on the doors and windows of all the houses, while police try to stop break-and-enters by going to these homes after they have been robbed, then reporting the crime. 

“That’s never going to work,” he says. “Until you’re willing to start putting locks on the doors and locks on the windows and getting ahead of it, that’s how you’re going to get the number of break-ins down.”

Following weeks of testimony, the ETHI committee has released a list of recommendations for government and law enforcement that could potentially serve as new locks for the doors and windows of the internet. 

The final report from the ETHI committee concludes that its study has provided a window into the world of adult websites and “how their content moderation practices have failed to protect the privacy and reputation of individuals online.” 

The committee delivers 14 recommendations geared toward improving the ability of law enforcement to respond to reports of CSAM, clarifying mandatory reporting regulations, restricting the downloading and reuploading of adult content, and creating a “legal framework” to place new obligations on internet service providers hosting pornographic content. 

This new framework would “compel” internet service providers to use available tools to “combat the flagrant and relentless reuploading of illegal content”; hire, train and effectively supervise staff to carry out moderation efforts; and maintain detailed records of user reports and responses that can be audited by police authorities. 

The committee also recommends the creation of “accessible mechanisms” to ensure illegal content can be removed quickly when it’s flagged and that victims be “given the benefit of the doubt with respect to the non-consensual nature of the content.”

Further, the committee wants to see the government develop regulations that ensure companies use a “robust process” when verifying the age of those in the videos on their platforms and proof of valid consent with “penalties severe enough to act as an effective deterrent.”

The recommendations, if taken up by the government, could also see the statute of limitations under the Mandatory Reporting Act extended from its current two years, and make the National Child Exploitation Crime Centre (NCECC) the designated reporting authority. Currently, the Act requires internet service providers to report to local law enforcement, which can create a disjointed response, with reports popping up across the country as police forces fail to effectively coordinate. 

Many of the report’s recommendations do little more than request police agencies to enforce laws that already exist, with vague language that indicates an unwillingness from the committee to impose serious restrictions on the internet. 

For example, when it comes to the downloading and reuploading of adult content, the committee did not recommend an outright ban on the ability of adult websites to allow users to download content, but simply stated that the Government of Canada “hold accountable” websites that allow downloading and reuploading that erases the identity of the initial poster of the content.

“The Committee is of the view that the onus to protect individuals depicted in CSAM and non-consensual content from violations of their privacy and reputation online should lie with the platforms hosting that content. Canadians’ privacy rights and by extension, their safety and dignity, should outweigh any profit motives that such platforms may have,” the report concludes. “Content-hosting platforms have a duty to exercise a high degree of caution when determining whether content meets legal standards for pornography, including but not limited to the age and consent of all persons depicted.”

Detective Ullock believes it will take a joint effort between government, police and service providers to truly begin pushing back against the volume of illegal content online. Simple solutions, like restrictions on downloading and requiring users who post content to be verified with government-issued identification, are ways he says things can be made difficult for those looking to share CSAM and other illegal content. 

“There are absolutely things we could do, realistic solutions, that we could make it incredibly difficult for people to commit sex offences against children over the Internet,” Ullock says. 

“We could get ahead of them, we could really make it difficult, but it requires people to understand the problem, admit that it exists, and have the willingness to solve it.”

The question of willingness is something that plagued the RCMP throughout the ETHI committee’s study. Throughout the process, politicians, Senators, survivors and advocates repeatedly called on the RCMP to investigate the claims against Pornhub, including its apparent violation of the Mandatory Reporting Act, which has been in place since 2011. 

“We expect Canada’s Royal Canadian Mounted Police to fulfil its mandate of combatting online child sexual abuse through proactive multi-jurisdictional investigations into every Canadian entity that contravenes our law,” reads a letter from 53 MPs and 20 Senators sent to the RCMP in March 2021. “It is absolutely unacceptable that a Canadian company continues to operate in Canada with seeming impunity to our laws.”

No investigation has been launched, with the RCMP stating for months that the request is under review. 

“We can’t comment on investigations or whether an incident is under investigation. The call for a criminal investigation into MindGeek or Pornhub is with the RCMP for review and any further required action,” an RCMP spokesperson told The Pointer in April. 

When contacted in July, the RCMP noted it had nothing further to add, referring to its previous statement. 

The recent U.S. lawsuit is a civil action, with damages to be determined at a later date, if successful. It remains unclear whether the MindGeek executives will face a criminal probe related to the deeply disturbing allegations. 

 

 


Email: [email protected]

Twitter: @JoeljWittnebel 




