Twitter Faces Lawsuit Over Failure To Remove Child Porn From Platform

Twitter is once again in hot water, this time as a new lawsuit filed against the social media giant claims that the platform not only failed to remove pornographic images of an underage sex trafficking victim, but that it also failed to declare the offending content to be in violation of its policies.

Related: Twitter CEO Jack Dorsey Explains Company’s Decision To Ban President Donald Trump

As reported by the New York Post, the lawsuit was filed in the Northern District of California by a young victim who alleged that “Twitter made money off the clips, which showed a 13-year-old engaged in sex acts and are a form of child sexual abuse material, or child porn.”

“The teen — who is now 17 and lives in Florida — is identified only as John Doe and was between 13 and 14 years old when sex traffickers, posing as a 16-year-old female classmate, started chatting with him on Snapchat,” the lawsuit explained. “Doe and the traffickers allegedly exchanged nude photos before the conversation turned to blackmail: If the teen didn’t share more sexually graphic photos and videos, the explicit material he’d already sent would be shared with his ‘parents, coach, pastor’ and others.”

Related: Star Wars Actor Mark Hamill Calls For Twitter To Ban President Donald Trump

According to the suit, the victim eventually relented, sending explicit images and videos of himself to the traffickers. Soon thereafter, the victim was ordered to involve other children in the production of the illegal media, an order he complied with under duress.

Though Doe put an end to the situation by blocking the traffickers, the suit alleges that “at some point in 2019, the videos surfaced on Twitter under two accounts that were known to share child sexual abuse material.”

“Over the next month, the videos would be reported to Twitter at least three times — first on Dec. 25, 2019 — but the tech giant failed to do anything about it until a federal law enforcement officer got involved,” the suit stated. Court records show that Doe “became aware of the tweets in January 2020 because they’d been viewed widely by his classmates, which subjected him to ‘teasing, harassment, vicious bullying’ and led him to become ‘suicidal.’”

Related: Netflix’s “Cheer” Star Jerry Harris Indicted On Child Pornography And Sex Crime Charges

However, after filing an official report with law enforcement officers, the victim filed a complaint with Twitter itself, arguing that the tweets in question “needed to be removed because they were illegal, harmful and were in violation of the site’s policies.”

After receiving a single follow-up message from Twitter asking for the victim’s identification, Doe and his family allege that they did not receive a response from Twitter for a week, only to be told that the platform “wouldn’t be taking down the material, which had already racked up over 167,000 views and 2,223 retweets.”

“Thanks for reaching out. We’ve reviewed the content, and didn’t find a violation of our policies, so no action will be taken at this time,” the response is said to have read. “If you believe there’s a potential copyright infringement, please start a new report. If the content is hosted on a third-party website, you’ll need to contact that website’s support team to report it. Your safety is the most important thing, and if you believe you are in danger, we encourage you to contact your local authorities.”

Related: Funimation Voice Actor Chris Thurman Claims Responsibility For Sending Sexually Explicit Photos To Underage Fan

The suit further explains that the content was only removed after Doe’s mother, through a personal connection, was put in contact with the Department of Homeland Security, which had the videos successfully removed on January 30th.

“Only after this take-down demand from a federal agent did Twitter suspend the user accounts that were distributing the CSAM and report the CSAM to the National Center for Missing and Exploited Children,” Doe’s lawyers wrote in the suit. “This is directly in contrast to what their automated reply message and User Agreement state they will do to protect children.”

When reached for comment by the Post, a Twitter spokesperson told the news outlet that “Twitter has zero-tolerance for any material that features or promotes child sexual exploitation.”

“We aggressively fight online child sexual abuse and have heavily invested in technology and tools to enforce our policy,” the spokesperson wrote. “Our dedicated teams work to stay ahead of bad-faith actors and to ensure we’re doing everything we can to remove content, facilitate investigations, and protect minors from harm — both on and offline.”

What do you make of this new lawsuit? Let us know your thoughts on social media or in the comments down below!
