Twitter refused to take down widely shared pornographic images and videos of a teenage sex trafficking victim because an investigation “didn’t find a violation” of the company’s “policies,” a scathing lawsuit alleges.
The federal suit, filed Wednesday by the victim and his mother in the Northern District of California, alleges Twitter made money off the clips, which showed a 13-year-old engaged in sex acts and are a form of child sexual abuse material, or child porn, the suit states.
The teen — who is now 17 and lives in Florida — is identified only as John Doe and was between 13 and 14 years old when sex traffickers, posing as a 16-year-old female classmate, started chatting with him on Snapchat, the suit alleges.
Doe and the traffickers allegedly exchanged nude photos before the conversation turned to blackmail: If the teen didn’t share more sexually graphic photos and videos, the explicit material he’d already sent would be shared with his “parents, coach, pastor” and others, the suit states.
Acting under duress, Doe initially complied, sending videos of himself performing sex acts; he was also told to include another child in his videos, which he did, the suit claims.
Eventually, Doe blocked the traffickers and they stopped harassing him, but at some point in 2019, the videos surfaced on Twitter under two accounts that were known to share child sexual abuse material, court papers allege.
Over the following weeks, the videos would be reported to Twitter at least three times — first on Dec. 25, 2019 — but the tech giant failed to act until a federal law enforcement officer got involved, the suit states.
Doe became aware of the tweets in January 2020 because they’d been viewed widely by his classmates, which subjected him to “teasing, harassment, vicious bullying” and led him to become “suicidal,” court records show.
While Doe’s parents contacted the school and filed police reports, Doe submitted a complaint to Twitter, stating that two tweets depicted child pornography of him and needed to be removed because they were illegal, harmful and in violation of the site’s policies.
A support agent followed up and asked for a copy of Doe’s ID to verify his identity; after the teen complied, there was no response for a week, the family claims.
SOURCE: New York Post, Gabrielle Fonrouge