In a recent development, the US Supreme Court declined to hear an appeal from a group of women who accused Reddit of profiting from posts containing child pornography. The case, Jane Does 1-6 et al. v. Reddit, Inc., raised the question of whether the social media platform had knowingly participated in sex trafficking. It originated as a class action lawsuit brought by a woman identified as “Jane Doe,” who alleged that her ex-boyfriend had posted explicit images of her taken when she was a minor without her consent, and that Reddit had failed to adequately respond to her repeated reports about the posts.
The plaintiffs suffered further distress as the explicit images were repeatedly reposted on the platform. Despite Doe’s vigilant monitoring of 36 subreddits and her multiple reports, Reddit allegedly did little to help, even though the lawsuit claimed the offending content was uploaded from a single IP address that Reddit knew about.
Nonetheless, the district court ruled in Reddit’s favor, citing Section 230 of the 1996 Communications Decency Act, a federal law that broadly shields websites from liability for content published by their users. The 9th U.S. Circuit Court of Appeals then unanimously upheld that ruling. The panel noted that the plaintiffs had failed to establish a link between the child pornography posted on Reddit and the revenue the company generated, beyond the fact that Reddit earned advertising revenue from popular subreddits.
Despite Reddit’s recent policy changes banning explicit content such as revenge porn, and its efforts to improve the removal of reported material, the National Center on Sexual Exploitation (NCOSE) included Reddit in its “Dirty Dozen List,” a compilation of businesses that profit from or facilitate sexually exploitative practices. NCOSE argues that Reddit’s policies have not been effective at proactively preventing or removing abuse.
This case is not unique to Reddit. Twitter has also been sued over users posting child pornography. Earlier this month, the 9th Circuit dismissed a lawsuit alleging that Twitter had profited from sex trafficking by allowing tweets containing explicit images of two 13-year-old boys, ruling in Twitter’s favor on the basis of the same Section 230 of the Communications Decency Act.