U.S. Supreme Court Declines to Hear Bid to Sue Reddit over Child Porn
The U.S. Supreme Court on Tuesday declined to hear a bid by child pornography victims to overcome a legal shield for internet companies. The underlying lawsuit accused Reddit Inc of violating federal law by failing to rid its discussion website of this illegal content.
The Supreme Court on May 19 sidestepped an opportunity to narrow the scope of Section 230 immunity in a separate case.
Section 230 of the Communications Decency Act of 1996 protects “interactive computer services” by ensuring they cannot be treated as the “publisher or speaker” of information provided by users. The Reddit case explored the scope of a 2018 amendment to Section 230 called the Fight Online Sex Trafficking Act (FOSTA), which allows lawsuits against internet companies if the underlying claim involves child sex trafficking.
Reddit allows users to post content that is moderated by other users in forums called subreddits. The case centers on sexually explicit images and videos of children posted to such forums by users. The plaintiffs – the parents of minors and a former minor who were the subjects of the images – sued Reddit in 2021 in federal court in California, seeking monetary damages.
The plaintiffs accused Reddit of doing too little to remove or prevent child pornography and of financially benefiting from the illegal posts through advertising in violation of a federal child sex trafficking law.
The San Francisco-based 9th U.S. Circuit Court of Appeals concluded in 2022 that for the FOSTA exception to apply, plaintiffs must show that an internet company “knowingly benefited” from the sex trafficking through its own conduct.
Instead, the 9th Circuit concluded, the allegations “suggest only that Reddit ‘turned a blind eye’ to the unlawful content posted on its platform, not that it actively participated in sex trafficking.”
Reddit said in court papers that it works hard to find and prevent the sharing of child sexual exploitation materials on its platform, giving all users the ability to flag posts and using dedicated teams to remove illegal content.
The Supreme Court on May 19 declined to rule on a bid to weaken Section 230 in a case seeking to hold Google LLC liable under a federal anti-terrorism law for allegedly recommending content by the Islamic State militant group to users of its YouTube video-sharing service. Google and YouTube are part of Alphabet Inc.
Calls have come from across the ideological and political spectrum – including Democratic President Joe Biden and his Republican predecessor Donald Trump – for a rethink of Section 230 to ensure that companies can be held accountable for content on their platforms.
“Child pornography is the root cause of much of the sex trafficking that occurs in the world today, and it is primarily traded on the internet, through websites that claim immunity” under Section 230, the plaintiffs said in their appeal to the Supreme Court.
Allowing the 9th Circuit’s decision to stand, they added, “would immunize a huge class of violators who play a role in the victimization of children.”