Facebook Isn’t Responsible as Terrorist Platform, Court Says
Facebook Inc. doesn’t have to face a lawsuit by victims of Hamas attacks and their relatives who claimed that the social network unlawfully assisted the terror group, a federal appeals court ruled.
In a 66-page ruling issued Wednesday, a divided court upheld a judge’s decision to throw out the case, saying an interactive computer service is not the publisher of third-party information when it uses tools that are designed to match content with consumer interests.
“Facebook does not edit (or suggest edits) for the content that its users — including Hamas — publish,” the Second Circuit Court of Appeals said, noting that the company only requires users to provide basic information and therefore acts as a “neutral intermediary.”
The lawsuit was among several around the U.S. testing whether victims of terrorist attacks and their families can hold social-media companies to account for allowing violent extremists to use their platforms to recruit followers. The terrorism victims were the first to argue that social-media companies could be held liable under the U.S. Anti-Terrorism Act.
The plaintiffs sued the company in federal court in New York in 2016, alleging it provided Hamas with a communications platform that enabled the attacks. A district judge dismissed the case in 2017, finding that the Communications Decency Act of 1996 prevents civil liability for claims that treat computer service providers or users as the publisher or speaker of information provided by someone else.
The appeals court said that if Facebook were even partly a creator or developer of the terrorism-related content, it wouldn’t be protected under the law. But it disagreed with the plaintiffs’ argument that the company helped develop Hamas’s content by directing it to users who are interested in the group and its terrorist activities, even if they aren’t seeking it.
“The algorithms take the information provided by Facebook users and ‘match’ it to other users — again, materially unaltered — based on objective factors applicable to any content, whether it concerns soccer, Picasso or plumbers,” the court wrote.
One of the three judges on the panel, Robert A. Katzmann, partly disagreed in a 35-page dissent, saying the decision stretches a law intended to encourage service providers to shield minors from obscene material “so that it now immunizes those same providers for allegedly connecting terrorists to one another.”
“Moreover, in part through its use of friend, group and event suggestions, Facebook is doing more than just publishing content: it is proactively creating networks of people,” Katzmann wrote. “Its algorithms forge real-world (if digital) connections through friend and group suggestions, and they attempt to create similar connections in the physical world through event suggestions. The cumulative effect of recommending several friends, or several groups or events, has an impact greater than the sum of each suggestion.”
The case is Force v. Facebook, 18-397, U.S. Court of Appeals for the Second Circuit (Manhattan).