WASHINGTON — The Biden administration told the U.S. Supreme Court that social media companies in some cases can be held liable for promoting harmful speech, partially siding with a family seeking to sue Alphabet Inc.’s Google over a terrorist attack.
In a Supreme Court filing on Wednesday night, the Justice Department argued that social media websites should be held responsible for some of the ways their algorithms decide what content to put in front of users.
The case, likely to be argued early next year, revolves around the family of Nohemi Gonzalez, a 23-year-old U.S. citizen who was killed by ISIS in Paris in November 2015. Her family is arguing that YouTube, which Google owns, violated the Anti-Terrorism Act because its algorithms recommended ISIS-related content.
The Justice Department did not side outright with the Gonzalez family. Instead, the government argued that the family should get another crack before the federal appeals court that tossed out the complaint against Google. The government said social media companies shouldn't be held liable simply for allowing content to be posted or for failing to remove it.
The case could narrow the country’s interpretation of Section 230 of the Communications Decency Act, the tech industry’s prized liability shield that protects social media platforms from being held liable for content generated by users.
“The statute does not bar claims based on YouTube’s alleged targeted recommendations of ISIS content,” wrote acting U.S. Solicitor General Brian Fletcher.
A coalition of 26 states and Washington, D.C., also filed a brief in support of Gonzalez, arguing that courts have adopted an overly broad interpretation of Section 230. The states said the statute, as currently read, prevents them from enforcing state laws against criminals who operate online.
Congress has long debated whether to reform Section 230, which was passed in 1996, before the modern internet came to dominate everyday life. Lawmakers on both sides of the aisle have argued that the sweeping immunity has enabled social media companies to make editorial decisions affecting billions of people without consequences. But Congress has struggled to pass bipartisan legislation on the issue, leaving questions of online speech to the courts.
Most of the Supreme Court justices have not made any public statements about their views on Section 230 – except Justice Clarence Thomas, who last year said the court should consider treating social media companies like public utilities. That would enable the government to create a much more aggressive regulatory regime around companies like Meta Platforms Inc., Twitter Inc. and YouTube.
The Gonzalez v. Google case has already attracted attention from some senators on Capitol Hill. Republican Senators Ted Cruz of Texas and Josh Hawley of Missouri submitted briefs in support of reforming Section 230, which has long drawn the ire of conservatives hoping to punish social media companies for allegedly censoring conservative content.
Google has argued that narrowing Section 230 could make it harder for YouTube and other social media platforms to remove terrorist content.
“Through the years, YouTube has invested in technology, teams, and policies to identify and remove extremist content,” said Google spokesman José Castañeda. “We regularly work with law enforcement, other platforms, and civil society to share intelligence and best practices. Undercutting Section 230 would make it harder, not easier, to combat harmful content — making the internet less safe and less helpful for all of us.”
The Justice Department sided with Twitter and Google in a separate Supreme Court case involving social media this week. At issue in Twitter v. Taamneh is whether Twitter violated the Anti-Terrorism Act by failing to enforce policies against pro-terrorist content on its platform. Fletcher argued in a filing on Tuesday night that Taamneh's family had failed to prove that Twitter was intentionally "aiding and abetting" terrorism.
The cases are Gonzalez v. Google, 21-1333, and Twitter v. Taamneh, 21-1496.