November 24, 2024
A pair of cases address possible limits of the protection social media platforms have from liability.

The Supreme Court recently heard oral arguments in a case that could fundamentally alter social media.

Gonzalez v. Google, heard by the justices on Feb. 21, asks the highest court in America to determine whether the longtime internet liability shield known as “Section 230” covers content that a platform’s algorithms recommend to users, and how the platform presents those recommendations.


The case stems from the killing of then-23-year-old Nohemi Gonzalez, who was studying in Paris when she became the only American victim of a terrorist attack that claimed 129 other lives in the city. The Islamic State later claimed responsibility for the attacks.

Back in the United States, Gonzalez’s family sued multiple tech platforms, accusing them of radicalizing users into terrorists by promoting pro-ISIS third-party content. Google’s YouTube video-sharing platform is the only defendant that remains, and that is the case on which the Supreme Court heard oral arguments. It was paired with a similar suit, Twitter v. Taamneh, which was heard the next day.

Both cases address the possible limits of the protection social media platforms have from liability under Section 230 of the Communications Decency Act, passed as part of the 1996 Telecommunications Act. The law, now commonly shortened to “Section 230,” clarified liability for online sites hosting third-party content at a time when there was uncertainty about their legal responsibility. Legal precedent dealt with traditional publishers, including newspapers, and distributors such as bookstores.

But the online hosts were different: they were not filtering user content before it was posted, as a traditional publisher would, but only after it was posted, if at all. At the time, CompuServe’s chat board did not moderate posts at all and, under the existing precedent in liability law, was therefore not legally responsible for the content it hosted.

A rival service, Prodigy, wanted to take down potentially offensive posts from its users in order to make a family-friendly online environment, but it worried that doing so would trigger legal liability. That’s because, in the past, bookstores were not held liable if they didn’t know of illegal content in the materials they were selling but were on the hook if they did know about it and carried it for sale anyway. Moderating content seemed to be an admission that they knew what they were hosting.

In practice, Section 230 means that host sites cannot be sued for content posted by their users and that taking down any of that content will not trigger liability for the platform. Legal responsibility stays with the creator of the content, not the online host.

Those same issues are still at play, but they now apply to everything from small sites to social media platforms with billions of users. More than 500 hours of third-party content are posted to YouTube every minute, which works out to 720,000 hours per day.

Google and other major social media platforms argue that hosting at that volume is only possible because of the legal protections Section 230 affords them. Without it, the threat of legal costs from a flood of litigation would push sites either to take down far more content (just to be safe) or to allow everything (so they could claim the hands-off protection once enjoyed by bookstores). That would make for an internet devoid of anything the least bit controversial, or one polluted with violence, spam, and pornography and largely unusable for most people.

The plaintiff’s argument in the case has changed since its initial petition to the court. Legal counsel for the Gonzalez family at first framed the question as whether Section 230’s safe harbor covers third-party content when the host site’s algorithms recommend it, arguing that it should not. But at this week’s oral argument, the Gonzalez family’s lawyer, Eric Schnapper, concentrated more on whether the thumbnail links in YouTube’s “up next” suggestions constitute content created by the host rather than by a third party, which would make them ineligible for Section 230’s protections.

During oral arguments, multiple justices volunteered that they were “confused” about the argument Schnapper was trying to make. Chief Justice John Roberts noted that the YouTube algorithm was the same across the platform and that nothing special was employed in recommending the ISIS content. Several justices expressed concerns about the economic upset that would result if the court were to curtail Section 230’s liability shield and expose online platforms to increased litigation.


The oral arguments went on for almost three hours on Feb. 21, an unusually long amount of time for the justices to spend on a case. More than 70 briefs were filed with the Supreme Court as interested parties, including other social media platforms, think tanks, and advocacy groups, weighed in on the case.

The court’s ruling could have profound and widespread implications for social media platforms and their users. But depending on how the court decides the related Twitter case and how it intersects with the Anti-Terrorism Act’s liability provisions, the Supreme Court may be able to sidestep weighing in on the parameters of Section 230 altogether. America’s highest court is expected to rule on both cases in June.
