Justices 'completely confused' during arguments in Section 230 case against Google that could reshape the internet

FILE - The YouTube logo is seen displayed on a mobile phone screen with a Google logo in the background. (Idrees Abbas/SOPA Images/LightRocket via Getty Images)

The Supreme Court heard oral arguments Tuesday in the first of two cases this week that explore the extent of the legal protections given to tech companies that provide a platform for third-party users to publish content.

The case in question, Gonzalez v. Google, deals with the recommendations and algorithms used by sites like YouTube, which go beyond simply allowing users to post content by arranging and promoting that content to users in particular ways. YouTube and Google are facing litigation over allegedly recommending videos created by ISIS and used to recruit new members.

Section 230 of the Communications Decency Act says sites like YouTube, Google, Facebook and Twitter are immune from legal claims based on content posted by their users. The justices and parties wrestled with whether the way that content is presented, whether through explicit recommendations or through an algorithm, is itself a form of speech by the platform.

At times, the issues and arguments got a bit murky, pushing even the justices to the point of confusion.

"I mean, we're a court. We really don't know about these things. You know, these are not like the nine greatest experts on the internet," Justice Elena Kagan said, drawing laughter from the room.

FILE - Justices of the US Supreme Court pose for their official photo at the Supreme Court in Washington, DC on Oct. 7, 2022. (OLIVIER DOULIERY/AFP via Getty Images)

Minutes earlier, Justice Samuel Alito told the plaintiff's attorney, Eric Schnapper, he was "completely confused by whatever argument you're making at the present time." Schnapper had been discussing how YouTube presents thumbnail images and links to different videos when providing search results or content that an algorithm believes a user might want. Schnapper argued that while the videos are created by users, the thumbnails are joint creations of the user and YouTube, piercing YouTube's Section 230 protection.

Later on, Google attorney Lisa Blatt dismissed that argument by noting that it was never part of the plaintiff's complaint in the case.

Justice Ketanji Brown Jackson, like Alito earlier, said at one point she was "thoroughly confused" by what was being argued because she believed the relevant issue was the scope of Section 230 immunity, not what might trigger liability. Schnapper responded that the question turns on whether certain actions or practices are covered by the law.

"The contention we’re advancing is that a variety of things that we’re loosely characterizing as recommendations fall outside of the statute," Schnapper told Jackson, summing up a central focus of the day's arguments.

"I guess the question is how do you get yourself from a neutral algorithm to an aiding and abetting?" Justice Sonia Sotomayor asked at one point.

Schnapper argued that YouTube's use of an algorithm to select and present a list of videos is itself a form of speech on YouTube's part, separate from the content of the videos themselves.

U.S. Deputy Solicitor General Malcolm Stewart appeared to take a more limited approach. Stewart, like others during the day's arguments, brought up a hypothetical in which a person enters a bookstore, asks for a book about Roger Maris, and is directed by the clerk to a table where the book is located. The clerk's direction would be speech about the book, separate from any speech contained within the book.

Sotomayor, however, pushed back against the idea that the store — or, in this case, YouTube, or similar tech companies — should be held liable for such speech.

"You're creating a world of lawsuits," she warned.

Stewart countered that even if there were more lawsuits, those suits would fail unless they alleged that the company's speech violated some other law. If YouTube used a neutral algorithm that treated ISIS videos the same way it treated cat videos, it would be unlikely to violate antiterrorism laws.

If Google or another company were to outright say that it recommends a video, however, that statement would be unprotected speech promoting the content.

Blatt noted that the word "recommendation" does not appear anywhere on YouTube's site, so the company is not engaging in that unprotected activity.

"[V]ideos don’t appear out of thin air," Chief Justice John Roberts countered. "They appear pursuant to the algorithm your clients have."

Blatt explained that all search engines take a degree of a user's information into account when presenting results, such as their location, search history and language. For instance, she noted, someone searching for "football" in the U.S. would get different results than someone in Europe, where the term refers to soccer.

Blatt also argued that it makes no difference whether an algorithm is neutral, because an algorithm that was, for example, pro-ISIS would fall under the statute's exception for criminal activity.

"So, if you have material support in collusion with ISIS, that's excepted from the statute," she said.

The Supreme Court will again hear arguments over Section 230 on Wednesday when it takes up Twitter v. Taamneh. That case explores whether Twitter and other social media platforms can be held liable for aiding ISIS by giving the group a platform.