Curran Blog

By Ashley Curran

Widely considered the “backbone of all social networks,” algorithms “exist to sort the massive volume of content posted every day and show each user the content they are most likely to engage with.”[1] Algorithms are responsible for showing users the personalized content that platforms like Facebook, YouTube, and TikTok predict they will like, based on numerous factors.[2] However, the Supreme Court’s decision in Gonzalez v. Google could force online platforms to change the way they operate to avoid liability for the content promoted on their sites.

The Ninth Circuit consolidated appeals from the cases Gonzalez, Taamneh, and Clayborn, all of which stem from various terrorist attacks.[3] The plaintiffs in these cases brought actions against social media platforms, including Google.[4] Most of the claims were dismissed with prejudice, but the remaining claims concern whether Section 230(c)(1) of the Communications Decency Act (“Section 230”) shields social media platforms from liability for promoting and recommending content to their users and, in turn, helping terrorist groups like ISIS spread their message.[5] The Supreme Court granted certiorari and heard oral arguments on February 21, 2023.[6]

Section 230 of the Communications Decency Act

Arguably one of the most important pieces of legislation in internet history, “Section 230 was enacted in response to a problem that incipient online platforms were facing.”[7] Before its enactment, courts held “an online platform that passively hosted third-party content was not liable as a publisher if any of that content was defamatory, but that a platform would be liable as a publisher for all its third-party content if it exercised discretion to remove any third-party material.”[8] As a result, Congress enacted Section 230, which gives online platforms immunity for content posted by third parties, and lets the platforms remove content without facing liability.[9]

Gonzalez v. Google: Brief Background on the Gonzalez Plaintiff’s Claim

Ms. Gonzalez was a victim of the ISIS-led terrorist attacks in Paris on November 13, 2015 (“Paris Attacks”).[10] The attackers used social media to post links to ISIS recruitment videos.[11] The Gonzalez Plaintiffs’ theory of the case was that Google used algorithms to “match and suggest content to users based upon their viewing history[,]” and that Google thereby recommended ISIS videos to users and “assist[ed] ISIS in spreading its message.”[12]

Google moved to dismiss the case, arguing that Section 230 protects it from liability for content posted by third parties.[13] Agreeing with Google, the Ninth Circuit concluded:

[t]he Gonzalez complaint is devoid of any allegations that Google specifically targeted ISIS content, or designed its website to encourage videos that further the terrorist group’s mission. Instead, the Gonzalez Plaintiffs’ allegations suggest that Google provided a neutral platform that did not specify or prompt the type of content to be submitted, nor determine particular types of content its algorithms would promote.[14]

The Ninth Circuit’s conclusion in Gonzalez conflicted with its holding in the companion Taamneh appeal, which reached the Supreme Court as Twitter, Inc. v. Taamneh.[15] In Taamneh, the court held social media platforms “could be liable for aiding and abetting an act of international terrorism” by failing to remove content posted by supporters of terrorist organizations.[16] The Supreme Court granted certiorari to resolve the conflict.

Potential Implications of the Gonzalez Decision

The Supreme Court heard oral arguments in Gonzalez on February 21, 2023.[17] If the Court rules for the plaintiffs, online platforms could be held liable for promoting harmful content.[18] In turn, this could lead to “a more censored internet” and an overall change in the way online platforms operate, since “algorithm-generated recommendations . . . are employed in almost every instance of internet usage.”[19] Ultimately, “[a] ruling against Google will likely leave internet companies—from social media platforms to travel websites to online marketplaces—scrambling to reconfigure their businesses to avoid costly lawsuits.”[20] On the other hand, this decision could make online platforms take more measures to remove or moderate harmful content, making the internet safer overall. The outcome of the case is still uncertain, but it is sure to be a closely watched decision.

About the Author

Ashley is a second-year law student at Widener University Delaware Law School and a Staff Editor on the Delaware Journal of Corporate Law. She received her bachelor’s degree in history from the University of Delaware. While in law school, Ashley has worked as a Legal Methods I teaching assistant to Professor N.E. Millar, and she is currently a certified legal intern at Widener’s Veterans Law Clinic, where she helps low-income veterans appeal adverse VA decisions. Ashley is also a law clerk at Dalton & Associates in Wilmington, DE. She plans to take both the Pennsylvania and Delaware bar exams and practice law in Philadelphia upon graduation.

[1] Christina Newberry et al., Social Media Algorithms: A 2023 Guide for Every Network, Hootsuite (Nov. 7, 2022); see also Edward Longe, The Future of the Internet Heads to SCOTUS, The James Madison Inst. (Feb. 13, 2023) (explaining that an algorithm “is a set of rules and signals that automatically ranks content on a social platform based on how likely each individual social media user is to like it and interact with it”).

[2] Newberry et al., supra note 1.

[3] Gonzalez v. Google LLC, 2 F.4th 871, 880 (9th Cir. 2021), cert. granted, 143 S. Ct. 80 (2022), cert. granted sub nom. Twitter, Inc. v. Taamneh, 143 S. Ct. 81 (2022).

[4] Id.

[5] Id.; see also Gonzalez v. Google, SCOTUSBlog (last visited Apr. 2, 2023).

[6] Gonzalez, 2 F.4th at 880.

[7] Department of Justice’s Review of Section 230 of the Communications Decency Act of 1996, U.S. Dep’t of Just. Archives (last visited Apr. 2, 2023); see also 47 U.S.C.A. § 230 (West 2018); Ian Millhiser, The Supreme Court appears worried it could break the internet, Vox (Feb. 21, 2023, 3:28 PM).

[8] Department of Justice’s Review of Section 230 of the Communications Decency Act of 1996, supra note 7.

[9] Id.

[10] Gonzalez, 2 F.4th at 880.

[11] Id. at 881. The Plaintiff’s complaint also alleged that “Youtube has become an essential and integral part of ISIS’s program of terrorism, and that ISIS uses YouTube to recruit members, plan terrorist attacks, issue terrorist threats, instill fear, and intimidate civilian populations.” Id.

[12] Id.

[13] Id. at 882. The Plaintiff’s complaint also alleged “that Google is directly liable under § 2333(a) [of the Antiterrorism Act of 1990 (ATA)] for providing material support and resources to ISIS, and for concealing this support, in violation of 18 U.S.C. §§ 2339A, 2339B(a)(1), and 2339C(c).” Id. For the purposes of this article, only the claims associated with § 230 of the Communications Decency Act are discussed.

[14] Id. at 895.

[15] See generally Taamneh v. Twitter, Inc., 343 F. Supp. 3d 904 (N.D. Cal. 2018), rev’d and remanded sub nom. Gonzalez v. Google LLC, 2 F.4th 871 (9th Cir. 2021).

[16] Longe, supra note 1 (emphasis added) (internal quotation marks omitted).

[17] Id.

[18] Sabina Neschke et al., Gonzalez v. Google: Implications for the Internet’s Future, Bipartisan Policy Center (Nov. 29, 2022).

[19] Id.

[20] Emily Birnbaum, Google Case at Supreme Court Risks Upending the Internet We Know, Bloomberg Law (Feb. 16, 2023, 8:00 AM).