Internet Firms Must Face Defective Product Liability Suit Over Buffalo Mass Shooting

By Andrew G. Simpson | March 22, 2024

A New York state judge has allowed a product liability case seeking to hold social media companies liable for the tragic 2022 mass killing at a Buffalo supermarket to go forward.

Erie County Supreme Court Judge Paula L. Feroleto denied the bid by Meta, Snap, Alphabet, Reddit, Amazon and other platforms to have the victims’ families’ claims dismissed on the grounds that the firms are entitled to immunity as publishers under federal communications law, Section 230 of the Communications Decency Act (CDA).

Judge Feroleto concluded that it is premature to dismiss the plaintiffs’ causes of action against the internet defendants. She found that the families of the shooting victims successfully argued that the companies should face product liability claims based on the alleged “addictive and dangerously defective design” of their platforms. The families maintain that the social media platforms radicalized the shooter to commit white supremacist murders at the Buffalo supermarket.

The judge agreed with the families that the federal communications law and First Amendment relied upon by the social media firms do not apply in a product liability lawsuit.

“This historic ruling will for the first time permit victims of racist, antisemitic, anti-immigrant, and homophobic violence to hold social media companies accountable for contributing to the epidemic of mass shootings that is plaguing our nation,” said Matthew P. Bergman, an attorney with the Social Media Victims Law Center, which is representing several of the families.

While this is the first case to link social media to a mass shooting, other cases involving alleged harm done by social media have cited product liability arguments. Last November, a federal district judge in northern California allowed a lawsuit blaming social media firms including Meta, Google and TikTok for mental health harm to teenage users to proceed on product liability grounds. A federal appeals court in Philadelphia is also facing the question of whether TikTok should be held liable under product liability for the death of a teenage girl who participated in a Blackout Challenge.

Buffalo Massacre

On May 14, 2022, 10 Black people were killed and three people were injured in the mass shooting at Tops Friendly Markets supermarket in Buffalo. The shooter, Payton Gendron, sought out a historically Black neighborhood and drove hundreds of miles from his home to commit his attack.

The Erie County judge wrote that it is “essentially undisputed” that the horrific acts perpetrated by Gendron were motivated by the concept of “white replacement theory.”

The complaint alleges that Gendron was motivated by white supremacism and radicalized by the algorithms driving the social media products that fed him increasingly racist, antisemitic, and violence-inducing content. According to comments from Gendron’s own criminal defense attorney, “The racist hate that motivated this crime was spread through on-line platforms….”

The complaint says Gendron admits he was “lured, unsuspectingly, into a psychological vortex by defective social media applications and fed a steady stream of racist and white supremacist propaganda and falsehoods.”

The social media firms argue that the white supremacist theory is third-party content and thus Section 230 and the First Amendment preclude the damages claims against them. Section 230 says that internet services are immune from liability for publishing material, so long as the information is provided by another party.

The plaintiffs contend that the platforms are more than just message boards containing third-party content. Instead, the plaintiffs’ lawyers maintain that the platforms are negligently, defectively and harmfully designed “products” that are meant to be addictive to young users and that drove Gendron to further platforms or postings that indoctrinated him with “white replacement theory.” The platforms are therefore liable based on product liability theories, the plaintiffs argue.

The families have asserted multiple causes of action against the internet companies, including strict products liability for defective design and failure to warn; negligence; negligent failure to warn; invasion of privacy; negligent and intentional infliction of emotional distress; and unjust enrichment. Not all of these causes of action are asserted against all of the social media/internet defendants.

Product Law

The judge noted that what constitutes a product under New York law is “not confined to tangible chattels” and that the state has “rejected a bright-line rule for the application of product liability law,” choosing instead to define a product within a “broader context of common law duty.”

The state’s highest court, the Court of Appeals, has emphasized three factors in determining whether an item is a product: (1) a defendant’s control over the design and standardization of the product, (2) the party responsible for placing the product into the stream of commerce and deriving a financial benefit, and (3) a party’s superior ability to know—and warn about—the dangers inherent in the product’s reasonably foreseeable uses or misuses.

The judge found that, contrary to the assertions of the defendants, the factual allegations as a whole of the complaint are sufficient to allege viable causes of action against each of the internet firms. The judge explained:

“Many of the social media/internet defendants have attempted to establish that their platforms are mere message boards and/or do not contain algorithms subjecting them to the protections of the CDA and/or First Amendment. This may ultimately prove true. In addition, some defendants may yet establish that their platforms are not products or that the negligent design features plaintiffs have alleged are not part of their platforms.”

However, the judge continued, at this stage of the litigation the court must base its ruling on the allegations of the complaint and not “facts” asserted by the defendants in their briefs or during oral argument.

Proximate Cause

One of the internet defendants’ arguments is that the criminal actions of the shooter break the chain of causation between his use of their platforms and the ensuing shooting. The complaint sets forth in detail the development of the plan culminating in the shootings. The complaint notes that Gendron made public 700 pages of a diary that detailed his plans and what he termed a “manifesto” describing his racist motivations.

But the judge also declined to dismiss the suit for lack of proximate cause.

“At this juncture of the litigation it is far too early to rule as a matter of law that the actions, or inaction, of the social media/internet defendants through their platforms require dismissal on proximate cause. The facts alleged do not show ‘only one conclusion’ that could be made as to the connection between these defendants’ alleged negligence and the plaintiffs’ injuries,” the opinion states.

The social media defendants also argue that they did not owe the families a duty of care. But the judge countered: “It is long-established in New York products liability jurisprudence that a manufacturer of a defective product is liable to ‘any person injured from the product.’”

Gendron is serving a prison sentence of life without parole after pleading guilty to crimes including murder and domestic terrorism motivated by hate.

AP Photo: Police walk outside the Tops grocery store May 15, 2022, in Buffalo, N.Y., site of the racist mass shooting. (AP Photo/Joshua Bessex, File)
