Platform Liability Law: Holding Companies Accountable
One of the biggest legal challenges in suing online platforms for child exploitation is Section 230 of the Communications Decency Act, which provides broad immunity to internet companies. However, this shield is not absolute, and recent legal developments have created pathways for holding platforms accountable when their design choices and business decisions enable harm to children.
Understanding Section 230
Section 230 of the Communications Decency Act, passed in 1996, is one of the most important and controversial laws governing the internet. It provides online platforms with immunity from liability for content posted by their users.[1]
The law's key provision, 47 U.S.C. § 230(c)(1), states that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." In practical terms, this means that if a user posts something harmful on a platform, the platform generally cannot be sued for that content.[2]
Why Section 230 Was Created
Section 230 was enacted in response to a 1995 case called Stratton Oakmont v. Prodigy. In that case, a court held that Prodigy could be liable for defamatory content posted by users because Prodigy exercised some editorial control by moderating content. This created a perverse incentive for platforms to do no moderation at all to avoid liability.[3]
Representatives Chris Cox and Ron Wyden introduced Section 230 as part of the Internet Freedom and Family Empowerment Act. Their goal was to encourage platforms to moderate harmful content without fear of becoming liable for everything users posted. The law was also intended to empower parents to restrict their children's access to inappropriate material online.[4]
Congress originally enacted the statute to nurture a nascent industry while incentivizing online platforms to remove content harmful to children. The combination of significant technological changes since 1996 and the expansive interpretation courts have given Section 230, however, has left online platforms both immune from liability for a wide array of illicit activity on their services and free to moderate content with little transparency or accountability.[5]
How Section 230 Has Been Applied
For more than 25 years, Section 230 has provided sweeping protection to online platforms. Courts have interpreted it broadly to dismiss lawsuits based on user-generated content, including cases involving defamation, harassment, and even child exploitation.[6]
Courts applying Section 230 have held that users and services cannot be sued for forwarding email, hosting online reviews, or sharing photos or videos that others find objectionable. The law has also helped courts quickly resolve lawsuits that lack any legal basis, sparing platforms costly litigation.[7]
However, this broad immunity has had troubling consequences. A tragic example came in 2016 when it was revealed that classified ads website Backpage.com had been facilitating underage sex trafficking for years. Because of Section 230, victims and state prosecutors were unable to hold Backpage accountable for its role in enabling and profiting from these crimes. It took a two-year Senate investigation and federal criminal prosecution to finally shut Backpage down in 2018.[8]
Exceptions to Section 230 Immunity
While Section 230 provides broad immunity, it is not absolute. The law itself contains several important exceptions, and courts have recognized additional limitations.[9]
Federal Criminal Law Exception
Section 230 explicitly states that nothing in the law shall be construed to impair the enforcement of federal criminal law, including laws relating to obscenity or sexual exploitation of children. This means platforms can still face federal criminal prosecution for child sexual abuse material on their services.[10]
FOSTA Exception for Sex Trafficking
In response to the Backpage scandal, Congress passed the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA) in 2018. FOSTA created an exception to Section 230 for content related to sex trafficking.[11]
FOSTA permits plaintiffs to bring claims under a beneficiary theory, meaning companies can be liable for a victim's harms if they benefited from participating in a venture with sex traffickers. In one case, Doe v. Twitter, a federal court allowed such a claim to proceed on allegations that Twitter participated in a venture with traffickers because it failed to disable an account after being informed that the account had posted child sexual abuse material.[12]
Intellectual Property Exception
Section 230 does not apply to lawsuits alleging violations of federal intellectual property law. This exception has been less relevant to child safety cases but demonstrates that Section 230 immunity has limits.[13]
State Laws Consistent with Section 230
Section 230 does not prevent states from enforcing state laws that are consistent with Section 230's provisions. This has allowed some state laws targeting platforms to proceed.[14]
Recent Challenges: Cracks in the Armor
In recent years, courts and advocates have identified important limitations to Section 230 that create pathways for holding platforms accountable, particularly in child safety cases.[15]
Product Liability and Design Defect Claims
One of the most significant recent developments is the recognition that Section 230 may not protect platforms from claims based on their own product design decisions rather than user-generated content.[16]
The landmark case is Lemmon v. Snap, Inc., decided by the Ninth Circuit in 2021. Parents sued Snap after their sons died in a high-speed car crash. Before the crash, the boys had used Snapchat's Speed Filter, a feature that overlays a user's real-time speed on a post. The plaintiffs argued the feature incentivized reckless driving by encouraging users to document high speeds for social media credibility.[17]
Snapchat invoked Section 230, but the Ninth Circuit allowed the case to proceed. The court found the claim was about Snap's own product design, not the publishing of third-party content. The duty not to design a product that actively encourages dangerous behavior is independent of any duty to monitor or remove user content.[18]
This ruling created a critical opening for child safety cases. Attorney Matthew Bergman, who has become one of the leading lawyers suing social media companies on behalf of children, has built his strategy around this product liability theory.[19]
"It was long thought that Section 230 was like an immovable beast," according to Professor Danielle Citron of the University of Virginia School of Law. "What we have seen is a chink in that armor."[20]
Algorithm and Recommendation Liability
Another promising avenue for platform accountability involves claims that a platform's recommendation algorithms or features caused harm, rather than merely hosting harmful content.[21]
The New Mexico Attorney General sued Snap Inc., the maker of Snapchat, alleging that the platform's recommendation algorithm enabled child sexual exploitation. The theory is that the design of the algorithm represents the platform's own conduct, not merely the hosting of third-party content.[22]
Historically, as long as a platform did not itself create or co-develop content, it was free to monetize and algorithmically amplify that content without fear of liability. However, some courts are now questioning whether recommendation algorithms that promote harmful content should receive Section 230 protection.[23]
Failure to Implement Safety Features
Several cases have argued that platforms can be liable for failing to implement basic safety measures to protect children, even if the ultimate harm came from user-generated content.[24]
The argument is that the failure to implement age verification, monitoring systems, or other safety features represents the platform's own negligent conduct. This is distinct from liability for hosting user content.[25]
State Attorney General Lawsuits
A series of recent rulings has signaled that courts may no longer allow Section 230 to serve as a catch-all liability shield for social media platforms. Novel legal strategies that state attorneys general are adopting to hold companies accountable for their own conduct, including their design decisions, appear to be the driving factor in clearing these legal hurdles.[26]
The Massachusetts Attorney General sued Meta using a public nuisance theory. The suit accuses Meta of creating a public nuisance by designing Instagram to addict young users at the expense of children's mental wellbeing, and of falsely representing the platform's safety features to the public.[27]
A Superior Court judge rejected Meta's assertion that the alleged harms stemmed from third parties acting independently of Meta. The complaint, the court explained, is primarily based on Meta's own conduct rather than on third-party content; the judge therefore denied Meta Section 230 immunity and allowed the public nuisance claim to proceed.[28]
Indiana successfully appealed a lower court's dismissal of its lawsuit against TikTok. The appeals court held that TikTok's business model of providing access to content in exchange for users' personal data qualifies as a consumer transaction under Indiana's Deceptive Consumer Sales Act, the statute the suit accuses TikTok of violating.[29]
A coalition of 14 state attorneys general, led by California and New York, filed suit against TikTok, alleging that the company misleads the public about the platform's safety and harms young people's mental health.[30]
Promissory Estoppel and Breaking Platform Promises
In limited circumstances, platforms may be liable for breaking specific promises they made to users about safety or content moderation. The key case is Barnes v. Yahoo, where Yahoo promised to remove a harmful profile and then failed to do so.[31]
However, courts have been cautious about expanding this exception. The concern is that if any general safety statement in a platform's policies could create liability whenever the platform failed to remove some harmful content, Section 230 immunity would be significantly eroded.[32]
Strategies for Overcoming Section 230
Experienced attorneys representing child exploitation victims have developed sophisticated strategies for bringing claims that fall outside Section 230's protection.[33]
Focus on Platform Conduct, Not User Content
The key to overcoming Section 230 is framing claims to focus on the platform's own conduct rather than merely hosting user content. This includes product design choices that create dangers, algorithmic recommendations that promote harmful content, failure to implement industry-standard safety measures, deceptive marketing about safety features, and features that incentivize or facilitate illegal behavior.[34]
Product Liability Theories
Product liability law provides a powerful framework for platform accountability. The argument is that platforms are selling a product (their app or service) and that product can be defective in ways that cause harm.[35]
Product liability law was designed to cover physical products that cause personal injury, like Coke bottles that explode. Courts, however, are increasingly recognizing that software products can also be defective. As the Lemmon court recognized, the duty not to design a product that actively encourages dangerous behavior exists independently of any duty to monitor or remove user content.[36]
Knowing Participation in Illegal Activity
Courts have held that Section 230 does not protect platforms that knowingly participate in illegal activity. This requires showing the platform had actual knowledge of specific illegal conduct and took affirmative steps to facilitate it.[37]
In Doe v. Salesforce, a court allowed claims against Salesforce for allegedly providing services to Backpage despite knowing the site was being used for sex trafficking. The claims centered on Salesforce's own conduct in facilitating a criminal enterprise, which falls outside Section 230's scope.[38]
State Consumer Protection Laws
Claims under some state consumer protection laws have survived Section 230 by focusing on deceptive business practices rather than content moderation. California's AB 1394, for example, requires social media platforms to prevent commercial sexual exploitation of minors and allows victims to sue platforms that facilitated abuse, with damages ranging from $1 million to $4 million per violation.[39]
FOSTA Claims for Sex Trafficking
For cases involving sex trafficking or commercial sexual exploitation, FOSTA provides an explicit exception to Section 230. Plaintiffs can bring claims if the platform knowingly benefited from participating in a venture that involved sex trafficking.[40]
Department of Justice Proposed Reforms
The US Department of Justice conducted a comprehensive review of Section 230 and proposed several reforms to realign the law with the realities of the modern internet.[41]
Bad Samaritan Carve-Out
The DOJ proposed denying Section 230 immunity to platforms that purposefully facilitate or solicit third-party content or activity that would violate federal criminal law. The title of Section 230's immunity provision, "Protection for 'Good Samaritan' Blocking and Screening of Offensive Material," makes clear that the immunity is meant to incentivize and protect responsible online platforms.[42]
Carve-Outs for Child Abuse, Terrorism, and Cyber-Stalking
The DOJ proposed exempting from immunity specific categories of claims that address particularly egregious content, including child exploitation and sexual abuse, terrorism, and cyber-stalking. These targeted carve-outs would halt the over-expansion of Section 230 immunity and enable victims to seek civil redress.[43]
Notice-Based Liability for Federal Crimes
The DOJ proposed that Section 230 immunity should not apply when a platform has notice or actual knowledge that content violates federal criminal law and fails to act. For unlawful content tied to federal crimes like child exploitation, the DOJ reasoned, there is far less cause for concern about chilling such activity, and the priority should instead be halting such dangerous behavior.[44]
Pending Congressional Reform Proposals
Multiple bills have been introduced in Congress to reform or limit Section 230, though none have been enacted as of this writing.[45]
EARN IT Act
The EARN IT Act targets online child sexual abuse material. It proposes to condition Section 230 immunity on platforms following best practices to curb child sexual abuse material, effectively removing protections from platforms that do not earn them through compliance. Civil liberties groups have opposed the bill over concerns that it would undermine encryption and privacy.[46]
Kids Online Safety Act
The Kids Online Safety Act, while not directly amending Section 230, would impose a duty on platforms to prevent and mitigate harm to minors, such as promoting self-harm or eating disorder content. This would create affirmative obligations for platforms regarding child safety.[47]
SAFE TECH Act
The SAFE TECH Act aims to reaffirm victims' rights by making it easier to sue platforms when their services are misused in certain ways. The bill has been introduced multiple times but not yet enacted.[48]
Current State of Platform Accountability
The landscape of platform liability is rapidly evolving. What was once considered an impenetrable legal shield is showing significant cracks.[49]
Roughly 1,500 social media cases brought by attorney Matthew Bergman's firm have been allowed to proceed past initial motions to dismiss. This represents a significant shift from just a few years ago when nearly all such cases were dismissed on Section 230 grounds.[50]
The conversation is slowly starting to shift as policymakers and state attorneys general focus on problematic social media features that are design-based, and thus content-agnostic. According to legal experts, that framing could eventually change the narrative about Section 230 protections.[51]
The Supreme Court has shown interest in addressing Section 230 but has not yet issued a definitive ruling that significantly limits or expands its scope. In Gonzalez v. Google (2023), the Court vacated the Ninth Circuit's decision without reaching the Section 230 question, suggesting the justices want to address the statute but felt the facts of that particular case were not the right vehicle.[52]
What This Means for Roblox Cases
Cases against Roblox for child exploitation will need to overcome Section 230 challenges. However, several promising legal theories exist.[53]
Product Liability Claims
Claims can focus on Roblox's design choices that allegedly make the platform particularly conducive to grooming and exploitation. These include the chat system that allows direct communication between users, the lack of effective age verification, features that give predators anonymous access to children, inadequate moderation systems, and algorithmic matching that connects adults with children.[54]
These are allegations about Roblox's own product design decisions, not merely about hosting user-generated content. Following Lemmon v. Snap, such claims may proceed outside Section 230's protection.[55]
Negligence in Safety Measures
Claims can allege Roblox was negligent in failing to implement industry-standard child safety measures. This focuses on Roblox's conduct in operating the platform rather than on specific user content.[56]
Deceptive Marketing
If Roblox has made representations about its safety features that are misleading or false, claims for deceptive business practices or fraud may not be barred by Section 230. These claims focus on Roblox's own statements, not third-party content.[57]
FOSTA Claims Where Applicable
In cases involving sex trafficking or commercial sexual exploitation, FOSTA's exception to Section 230 may apply. This requires showing Roblox knowingly benefited from participating in a trafficking venture.[58]
Working with Experienced Counsel
Successfully navigating Section 230 challenges requires sophisticated legal expertise. Attorneys experienced in platform liability cases understand how to frame claims to fall outside Section 230's protection while still addressing the harm caused to children.[59]
These cases involve complex legal questions at the intersection of technology, tort law, and evolving internet regulation. The legal landscape is changing rapidly as courts grapple with how decades-old laws apply to modern platforms.[60]
Not everyone is convinced of the legal merits of these approaches. Some legal scholars argue that product liability doctrine was designed for physical products and that digital platforms cannot realistically prevent every online harm. Establishing liability for those kinds of harms, critics contend, could create legally unmanageable risk.[61]
However, advocates for platform accountability argue that companies should be responsible for foreseeable harms caused by their design choices. When platforms design features that actively facilitate predators' access to children, they should be held accountable regardless of Section 230.[62]
The Path Forward
The debate over platform liability and Section 230 reflects larger questions about internet governance and corporate responsibility. As more evidence emerges about how platforms enable harm to children, pressure is mounting for legal and legislative solutions.[63]
Reform is more important now than ever. Every year, more people, including young children, rely on the internet for everyday activities, while online criminal activity continues to grow. We must ensure that the internet is both an open and a safe space for our society.[64]
For families whose children have been harmed through platforms like Roblox, the evolving legal landscape provides hope that accountability is possible. While Section 230 remains a significant challenge, it is no longer the impenetrable barrier it once seemed.[65]
Questions About Platform Liability?
If your child was exploited through Roblox, speak with attorneys who understand how to overcome Section 230 challenges and hold platforms accountable for design decisions that enable child predators.