How a TikTok lawsuit may change social media for good

Modern social media platforms no longer simply provide access to content; they actively curate what users see using complex algorithms. These algorithms, like TikTok’s For You Page or Facebook’s feed, are designed to learn users’ behaviour and deliver whatever content will get a reaction.


Imagine a world where social media platforms were held responsible for the content their users post, and could in fact be sued over it. In such a world, most platforms would clamp down hard on the most virulent and vile aspects of the social media experience. Yet for more than two decades, social media platforms like X (formerly Twitter), Facebook, Instagram, and TikTok have operated under legal protections, particularly those drafted in the US in the mid-1990s.

These protections, rooted in Section 230 of the US Communications Decency Act of 1996, were originally designed to shield online services such as AOL and CompuServe from liability for content posted by third parties. The idea was that these platforms were mere conduits for information, not responsible for what their users created. In the early days of the internet, platforms like AOL functioned much like digital warehouses, storing and organizing vast amounts of information for users to access.

The platforms were largely passive, providing access to content without actively influencing what users saw. This was an era when the internet was still in its infancy, and the platforms were not yet equipped with the sophisticated algorithms that define today’s social media landscape. Fast forward to the present, and the situation has changed dramatically.

Modern social media platforms no longer simply provide access to content; they actively curate what users see using complex algorithms. These algorithms, like TikTok’s For You Page or Facebook’s feed, are designed to learn users’ preferences and deliver content tailored to their interests. The goal is to keep users engaged for as long as possible, which is crucial for these platforms, as they rely on advertising revenue rather than subscription fees.
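
To make that concrete, here is a deliberately simplified sketch, in Python, of how engagement-driven ranking works in principle. The class names, the scoring formula, and the sample data are illustrative assumptions; no platform publishes its actual ranking code, and real systems use large machine-learned models rather than a one-line score.

```python
from dataclasses import dataclass

# Hypothetical sketch of engagement-driven ranking. All names and numbers
# here are assumptions for illustration, not any platform's real code.

@dataclass
class Post:
    post_id: str
    topic: str
    predicted_watch_time: float  # seconds the model expects this user to watch

def rank_feed(posts: list[Post], topic_affinity: dict[str, float]) -> list[Post]:
    """Order posts by expected engagement: the user's learned affinity for a
    topic multiplied by predicted watch time. Maximizing this score keeps
    users on the app longer, which is what ad-funded platforms optimize for."""
    def score(post: Post) -> float:
        return topic_affinity.get(post.topic, 0.0) * post.predicted_watch_time
    return sorted(posts, key=score, reverse=True)

# A user who has engaged heavily with "challenge" videos sees them first,
# whether the individual clip is benign or dangerous.
feed = rank_feed(
    [Post("a1", "cooking", 12.0), Post("a2", "challenges", 9.0)],
    topic_affinity={"challenges": 0.9, "cooking": 0.2},
)
print([p.post_id for p in feed])  # ['a2', 'a1']
```

The point of the toy example is the incentive, not the maths: nothing in the scoring step asks whether the content is safe, only whether the user is likely to keep watching.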

The shift from passive content delivery to active curation has blurred the lines of responsibility. While platforms like AOL could reasonably claim that they had no control over the content users saw, today’s social media platforms play an active role in shaping the user experience. This has raised questions about the extent to which these platforms should be held accountable for the content their algorithms promote.

One case that could set a significant precedent is the lawsuit brought by the family of Nylah Anderson, a 10-year-old girl who died after attempting a dangerous challenge she found on TikTok. The challenge, known as the “Blackout Challenge,” involved choking oneself until losing consciousness. Nylah’s family sued TikTok, arguing that the platform’s algorithm promoted the harmful videos to her.

While lower courts initially dismissed the case, the US Court of Appeals for the Third Circuit ruled in 2024 that the family could proceed with the lawsuit, reasoning that the curation performed by TikTok’s algorithm is the platform’s own expressive activity, a form of first-party speech that Section 230 does not shield. This ruling could have far-reaching implications for any social media platform that uses algorithms to curate content. If Nylah’s family ultimately wins the case, it could pave the way for more lawsuits against platforms like TikTok, X, Facebook, and Instagram, holding them accountable for the content their algorithms promote.

This would mark a significant shift from the protections these platforms have enjoyed under Section 230 of the Communications Decency Act. The outcome of this case could lead to a major overhaul of how social media platforms operate. Companies may be forced to redesign their algorithms to avoid promoting harmful content, or face the risk of costly lawsuits.
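
What might such a redesign look like? One commonly discussed approach is to screen content against known-harmful categories before the ranking step ever sees it. The sketch below is a hypothetical illustration of that idea; the category labels and the keyword check stand in for the human review and machine-learning classifiers a real moderation pipeline would use.

```python
# Hypothetical pre-ranking safety screen. The categories and the keyword
# matcher are illustrative assumptions, not a real moderation system.

HARMFUL_CATEGORIES = {"self-harm", "dangerous-challenge"}

def classify(text: str) -> str:
    """Stand-in for a real classifier (human review, ML models, hash lists)."""
    if "blackout challenge" in text.lower():
        return "dangerous-challenge"
    return "general"

def screen(posts: list[str]) -> list[str]:
    """Drop harmful posts so the engagement-ranking step never sees them."""
    return [p for p in posts if classify(p) not in HARMFUL_CATEGORIES]

print(screen(["Try the blackout challenge!", "My pasta recipe"]))
# ['My pasta recipe']
```

The trade-off is familiar from content moderation generally: filters of this kind are cheap to run but easy to evade, which is part of why the liability question matters so much.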

The stakes are high, and the decisions made in this case could shape the future of social media as we know it. As the legal landscape shifts, the responsibility of social media platforms to protect their users from dangerous content is becoming increasingly clear. The days of relying solely on legal protections from the 1990s may be coming to an end, and platforms may need to adapt quickly to avoid facing the consequences of their algorithms’ actions.
