Here’s How to Reform Social Media

Image by Solen Feyissa

In 1996, at the dawn of the Internet, Congress unwisely decided to give the companies behind this fledgling technology immunity from civil liability for what was posted on their sites.

 

That’s morphed into letting social media giants like TikTok and Meta off the hook for publishing child and revenge pornography, false advertising, true threats, terrorist recruitment videos, and other content that badly hurts people. If a newspaper or TV station did this, it could be successfully sued, but the social media companies are immune.

 

Consider just one example of the devastating facts that are so commonly alleged in these cases. In May 2017, a young woman living in Bristol, Connecticut, began a romantic relationship that lasted five months. During their relationship, the couple exchanged intimate photographs. When the young woman decided to break off the relationship, her ex allegedly uploaded five of her nude photographs on the internet. These photographs eventually appeared on the website Tumblr alongside the woman’s name, personal information about her, and links to her Facebook and LinkedIn accounts. She reported the photographs to Tumblr at least seven times before finally filing a suit against Tumblr in federal court. Citing several cases dismissing similar claims brought against internet service providers, the court dismissed her complaint, noting that “Section 230 immunity applies even after notice of the potentially unlawful nature of the third-party content.”

 

There is simply no excuse for a website to continue hosting unlawful content like this after receiving notice to take it down. When a victim reports unlawful content to a social media platform and the company still refuses to remove it, that company should be subject to civil liability.

 

The social media companies recognize that the status quo is unacceptable.  When confronted about TikTok’s failure to keep violative content off its platform, Adam Presser, TikTok’s head of operations, admitted, “Obviously…there’s truth to that.” Mark Zuckerberg, the CEO of Facebook, testified that “instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it.”  Clearly, the time has come to revisit immunity.

Social media companies argue that they receive so much content from users that they cannot effectively prescreen it all. They have a point. But this can be addressed by narrowly amending the current immunity: a social media company or other internet service provider should face civil liability if the victim of unlawful content notifies it of that content on its platform and it fails to remove the content within 48 hours.

 

The United States already has a similar law for copyright violations: under the Digital Millennium Copyright Act’s notice-and-takedown regime, internet service providers that receive notice of copyright infringement and fail to remove the infringing material can be held liable.

 

This modification would also bring the United States’ treatment of unlawful internet content in line with other advanced countries. The United Kingdom and New Zealand have adopted notice-and-takedown regimes that apply to online defamation, incitement, and other unlawful content. The European Union, Australia, and Japan impose liability if a social media company has “actual knowledge” of unlawful content and fails to remove it, regardless of whether the company has received a takedown notice. None of these countries affords the same blanket immunity to internet service providers as the United States. Yet social media companies thrive in all these nations despite civil accountability.

 

This proposal strikes the appropriate balance between free expression and responsible moderation. It would not require social media companies to prescreen content, and liability could attach only after (1) the unlawful content is posted, (2) the victim notifies the provider of the unlawful content, and (3) the 48-hour grace period lapses. It would simply do away with the unjust super-immunity that has prevented victims from holding social media companies accountable for their knowing failure to remove unlawful content from their platforms.

Such civil liability would very likely prompt social media companies to take down unlawful content once they are given notice, reducing the harm to victims of online abuse. That would be very good for the people of the United States.

 


 

Shanin Specter is a founding partner of Kline & Specter and a law professor. [email protected].

Alex Van Dyke is an associate at Kline & Specter. [email protected]

