A Supreme Court case could kill Facebook and other social media, allowing blockchain to replace them

The internet – arguably the greatest invention in human history – has gone awry. We can all feel it. It’s harder than ever to tell whether we’re dealing with friends or foes (or bots), we know we’re constantly being monitored in the name of better ad conversion, and we live in constant fear of clicking something and being scammed.

The failures of the internet are largely due to the unwillingness of the major technology monopolies – especially Google and Facebook – to verify and protect our identities. Why don’t they?

The answer is that they have no incentive to do so. In fact, the status quo suits them, thanks to Section 230 of the Communications Decency Act, passed by the United States Congress in 1996.

But things may be about to change. This term, the Supreme Court will hear Gonzalez v. Google, a case that has the potential to reshape or even eliminate Section 230. It’s hard to envision a scenario in which that wouldn’t kill the social media platforms we use today. That would provide a golden opportunity for blockchain technology to replace them.

How did we get here?

A key enabler of the internet’s early development, Section 230 states that online platforms are not legally responsible for content posted by their users. As a result, social media sites like Facebook and Twitter are free to publish (and profit from) anything their users post.

The plaintiff in the case now before the court believes internet platforms are responsible for the death of his daughter, who was killed by Islamic State-linked attackers in a Paris restaurant in 2015. He believes algorithms developed by YouTube and its parent company, Google, “recommended ISIS videos to users,” thus driving the terrorist organization’s recruitment and ultimately facilitating the Paris attack.

Section 230 gives YouTube a lot of cover. If a user posts defamatory or, as in the case above, violent content, the platform may show it to many other users before any action is taken. A great deal of damage can be done while the platform determines whether the content violates the law or its terms of service. But Section 230 shields the platform.

Imagine YouTube after Section 230 has been struck down. Does it have to put the 500 hours of content uploaded every minute into a review queue before anyone else is allowed to see it? That would not scale, and it would strip away much of the immediacy that makes the site’s content attractive. Or does it allow content to be published as is, but accept legal responsibility for every copyright violation, incitement to violence, or defamatory word uttered in one of the billions of videos?

When you pull the Section 230 thread, platforms like YouTube start to unravel quickly.

Global implications for the future of social media

The case is focused on an American law, but the issues it raises are global. Other countries are also struggling with how best to regulate internet platforms, especially social media. France recently ordered manufacturers to install easily accessible parental controls on all computers and devices and banned the collection of minors’ data for commercial purposes. In the UK, Instagram’s algorithm was officially found to have contributed to the suicide of a teenage girl.

Then there are the world’s authoritarian regimes, whose governments intensify censorship and manipulation efforts by harnessing armies of trolls and bots to sow disinformation and mistrust. The lack of a workable form of ID verification for the vast majority of social media accounts makes this situation not only possible, but inevitable.

And the beneficiaries of a post-Section 230 economy may not be who you expect. Far more individuals would file lawsuits against the major technology platforms. In a world where social media companies can be held legally responsible for the content posted on their platforms, armies of editors and content moderators would have to be assembled to review every image or word posted on their sites. Given the volume of content posted to social media over the past few decades, the task seems almost impossible, and the likely winners are traditional media organizations.

Looking a little further ahead, Section 230’s demise would completely destroy the business models that have fueled the growth of social media. Platforms would suddenly be responsible for an almost limitless supply of user-generated content, even as increasingly strict privacy laws strain their ability to collect vast amounts of user data. The concept of social media would have to be rebuilt from the ground up.

Many people misunderstand platforms like Twitter and Facebook. They believe the software they use to log into these platforms, post content, and view content from their network is the product. It is not. The moderation is the product. And if the Supreme Court overturns Section 230, it completely changes the products we think of as social media.

This is a huge opportunity.

In 1996, the internet consisted of a relatively small number of static websites and message boards. It was impossible to predict that its growth would one day make people question the very concepts of freedom and security.

People have fundamental rights in their digital activities as much as in their physical ones – including privacy. At the same time, the public demands some mechanism to sort facts from misinformation, and honest people from fraudsters, in the public domain. Today’s internet meets none of these needs.

Some argue, either openly or implicitly, that a healthier, safer digital future requires hard trade-offs between privacy and security. But if we are ambitious and deliberate in our efforts, we can achieve both.

Blockchains make it possible to protect and prove our identity at the same time. Zero-knowledge technology means we can verify information – such as age or professional qualification – without revealing any accompanying data. Soulbound Tokens (SBTs), Decentralized Identifiers (DIDs) and some forms of non-fungible tokens (NFTs) will soon enable a person to port a single, cryptographically provable identity across any digital platform, present or future.
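To make the “verify without revealing” idea concrete, here is a minimal Python sketch. It is illustrative only: the issuer, the DID strings and the platform_accepts helper are hypothetical, and a bare Ed25519 signature (via the widely used cryptography library) stands in for a full verifiable-credential or zero-knowledge stack. The point is the data flow: a trusted issuer signs only the minimal claim, and any platform can check that claim without ever seeing the underlying personal data.

```python
# Minimal sketch of "verify without revealing": hypothetical names throughout.
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# 1. A trusted issuer (e.g., an identity-verification service) holds a signing key.
#    Its public key would be published, for example via a DID document onchain.
issuer_key = Ed25519PrivateKey.generate()
issuer_public_key = issuer_key.public_key()

# 2. The issuer checks the holder's birthdate privately, then signs only the
#    minimal claim. No birthdate appears anywhere in the claim itself.
claim = {"subject": "did:example:alice", "over_18": True}
signature = issuer_key.sign(json.dumps(claim, sort_keys=True).encode())

# 3. Any platform can verify the claim against the issuer's public key without
#    ever learning the underlying personal data.
def platform_accepts(presented_claim: dict, presented_signature: bytes) -> bool:
    payload = json.dumps(presented_claim, sort_keys=True).encode()
    try:
        issuer_public_key.verify(presented_signature, payload)
    except InvalidSignature:
        return False
    return bool(presented_claim.get("over_18"))

print(platform_accepts(claim, signature))        # True: the claim checks out
tampered = {**claim, "subject": "did:example:bob"}
print(platform_accepts(tampered, signature))     # False: signature no longer matches
```

In a production system, the issuer’s public key would typically be resolvable from a DID document anchored on a blockchain, and a zero-knowledge proof could replace the signed claim entirely so that even less information leaks at verification time.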

This is good for all of us, whether in work, private or family life. Schools and social media will be safer places, adult content can be reliably age-restricted, and deliberate misinformation will be easier to track.

The end of Section 230 would be an earthquake. But if we take a constructive approach, it could also be a golden opportunity to improve the internet we know and love. With our identities established and cryptographically proven on the chain, we can better prove who we are, where we stand and who we can trust.

Nick Dazé is the co-founder and CEO of Heirloom, a company dedicated to providing code-free tools that help brands create safe online environments for their customers through blockchain technology. Dazé also co-founded PocketList and was an early team member at Faraday Future ($FFIE), Fullscreen (acquired by AT&T) and Bit Kitchen (acquired by Medium).

This article is for general information purposes and is not intended to be and should not be taken as legal or investment advice. The views, thoughts and opinions expressed herein are those of the author alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.
