Content Moderation in Decentralized Platforms

Katie Metz // Adriana Lacy Consulting

When a platform runs on a single, central computer system, the company that controls that system can moderate it. But with a decentralized platform (a system with no central computer), who does the moderating? Here, we'll look at content moderation on decentralized platforms and discuss why it matters.

What Is a Decentralized Platform?

Decentralized platforms run on a network of computers rather than on a single machine. These networks typically take the form of peer-to-peer networks or blockchains. Here are a few examples of both centralized and decentralized platforms:

Centralized Platforms

Centralized platforms are services like Twitter or Facebook. Although millions of people use them, each runs on one network whose owner controls the data, the moderation and censorship policies, and every other aspect of the platform.

There are pros and cons to centralized systems. They are constantly moderated, and moderators have the authority to delete harmful content quickly. They're usually intuitive and easy to use, they offer a wide range of features, and their large user bases make it easy to connect with other users.

However, the owner of the platform typically holds most of the power, user data is concentrated in one place (making it an attractive target for cyberattacks), and the platform owner reserves the right to delete your content without your permission.

Decentralized Platforms

A few examples of decentralized platforms are Mastodon and blockchain networks such as Ethereum, all of which lack a central authority. The biggest benefit is that users are in control: your data isn't sold to third parties, user information isn't pooled in a single vulnerable database (which means fewer hacks), and no one can delete someone else's content.

That's not to say decentralized systems are perfect. If no one can delete someone else's content, that also means there's little to no moderation or censoring on certain platforms and no authority to protect you from spam and harassment. Decentralized platforms also usually have fewer members, which can make it difficult to connect with others online.

What Is Content Moderation?

Content moderation is the process of ensuring that content on a platform upholds the platform's standards. In other words, content is monitored to see whether any platform rules are broken; on a centralized platform, rule-breaking content is then removed. Here are the steps to content moderation (a simple sketch in code follows the list):

Step 1: Create the rules of the platform.

Step 2: Decide what kind of content is allowed.

Step 3: Decide who is allowed to share content.

Step 4: Determine how content is submitted to allow time for the moderators to view it.

Step 5: Moderate content 24/7.
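
To make these steps concrete, here's a minimal sketch of a rule-based moderation pipeline in Python. Everything in it (the banned-phrase rules, the approved-author list, the Submission structure) is an illustrative assumption, not the mechanism of any particular platform.

```python
from dataclasses import dataclass
from queue import Queue

# Step 1: the platform's rules, expressed here as banned phrases (illustrative only).
BANNED_PHRASES = {"buy followers", "free money"}

# Step 2: the kinds of content the platform allows.
ALLOWED_KINDS = {"text", "image"}

# Step 3: who is allowed to share content.
APPROVED_AUTHORS = {"alice", "bob"}

@dataclass
class Submission:
    author: str
    kind: str
    body: str

# Step 4: submissions land in a queue so moderators have time to review them.
review_queue: Queue = Queue()

def submit(post: Submission) -> bool:
    """Accept a post into the review queue if it passes the basic checks."""
    if post.author not in APPROVED_AUTHORS or post.kind not in ALLOWED_KINDS:
        return False
    review_queue.put(post)
    return True

# Step 5: moderation runs continuously; this function performs one pass.
def moderate() -> list:
    published = []
    while not review_queue.empty():
        post = review_queue.get()
        if not any(phrase in post.body.lower() for phrase in BANNED_PHRASES):
            published.append(post)  # the post upholds the rules
    return published

submit(Submission("alice", "text", "Hello, fediverse!"))
print([p.body for p in moderate()])  # -> ['Hello, fediverse!']
```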

Content moderation protects users from content they don’t want to see, or content that could be harmful to them. Still, some find the act of moderating to be too intrusive and controlling.

How Decentralized Platforms Affect Content Moderation

Having established the importance of content moderation, the question arises: what about decentralized platforms? Are they like the Wild West, with no content moderation at all? Is it possible to moderate a system that has no central authority?

In a word: yes. Decentralized platforms can be moderated, and most are. The content rules are typically determined by the platform's community, though they are tough to enforce with no set authority. Instead, tools are implemented to make content moderation possible.

Third-Party Tools

With the help of third-party tools, decentralized platforms can implement content moderation. For a third-party application to plug in, however, the platform must first expose an API (Application Programming Interface) that connects to its admin interface.

Mastodon was one of the first to add such an API, allowing third-party tools to connect with server admins and build moderation systems that curb spam and harassment on the platform.
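
As a concrete illustration, here's a hedged sketch of the kind of call a third-party tool can make: filing a report through Mastodon's public REST API (`POST /api/v1/reports`, which requires an OAuth token with the write:reports scope). The instance URL, token, and IDs below are placeholders, and error handling is kept minimal.

```python
import requests

INSTANCE = "https://mastodon.example"   # placeholder instance URL
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"      # placeholder OAuth token (write:reports scope)

def report_status(account_id: str, status_id: str, comment: str) -> dict:
    """File a moderation report against a post via Mastodon's REST API."""
    resp = requests.post(
        f"{INSTANCE}/api/v1/reports",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        data={
            "account_id": account_id,   # the account being reported
            "status_ids[]": status_id,  # the offending post
            "comment": comment,         # context for the server's moderators
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()  # the created report, queued for the admins to act on

# Example with placeholder IDs:
# report_status("12345", "67890", "Spam links in replies")
```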

Machine Learning and AI Tools

One possibility that is not yet widely used on decentralized platforms is machine learning and artificial intelligence tooling, which is common on large, centralized platforms like Facebook, Twitter, and Instagram. Using AI, large amounts of content can be reviewed far more quickly and cost-effectively than human moderators could manage.
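
To show the general shape of this approach, here's a toy Python sketch using scikit-learn: a TF-IDF text classifier trained on a handful of made-up labeled posts. Real systems train on millions of examples and usually route borderline scores to human reviewers; the training data and labels here are purely illustrative.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny made-up training set: 1 = should be flagged, 0 = fine.
posts = [
    "buy cheap followers now", "click this link for free money",
    "lovely sunset over the harbor", "great article, thanks for sharing",
    "send me your password to win a prize", "meeting the team for coffee later",
]
labels = [1, 1, 0, 0, 1, 0]

# TF-IDF features plus logistic regression: a bare-bones content classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(posts, labels)

# Score new content; anything above a chosen threshold goes to human review.
for text in ["free money, click this link now", "photos from my weekend hike"]:
    flag_probability = model.predict_proba([text])[0][1]
    print(f"{text!r}: flag probability {flag_probability:.2f}")
```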

Relying on the Community

Another option, and the most common approach to decentralized moderation, is relying on the community to uphold community standards. On a decentralized platform, users should be told the platform's expectations right from the beginning. Although one of the significant benefits of a decentralized platform is freedom of expression with little to no censorship, the reality is that having no authority in place can lead to unwanted harassment and spamming.

This is why it's important that the community of a decentralized platform be encouraged to be inclusive and kind, and to band together when something needs to be flagged as harmful.
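
One simple way to encode that community role is a flag-and-threshold scheme, in which content is hidden only after several independent users object, so no single person acts as the authority. The threshold value and data structures below are illustrative assumptions, not any platform's actual mechanism.

```python
from collections import defaultdict

FLAG_THRESHOLD = 3  # illustrative: hide a post once 3 distinct users flag it

flags = defaultdict(set)  # post_id -> set of user_ids (sets ignore duplicate flags)
hidden = set()            # post_ids the community has collectively hidden

def flag(post_id: str, user_id: str) -> None:
    """Record one user's flag; hide the post once enough peers agree."""
    flags[post_id].add(user_id)
    if len(flags[post_id]) >= FLAG_THRESHOLD:
        hidden.add(post_id)

for user in ("alice", "bob", "carol"):
    flag("post-42", user)

print("post-42" in hidden)  # True: the community, not a central authority, hid it
```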

Content Moderation and Brands

With two types of platforms to choose from, centralized and decentralized, which is better for the growth and protection of a brand? It depends on your goals. Let's first look at brand awareness.

What Is Brand Awareness?

Brand awareness is the extent to which consumers recognize and remember a brand. It's built through marketing, and the strategy often includes placing branded content across as many platforms as possible while initiating and responding to user conversations. The goal is to build relationships and, ideally, a respected reputation.

Centralized Platforms and Brand Awareness

Centralized platforms such as Facebook and Instagram have moderators in place to handle content, flagging and deleting anything that looks harmful, which is helpful for brands that want to protect their reputation.

When a brand chooses a centralized platform, the understanding is that the moderators will, hopefully, protect the brand as much as they can. The downside is that the brand must also monitor its own actions so it doesn't get flagged for inappropriate content. If it's a risky brand, a platform with moderators may not work well.

Decentralized Platforms and Brand Awareness

But what happens on a decentralized platform that has no content moderation? Is it safe for a brand? Negative comments can harm a brand and eventually destroy it if its followers and community no longer want to be a part of it.

The reality is that a platform without moderation is risky for brands. As a brand connects with thousands, sometimes millions, of users, some harmful content is bound to appear when it's left unchecked. This is not to say that a brand can't use a decentralized platform; it should, however, choose one that implements some moderation.

Still, if a brand is courting edgy attention, extreme freedom of speech, and a lack of censorship, a decentralized platform without moderation may work well.

A centralized platform runs on one network, and that network's owner has authority over the platform. A decentralized platform runs across many computers and is peer-run, with no single authority. Both have pros and cons, but a decentralized platform allows more freedom and less censorship.

Content moderation consists of reviewing content and holding it to a platform's standards. Centralized platforms virtually always moderate content, while only some decentralized platforms do.

It's up to you to decide which platform you're comfortable with. When growing a brand, however, it's critical to understand the platforms you use and whether or not moderation is in the brand's best interest.
