What the End of Parler Means for Media Companies

Ready your content moderators: 2.3 million active users, many of whom are eager to encourage violence, racism, antisemitism, antifeminism and conspiracy theories, have lost their home base on Parler.

As a social platform that encourages free speech with practically no moderation or fact-checking, Parler has gained a massive user base of people with radical views.

At least, that was the case until Apple and Google booted Parler from their app stores in response to how it was used to organize the January 6th attack on the U.S. Capitol. Even Amazon Web Services (AWS), which hosted Parler, has abandoned the company, pushing the platform mostly offline.

“We’ve seen a steady increase in this violent content on [Parler’s] website, all of which violates our terms,” reads a letter that AWS sent to Parler’s chief policy officer. “It’s clear that Parler does not have an effective process to comply with the AWS terms of service.”

Though a bare-bones version of Parler has recently popped up on a Russian-hosted site, the platform will likely continue to be banned from mobile app stores, which account for most of its users.

With Parler practically scrubbed from the internet, its extreme users will be searching for other media platforms they can use to amplify their radical perspectives. Digital media companies and online publishers will need to prepare for a possible frenzy of visitors with loud, destructive voices, who believe content moderation is a threat to free speech.

Leaving your digital properties vulnerable to these toxic commenters can scare away your loyal community members and damage positive conversations. 

Instead, here’s what you can do to prevent ex-Parler users, or any other radical and offensive voices, from wreaking havoc on your digital social spaces:

Make Sure You Have Clear, Easy-to-Access Community Guidelines

Your team may have an informal sense of what is or isn’t allowed in comments. But spelling it out in clear, unambiguous community guidelines can help prevent violations in the first place and give your moderators a reference point that clearly defines unacceptable content.
Examples of content to explicitly define as unacceptable include:

  • Personal attacks
  • Vulgar or obscene content
  • Libelous or defamatory statements
  • Anything that can be described as threatening, abusive, pornographic, profane, indecent or otherwise objectionable
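
These categories are also a natural starting point for your moderation tooling, so that humans and software enforce one definition. Below is a minimal sketch in Python of guidelines expressed as machine-readable rules; the category names and patterns are illustrative placeholders, not an actual production rule set:

```python
# Illustrative sketch: community guideline categories expressed as data,
# so moderators and software share one reference point. The patterns
# below are placeholders, not a production rule set.
import re

GUIDELINE_VIOLATIONS = {
    "personal_attack": [r"\byou people\b", r"\bidiots?\b"],
    "vulgar_or_obscene": [r"\bdamn\b"],
    "threatening_or_abusive": [r"\bi will find you\b"],
}

def flagged_categories(comment: str) -> list[str]:
    """Return the guideline categories a comment appears to violate."""
    text = comment.lower()
    return [
        category
        for category, patterns in GUIDELINE_VIOLATIONS.items()
        if any(re.search(pattern, text) for pattern in patterns)
    ]

print(flagged_categories("You people are idiots"))  # ['personal_attack']
```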

Be sure to post your guidelines in a visible area of your website so that your digital visitors can access them with ease.

On-Site Engagement Actions

Not all registered users offer the same amount of value to media organizations. 

“Some users register to a website in order to use social tools, and others may register just to access content,” Liang explains. “Those who register to participate in a conversation — whether that be through comments, likes, replies or dislikes — contribute to a media company’s community with meaningful interactions.”

Tracking these engagement actions helps you see which registered users are contributing value to your community.
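
As a rough illustration of how those actions might be tallied to surface your most active contributors, here is a short Python sketch; the action names and threshold are assumptions for the example, not a Viafoura API:

```python
# Illustrative sketch: tallying on-site engagement actions per user to
# separate active contributors from passive registrants. The action
# names and threshold are assumptions for the example.
from collections import Counter

ENGAGEMENT_ACTIONS = {"comment", "like", "reply", "dislike"}

def active_contributors(events: list[tuple[str, str]],
                        min_actions: int = 5) -> set[str]:
    """events: (user_id, action) pairs from your analytics feed."""
    counts = Counter(
        user for user, action in events if action in ENGAGEMENT_ACTIONS
    )
    return {user for user, total in counts.items() if total >= min_actions}
```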

Have an Escalation Plan

In the case of an emergency — like the threat of an active shooter at your headquarters — your team must have a clear procedure in place. There are a few crucial questions you can ask your team to help them prepare for these types of threats: 

  • Is there a clear chain of command in an emergency? 
  • When do you alert the police versus the organization you’re protecting?

Distinguish between the different types of threats, from non-urgent and semi-urgent to urgent, and from general to specific, and outline how moderators should react to each.
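
One way to make that concrete is to encode the plan as an escalation matrix that maps each threat category to a prescribed response. The sketch below is illustrative only; the categories and responses should come from your own plan:

```python
# Illustrative sketch: an escalation matrix mapping threat categories to
# a prescribed response, so every moderator reacts the same way. The
# categories and responses are placeholders; use your own plan's values.
ESCALATION_PLAN = {
    ("non-urgent", "general"): "Remove comment; log for weekly review.",
    ("semi-urgent", "general"): "Remove comment; notify the shift lead.",
    ("semi-urgent", "specific"): "Ban user; escalate to head of moderation.",
    ("urgent", "specific"): "Alert police and the targeted organization immediately.",
}

def respond_to_threat(urgency: str, specificity: str) -> str:
    # Fall back to manual triage for combinations the plan doesn't cover.
    return ESCALATION_PLAN.get(
        (urgency, specificity),
        "Escalate to the shift lead for manual triage.",
    )

print(respond_to_threat("urgent", "specific"))
```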

Update Your Banned Word List/Moderation Algorithm

Did you know that users within a community can develop new phrases to spread offensive and dangerous messages?

This was the case for one publisher when Viafoura’s moderators noticed trolls posting a recurring phrase in community social spaces: “BOB crime.” Our moderators quickly realized the phrase was being used in offensive contexts and, after investigating, found that it stood for “Black-on-Black crime,” a talking point used to undermine the Black Lives Matter movement.

The moderation algorithm was quickly adjusted to prevent relevant comments from being posted within that publisher’s community. However, this is just a single example of many where new phrases are created within a community to maneuver around basic moderation systems.

The bottom line is that language evolves. 

To enforce community standards successfully, it’s essential that moderation algorithms and banned word lists are updated quickly as new offensive language is discovered.
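
As a minimal sketch of the idea, here is how a banned-phrase list might be updated on the fly as moderators confirm new coded language; a real moderation engine such as Viafoura’s uses far more sophisticated matching than this simple substring check:

```python
# Illustrative sketch: a banned-phrase list that moderators can update on
# the fly as new coded language (like "BOB crime") is confirmed. A real
# moderation engine uses far smarter matching than this substring check.
BANNED_PHRASES = {"bob crime"}

def add_banned_phrase(phrase: str) -> None:
    """Call as soon as moderators confirm a new offensive phrase."""
    BANNED_PHRASES.add(phrase.lower().strip())

def is_publishable(comment: str) -> bool:
    text = " ".join(comment.lower().split())  # normalize case and spacing
    return not any(phrase in text for phrase in BANNED_PHRASES)

print(is_publishable("Let's talk about  BOB   crime"))  # False
```

Normalization matters here: trolls vary spacing, casing and punctuation to slip past exact matches, which is why the sketch collapses whitespace and lowercases before checking.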

Be Prepared to Block IP Addresses

In the digital world, the general belief is that the more eyeballs a piece of content gets, the better. The end goal for media executives is typically to gain and engage more site visitors to maximize subscriptions. When it comes to visitors, however, quantity isn’t always better than quality.

“Don’t be afraid to ban users,” says Leigh Adams, director of moderation solutions at Viafoura. “A lot of newspapers are afraid to ban users because they want the audience, but when you allow trolls and other toxic users to take over, you’re actually scaring away more valuable visitors.”

A few quality commenters offer a brand more value than a crowd of commenters who destroy the safety and trust between an organization and its loyal followers.
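
If you do need to block repeat offenders at the network level, the standard library makes a basic IP blocklist straightforward. This is a hedged sketch rather than a complete defense; determined trolls can rotate IP addresses, so pair it with account-level bans:

```python
# Illustrative sketch: checking inbound requests against an IP blocklist,
# supporting single addresses and CIDR ranges via the standard library.
# The addresses below are documentation ranges, not real offenders.
from ipaddress import ip_address, ip_network

BLOCKED_NETWORKS = [
    ip_network("203.0.113.0/24"),   # a range of repeat offenders
    ip_network("198.51.100.7/32"),  # a single banned address
]

def is_blocked(client_ip: str) -> bool:
    addr = ip_address(client_ip)
    return any(addr in network for network in BLOCKED_NETWORKS)

print(is_blocked("203.0.113.42"))  # True
```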

Ultimately, you are in control of your online community.

Just remind users in your community guidelines that you reserve the right to remove or edit comments and permanently block any user who violates your terms and conditions. This umbrella statement gives you final say over the content your community produces, helping keep discourse positive and productive.

We are living in a time of unpredictable change and misinformation. Whether or not any of Parler’s users make their way onto your website or app, it’s important to be prepared to handle and discourage toxic behavior. Maintaining positive and productive social spaces will strengthen engagement around your brand while protecting its reputation.

Need help identifying and stopping trolls? Check out our troll-hunting guide, written by our head of moderation services.

Panel Discussion Breakdown: The New Rules of Moderation

2020 was undoubtedly a year of intersecting health, political and social justice crises. These life-changing events activated wave after wave of misinformation and trolls, which, as many media professionals found, can damage the quality of human conversation within digital communities.

Recently, executives from Editor and Publisher, USA Today, Graham Media Group and Viafoura gathered to address these concerns in a panel discussion on the new rules of moderation.

“Moderation’s been a big topic this year,” says Mike Blinder, the publisher of Editor and Publisher Magazine. “People are spouting off at their dinner tables, they’re spouting off at their mobile phones and they’re obviously spouting off on [media] platforms but we need to go beyond this.”

Not only do media companies need to eliminate trolls and build trust with community members, but they also need to expand their loyal audiences and maximize revenue. 

Gear up with insights and best practices on content moderation from the panel discussion to keep your media organization’s social spaces brand-safe, productive and profitable.

Setting the Stage for Ideal, Productive Discussion

In a perfect world, digital social spaces would be filled with an endless stream of engaging comments coming from multiple voices. 

But not every comment is valuable, and not every commenting thread will thrive. At least not without the proper support.

It’s up to moderators to ensure that only positive comments are being surfaced. Meanwhile, media staff have the power to amplify engagement from community members. With an effective moderation system and community engagement strategy in place, media companies can begin building communities by facilitating on-topic, positive conversations. 

“Once you take care of moderation well, once you engage, once you create a space where people want to come and talk about whatever they want to talk about, that’s where you’re doing your community-building,” explains Viafoura Director of Moderation Leigh Adams.

Community-Building Best Practices

Many media industry professionals still view commenting sections as spaces for toxic behavior and misinformation to take root. However, commenting spaces can be extremely useful community-building tools when managed properly.

“Just because you have a comment thread doesn’t mean you just have to hand it over to your audience and let them do whatever they want on it,” states Dustin Block, audience development lead at Graham Media. “You get to make decisions of what you’re going to allow people to share, particularly around your stories.”

Review some essential best practices for using social tools from the panel discussion to begin refining your community-building strategy:

  • Don’t have your own staff moderate your platform’s comments; delegating moderation frees them up to focus on creating valuable interactions with visitors.
  • Tighten community guidelines to help audience members focus on producing brand-safe and on-topic conversations.
  • Leverage subject matter experts, including content producers, to answer questions and encourage positive discussion from audience members.
  • Anonymize names of commenters to prevent women, minority groups and people with unique names from getting harassed. 
  • Invite readers to participate in the content production process so they feel heard and valued.
  • Invest in building audiences on your owned and operated properties instead of social media, where you have little control over data, audience relationships and revenue. 
  • Correct misinformation on your digital properties whenever possible to position your brand as a trustworthy resource. 
  • Embed comment sections around content that is likely to lead to productive social exchanges. 
  • Elevate model behavior in the community by highlighting positive comments, rewarding top commenters with badges (see the sketch after this list), and asking specific questions you’d like community members to answer.
  • Encourage participation in conversations by adding additional, exclusive story details in comment sections.
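
For the badge idea flagged above, here is a minimal sketch of how tiers might be assigned; the badge names and thresholds are invented for the example:

```python
# Illustrative sketch: assigning badge tiers by approved comment count.
# Badge names and thresholds are invented for the example.
BADGE_THRESHOLDS = [
    (100, "Community Leader"),
    (25, "Regular"),
    (5, "Contributor"),
]

def badge_for(approved_comment_count: int) -> str | None:
    """Return the highest badge a user has earned, if any."""
    for threshold, badge in BADGE_THRESHOLDS:
        if approved_comment_count >= threshold:
            return badge
    return None

print(badge_for(30))  # Regular
```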

The Value of Moderated Comments

Comment spaces backed by an efficient moderation system can unleash multiple benefits for media companies.

“For every time someone posts a comment, you might have 50 people reading it… that’s where the value is,” Adams highlights. 

Civil discussions can entice visitors to stay on pages longer as they read the comments, which increases the likelihood that they’ll register to interact on your website or app, and gives advertisements more time to be seen and clicked.

The bottom line is that media companies can build stronger relationships with their visitors through moderated commenting tools, resulting in more behavioral data and increased revenue.  

According to Michelle Malatais, the managing editor of consumer news at USA Today, “if we make it a worthwhile experience, and we can, we have to put staffing toward it and we have to put attention toward it, then there’s value.”

For more information, you can access the complete recording of the panel discussion here.
