What the End of Parler Means for Media Companies

Ready your content moderators: 2.3 million active users, many of them eager to encourage violence, racism, antisemitism, antifeminism and conspiracy theories, have lost their home base on Parler.

As a social platform that encourages free speech with practically no moderation or fact-checking, Parler has gained a massive user base of people with radical views.

At least, that was the case until Apple and Google booted Parler from their app stores in response to how it was used to organize the January 6th attack on the U.S. Capitol. Even Amazon Web Services (AWS), which hosted Parler, has abandoned the company, pushing the platform mostly offline.

“We’ve seen a steady increase in this violent content on [Parler’s] website, all of which violates our terms,” reads a letter that AWS sent to Parler’s chief policy officer. “It’s clear that Parler does not have an effective process to comply with the AWS terms of service.”

Though a bare-bones version of Parler has recently popped up on a Russian-hosted site, the platform will likely continue to be banned from app stores on mobile devices, which account for most of its users.

With Parler practically scrubbed from the internet, its extreme users will be searching for other media platforms they can use to amplify their radical perspectives. Digital media companies and online publishers will need to prepare for a possible frenzy of visitors with loud, destructive voices, who believe content moderation is a threat to free speech.

Leaving your digital properties vulnerable to these toxic commenters can scare away your loyal community members and damage positive conversations. 

Instead, here’s what you can do to prevent ex-Parler users, or any other radical and offensive voices, from wreaking havoc on your digital social spaces:

Make Sure You Have Clear, Easy-to-Access Community Guidelines

You may have an intuitive sense of what is or isn’t allowed in comment content. But spelling it out in a clear, unassailable description in your community guidelines can help prevent initial violations and give your moderators a reference point that clearly defines unacceptable content.
Examples of content to explicitly define as unacceptable include:

  • Personal attacks
  • Vulgar or obscene content
  • Libelous or defamatory statements
  • Anything that can be described as threatening, abusive, pornographic, profane, indecent or otherwise objectionable

Be sure to post your guidelines in a visible area of your website so that your digital visitors can access them with ease.

On-Site Engagement Actions

Not all registered users offer the same amount of value to media organizations. 

“Some users register to a website in order to use social tools, and others may register just to access content,” Liang explains. “Those who register to participate in a conversation — whether that be through comments, likes, replies or dislikes — contribute to a media company’s community with meaningful interactions.”

Have an Escalation Plan

In the case of an emergency — like the threat of an active shooter at your headquarters — your team must have a clear procedure in place. There are a few crucial questions you can ask your team to help them prepare for these types of threats: 

  • Is there a clear chain of command in an emergency? 
  • When do you alert the police versus the organization you’re protecting?

Distinguish between the different types of threats (non-urgent, semi-urgent, general and specific) and outline how moderators should react to each of them.

Update Your Banned Word List/Moderation Algorithm

Did you know that users within a community can develop new phrases to spread offensive and dangerous messages?

This was the case for one publisher when Viafoura’s moderators noticed that trolls were posting a recurring phrase in community social spaces: “BOB crime.” Our moderators quickly realized that this phrase was being used in offensive contexts, and after investigating, found out that it stood for “Black-on-Black crime,” a phrase used to undermine the Black Lives Matter movement.

The moderation algorithm was quickly adjusted to prevent relevant comments from being posted within that publisher’s community. However, this is just a single example of many where new phrases are created within a community to maneuver around basic moderation systems.

The bottom line is that language evolves. 

To reinforce community standards successfully, it’s essential that moderation algorithms and banned-word lists are updated quickly as new, offensive language is discovered.
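To make the idea concrete, here is a minimal sketch of a phrase-based comment filter that moderators could extend as new coded phrases (like “BOB crime”) are discovered. This is an illustrative example, not Viafoura’s actual moderation algorithm; the `PhraseFilter` class and its methods are hypothetical.

```python
import re

class PhraseFilter:
    """A minimal banned-phrase filter. Matching is case-insensitive and
    tolerates extra whitespace between words, so simple evasions like
    'bob   CRIME' are still caught."""

    def __init__(self, phrases=None):
        self.patterns = []
        for phrase in phrases or []:
            self.add_phrase(phrase)

    def add_phrase(self, phrase):
        # \s+ between words tolerates padded spacing; \b avoids
        # matching inside longer, unrelated words.
        words = map(re.escape, phrase.split())
        pattern = r"\b" + r"\s+".join(words) + r"\b"
        self.patterns.append(re.compile(pattern, re.IGNORECASE))

    def is_allowed(self, comment):
        return not any(p.search(comment) for p in self.patterns)

f = PhraseFilter()
f.add_phrase("BOB crime")  # newly discovered coded phrase
print(f.is_allowed("Let's talk about bob   CRIME again"))  # False
print(f.is_allowed("Great article, thanks!"))              # True
```

Real moderation systems layer this kind of list on top of machine learning and human review, since word lists alone are easy to evade; the point is that the list must be easy to update the moment a new phrase surfaces.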

Be Prepared to Block IP Addresses

In the digital world, the general belief is that the more eyeballs a piece of content can get, the better. The end goal for media executives is typically to gain and engage more site visitors to maximize subscriptions. However, visitor quantity isn’t always better than quality.

“Don’t be afraid to ban users,” says Leigh Adams, director of moderation solutions at Viafoura. “A lot of newspapers are afraid to ban users because they want the audience, but when you allow trolls and other toxic users to take over, you’re actually scaring away more valuable visitors.”

A smaller number of quality commenters offers more value to a brand than a crowd of commenters that destroys the safety and trust between an organization and its loyal followers.
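For readers curious what an IP block looks like in practice, here is a minimal sketch using Python’s standard `ipaddress` module. The blocklist and addresses are hypothetical (drawn from reserved documentation ranges), and a real system would persist the list and pair it with account-level bans, since IP addresses can be shared or rotated.

```python
import ipaddress

# Hypothetical blocklist: a whole range and a single address.
blocked_networks = [
    ipaddress.ip_network("203.0.113.0/24"),   # a banned range
    ipaddress.ip_network("198.51.100.7/32"),  # a single banned address
]

def is_blocked(ip: str) -> bool:
    """Return True if the visitor's IP falls in any blocked network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in blocked_networks)

print(is_blocked("203.0.113.42"))  # True: inside the banned /24
print(is_blocked("192.0.2.1"))     # False: not on the list
```

IP bans are a blunt instrument (shared office or household IPs, VPNs), which is why they work best as a last resort alongside the moderation practices above.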

Ultimately, you are in control of your online community.

Just remind users in your community guidelines that you reserve the right to remove or edit comments and permanently block any user in violation of your terms and conditions. This umbrella statement gives you complete control over the content your community produces, helping to keep discourse positive and productive.

At the moment, we are living in a time of unpredictable change and misinformation. Whether or not any of Parler’s users make their way onto your website or app, it’s important to be prepared to handle and discourage any toxic behavior. Maintaining positive and productive social spaces will help to strengthen engagement around your brand while protecting its reputation. 

Need help identifying and stopping trolls? Check out our troll-hunting guide, written by our head of moderation services.

Panel Discussion Breakdown: The New Rules of Moderation

2020 was undoubtedly a year of intersecting health, political and social justice crises. These life-changing events activated wave after wave of misinformation and trolls, which, as many media professionals found, can damage the quality of human conversation within digital communities.

Executives from Editor and Publisher, USA Today, Graham Media Group and Viafoura recently gathered to address these concerns in a panel discussion on the new rules of moderation.

“Moderation’s been a big topic this year,” says Mike Blinder, the publisher of Editor and Publisher Magazine. “People are spouting off at their dinner tables, they’re spouting off at their mobile phones and they’re obviously spouting off on [media] platforms but we need to go beyond this.”

Not only do media companies need to eliminate trolls and build trust with community members, but they also need to expand their loyal audiences and maximize revenue. 

Gear up with insights and best practices on content moderation from the panel discussion to keep your media organization’s social spaces brand-safe, productive and profitable.

Setting the Stage for Ideal, Productive Discussion

In a perfect world, digital social spaces would be filled with an endless stream of engaging comments coming from multiple voices. 

But not every comment is valuable, and not every commenting thread will thrive. At least not without the proper support.

It’s up to moderators to ensure that only positive comments are being surfaced. Meanwhile, media staff have the power to amplify engagement from community members. With an effective moderation system and community engagement strategy in place, media companies can begin building communities by facilitating on-topic, positive conversations. 

“Once you take care of moderation well, once you engage, once you create a space where people want to come and talk about whatever they want to talk about that’s where you’re doing your community-building,” explains Viafoura Director of Moderation Leigh Adams.

Community-Building Best Practices

Many media industry professionals still view commenting sections as spaces for toxic behavior and misinformation to take root. However, commenting spaces can be extremely useful community-building tools when managed properly.

“Just because you have a comment thread doesn’t mean you just have to hand it over to your audience and let them do whatever they want on it,” states Dustin Block, audience development lead at Graham Media. “You get to make decisions of what you’re going to allow people to share, particularly around your stories.”

Review some essential best practices for using social tools from the panel discussion to begin refining your community-building strategy:

  • Outsource moderation rather than having staff moderate your own platform’s comments, freeing up their time to focus on creating valuable interactions with visitors.
  • Tighten community guidelines to help audience members focus on producing brand-safe and on-topic conversations.
  • Leverage subject matter experts, including content producers, to answer questions and encourage positive discussion from audience members.
  • Anonymize names of commenters to prevent women, minority groups and people with unique names from getting harassed. 
  • Invite readers to participate in the content production process so they feel heard and valued.
  • Invest in building audiences on your owned and operated properties instead of social media, where you have little control over data, audience relationships and revenue. 
  • Correct misinformation on your digital properties whenever possible to position your brand as a trustworthy resource. 
  • Embed comment sections around content that is likely to lead to productive social exchanges. 
  • Elevate model behavior in the community by highlighting positive comments, rewarding top commenters with badges, and asking specific questions you’d like community members to answer.
  • Encourage participation in conversations by adding additional, exclusive story details in comment sections.

The Value of Moderated Comments

Comment spaces backed by an efficient moderation system can unleash multiple benefits for media companies.

“For every time someone posts a comment, you might have 50 people reading it… that’s where the value is,” Adams highlights. 

Civil discussions can entice visitors to stay on pages longer as they read the comments, which increases the likelihood that they’ll register to interact on your website or app. Longer sessions also mean more opportunities for visitors to watch or click on advertisements.

The bottom line is that media companies can build stronger relationships with their visitors through moderated commenting tools, resulting in more behavioral data and increased revenue.  

According to Michelle Malatais, the managing editor of consumer news at USA Today, “if we make it a worthwhile experience, and we can, we have to put staffing toward it and we have to put attention toward it, then there’s value.”

For more information, you can access the complete recording of the discussion panel here.

5 Best Practices Media Companies Can Learn from European Publishers

This article first appeared in Publishing Executive

With more people living in Europe than in the U.S. and Canada combined, there’s a massive knowledge pool among European media companies that many North American publishers have yet to access. 

But the digital world stretches far beyond Earth’s physical borders. In fact, publishers in North America face many of the same challenges that European media companies deal with and, in some cases, have already overcome. 

So if you’re interested in maintaining a successful media business, simply look to your fellow publishers across the world for answers and inspiration.

We’ve rounded up the top takeaways from highly successful European publishers below, because a handful of them are clearly doing something right.

Establish a Relationship with Readers

While consumers can drive up your company’s revenue, they can also come and go without hesitation. Friends, on the other hand, tend to be eternally loyal as long as they’re engaged continuously. 

German news publisher Die Zeit has developed a program based on this sentiment that grows a base of loyal friends who actively support the brand. This program allows consumers to participate in live conversations with the company’s staff, and even suggest stories they’d like to see covered.

As a result, its audience is highly invested in the published content. 

“The loyalty of our subscribers is what makes our journalism here at Die Zeit possible,” Lennart Schneider, who runs their Friends of Die Zeit program, explains in an INMA webinar.

After all, building meaningful relationships with consumers is an effective way to make media companies stand out from competitors.

Understand What Kind of Content Converts

At the 2020 INMA Media Subscriptions Summit in New York, Norwegian news publisher Aftenposten reported that its subscription revenue has climbed by 80% since restructuring its business model. One of the major changes that contributed to this growth was the company’s approach to content. 

Aftenposten’s successful business strategy prioritizes the types of content that convert users and locks that content behind a paywall. To accomplish this, the publisher has minimized the barriers between its data experts and editorial team.

“The whole purpose is to democratise our data and give it to the journalists,” says Aftenposten’s brand manager. “For us, that’s been the key to driving change and to feel like everyone is working toward the same goals.”

Structure Your Paywall around Data

Aftenposten isn’t the only publisher whose paywall and data strategies are intertwined. Many European media companies use their first-party data to inform their paywall strategies so registration messages appear when audience members are most engaged. 

Take Sweden’s MittMedia, for instance. Based on its audience data, the publisher found that the majority of its page views occur within the first 60 minutes after content has been published.

As a result, the publisher has seen success by adjusting its paywall to only lock content after those first 60 minutes.
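The timing rule behind this kind of data-driven paywall is simple enough to sketch. The snippet below is an illustrative example of the logic described above, not MittMedia’s actual system; the function name and the 60-minute constant are assumptions based on the text.

```python
from datetime import datetime, timedelta, timezone

# Content stays open during its high-traffic window, then locks.
PAYWALL_DELAY = timedelta(minutes=60)

def is_locked(published_at: datetime, now: datetime) -> bool:
    """Lock content only once its first 60 minutes have elapsed."""
    return now - published_at >= PAYWALL_DELAY

published = datetime(2021, 1, 15, 9, 0, tzinfo=timezone.utc)
print(is_locked(published, published + timedelta(minutes=30)))  # False: still open
print(is_locked(published, published + timedelta(minutes=90)))  # True: behind paywall
```

The design choice here is that the window rides the natural traffic curve: readers arriving during the initial surge see the content freely, while latecomers (who data shows are fewer) hit the registration or subscription prompt.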

Automate Time-Consuming Newsroom Tasks

Newsrooms around Europe are rapidly adopting intelligent automation. For example, The Guardian built its own tool to generate articles automatically, and Schibsted has implemented an AI-based tool to improve content recommendations and personalize user experiences.

In both cases, the publishers end up saving editorial resources.

“Our time is limited, our resources are limited,” explains the editor-in-chief of the London Evening Standard in an INMA webinar. “I suggest you look at your audience and your core values and then look at some of the tools that are out there and try them.”

Explore New Ways to Engage Audiences

One French publisher, Le Monde, is growing at a steady pace of 14,000 new online subscribers each month. Most recently, the publisher has been testing investigative podcast series to expand the breadth of its reporting and boost subscriptions.

“Podcasts are a way to connect with new audiences,” the deputy editor of Le Monde told Digiday. “For audiences who may not come by themselves to Le Monde, this can be a contribution to driving our long-term strategy of digital subscriptions.”

However, engaging new audiences doesn’t need to be limited to podcasting. The more opportunities you can give consumers to engage with your brand, the more likely they are to convert. In fact, there are a whole slew of engagement tools your media company can implement on its digital properties.

From community-building tactics to automated newsroom strategies, European publishers offer a whole range of insights that can be leveraged to improve your own company.
