Does your moderation team have the tools to spot the troll?

As any moderator knows, trolling is more than just a nuisance.

According to Leigh Adams, director of moderation services at Viafoura, ‘trolls become the vocal minority, and can quickly overtake and drown out more relevant conversation.’ When toxic behavior goes unchecked, it can lead forums in a direction that actively harms your brand. Are your moderators equipped to effectively identify and eliminate bad behavior before it causes long-term damage? Here are five important tools they’ll need.

Community Guidelines

The first step to building a safe online community is to create strong community guidelines that are clear, easy to find, and comprehensive. These serve as the first line of defense against would-be trolls.

What should guidelines contain? Experts recommend a zero-tolerance policy for personal attacks; obscene, libelous, or defamatory content; and anything abusive or profane. Beyond that, guidelines should be customized to fit your platform.

When a user behaves badly, community guidelines can help your moderators make an informed decision to flag or ban them. They also provide justification, should a banned user file a complaint. This helps to protect your moderators, which in turn helps them do their jobs with confidence.

Know the Signs

Trolls can be sorted into one of two categories: single-account and multi-account. Both types can be identified quickly and easily if moderators know what to look for. Ensure your moderation team knows how to access user data and search for warning signs.

Single-account trolls are users who cross the line, whether they know it or not. They can be identified by volume: they typically accumulate a high number of flags, disabled comments, or posts. These users can often be redeemed, and many amend their behavior after a warning or a single ban.

Multi-account trolls return repeatedly under different aliases. They're typically seeking attention and often leave hints about their identity in order to reignite toxic conversations. Moderators should examine new accounts for the telltale signs of a returning troll: a high disable rate, or a name, avatar image, or IP address similar to a previously banned account's. Some trolls are so eager for attention that they may even post 'I'm back' or 'I was banned'.
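The warning signs above can be combined into a simple screening heuristic. This is a minimal sketch, not Viafoura's actual detection logic; the account fields and the scoring weights are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    avatar_hash: str     # hash of the avatar image, for cheap comparison
    ip: str
    disable_rate: float  # fraction of this user's comments that were disabled

def returning_troll_score(new: Account, banned: Account) -> int:
    """Count warning signs a new account shares with a previously banned one."""
    score = 0
    if new.ip == banned.ip:
        score += 1
    if new.avatar_hash == banned.avatar_hash:
        score += 1
    # Crude name similarity: shared 4-character prefix ("troll99" vs "troll_2")
    if new.name[:4].lower() == banned.name[:4].lower():
        score += 1
    if new.disable_rate > 0.5:  # more than half of their comments disabled
        score += 1
    return score

banned = Account("troll99", "abc123", "10.0.0.7", 0.9)
suspect = Account("troll_returns", "abc123", "10.0.0.7", 0.8)
print(returning_troll_score(suspect, banned))  # 4 -> flag for moderator review
```

A real system would use fuzzier name matching and more signals, but the idea is the same: several weak matches together justify a human look.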


Banning

When all warnings have failed to stop bad behavior, moderators should be empowered to ban problematic users. Bans help uphold your community guidelines and make a better space for the well-behaved users that wish to enjoy the platform.

There are several kinds of bans: incremental, ghost, and permanent. Moderators can increase their level of control by understanding the best use scenario for each type of ban.

An incremental ban is used as a stop-gap or a warning, giving first-time offenders room for improvement. A ghost ban allows the troll to continue posting but removes their comments from the general feed. Permanent bans are just that: permanent. Bans are a powerful tool; empower your moderators to use them. Applied judiciously, the confidence to ban users saves both time and mental energy.
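One way to see the difference between these ban types is in how they answer two separate questions: can the user still post, and does anyone else see it? A minimal sketch (the enum and helper names are illustrative, not a real moderation API):

```python
from enum import Enum, auto

class Ban(Enum):
    NONE = auto()
    INCREMENTAL = auto()  # temporary; posting blocked until it expires
    GHOST = auto()        # user can still post, but only they see it
    PERMANENT = auto()

def can_post(ban: Ban) -> bool:
    """Incremental and permanent bans block posting; ghost bans do not."""
    return ban in (Ban.NONE, Ban.GHOST)

def visible_to_community(author_ban: Ban) -> bool:
    """Ghost-banned users' comments never reach the general feed."""
    return author_ban == Ban.NONE

print(can_post(Ban.GHOST), visible_to_community(Ban.GHOST))  # True False
```

The point of the ghost ban drops out of the code: the troll keeps posting into the void, so they get no audience reaction to feed on.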

AI Tools

One of the best tools you can give your human moderation team is an AI system that identifies the majority of obvious toxicity, reducing their moderation scope and allowing them to focus on more nuanced situations.

There are a number of intelligent moderation options available, but not all systems are made equal. Many services use a ‘banned word’ list that won’t catch the majority of contextual issues or ‘masking’ violations. Instead, choose a service with natural language processing or machine learning. These systems allow the AI to adapt as moderators approve or block comments, customizing the algorithm to your platform.
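The adaptive loop described here can be illustrated with a toy word-score model. This is a deliberately simplified sketch of the feedback mechanism, not any vendor's algorithm: each moderator decision nudges per-word toxicity estimates, so the filter gradually reflects your platform's own standards.

```python
from collections import defaultdict

class AdaptiveFilter:
    """Toy per-word toxicity scores updated from moderator decisions."""

    def __init__(self):
        self.blocked = defaultdict(int)   # word -> times seen in blocked comments
        self.approved = defaultdict(int)  # word -> times seen in approved comments

    def learn(self, comment: str, was_blocked: bool) -> None:
        """Feed one moderator decision back into the model."""
        counts = self.blocked if was_blocked else self.approved
        for word in comment.lower().split():
            counts[word] += 1

    def toxicity(self, comment: str) -> float:
        """Average per-word blocked ratio, with add-one smoothing for unseen words."""
        words = comment.lower().split()
        if not words:
            return 0.0
        total = sum((self.blocked[w] + 1) / (self.blocked[w] + self.approved[w] + 2)
                    for w in words)
        return total / len(words)

f = AdaptiveFilter()
f.learn("you are an idiot", was_blocked=True)
f.learn("great article thanks", was_blocked=False)
print(f.toxicity("what an idiot") > f.toxicity("thanks a lot"))  # True
```

Production systems use full natural language processing rather than bag-of-words counts, but the approve/block feedback loop that customizes the model to your community works on the same principle.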

According to the Viafoura report 'Everyone Is A Troll', communities with advanced moderation software see measurable growth: 62% more user likes, 35% more comments per user, and 34% more replies per user.

Support for your moderation team

Your moderation team does the essential but difficult job of guarding your community against incoming hostility. Creating an efficient, positive, and healthy work environment will help them avoid burnout and maintain positivity.

The first and most important aspect of a healthy workplace is open communication. Set up a channel (like Slack or Google Meet) and encourage your moderators to reach out for help. This will help your team remain neutral, identify each other's unconscious biases, and ensure knowledge is shared.

Support your moderation team further by keeping the workload transparent and providing access to frequent breaks. Rest can be productive, and when you’re dealing with constant negativity, essential.

At Viafoura, we believe that well-prepared moderators make for strong healthy communities. How many of these tools do your moderators employ? Equip your team with a complete range of moderation strategies and they can build a community that feels safe, supports your brand, and grows along with your platform.

Warning Signs That Your Audience Isn’t Reaching Its Full Revenue-Generating Potential

With the proper engagement and retention strategies in place, your audience has the power to sustain your media company financially. 

“Consumer revenue streams, including digital subscriptions and ticketed live events, are increasingly important to news organizations as reliance on traditional advertising revenues continues to decrease,” explains Angelica Irizarry, the Inquirer’s director of events.

There’s a clear connection between active audiences and elevated revenue. However, not all media organizations are tapping into the full value of their digital communities.

Keep your eyes open for the following warning signs, which indicate that your media company may be losing out on potential revenue from its community.

High Churn Rates

Your existing subscribers are often more valuable than new subscribers. 

“It can cost five times more to attract a new customer than it does to retain an existing one,” reads an article on Forbes. 

In other words, having high churn in your digital community can translate to significant revenue loss for media companies. 

So if your company has an unusually high churn rate, you may want to consider reinforcing your retention strategy. 

Work with your engagement tool and paywall providers to identify unengaged community members that may be about to churn, and then send them unique offers and discounts to re-engage them. That way, you’ll maximize the revenue your existing customers are funneling into your organization.
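A churn-risk check like the one described can start very simply. The sketch below is a hypothetical heuristic, not a real paywall provider's API; the field names and the two-week/two-comment thresholds are assumptions to tune against your own data:

```python
from datetime import datetime, timedelta

def at_risk_of_churn(last_active: datetime, comments_last_30d: int,
                     now: datetime, inactive_days: int = 14) -> bool:
    """Flag subscribers who have gone quiet and barely engage."""
    inactive = (now - last_active) > timedelta(days=inactive_days)
    disengaged = comments_last_30d < 2
    return inactive and disengaged

now = datetime(2024, 1, 31)
print(at_risk_of_churn(datetime(2024, 1, 10), 0, now))   # True -> send a re-engagement offer
print(at_risk_of_churn(datetime(2024, 1, 29), 12, now))  # False -> healthy subscriber
```

In practice you would combine engagement-tool events (comments, likes, follows) with paywall data (logins, article views) before deciding who gets a discount.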

High Amounts of Toxicity in the Comments

If you notice a climbing number of user-generated posts getting disabled by your moderation system, odds are, your digital social spaces are infested with trolls or spam.

Allowing offensive comments to overtake the meaningful, productive conversations within your digital properties can scare away advertisers and community members. And that means less revenue for your company in the long run.

“You can protect your social spaces by keeping your community guidelines up to date and ensuring your moderation system can properly enforce them,” says Leigh Adams, director of moderation services at Viafoura. “To discourage trolling behavior, moderators and editors can also highlight and reward positive behavior whenever possible.”

Your Most Engaged Audience Is Coming From Social Media

We’re well aware that social media holds millions of active users and advertising opportunities for brands. But this comes at the cost of precious first-party data, direct relationships with audience members, trust and, ultimately, complete revenue ownership. 

Having social media at the center of your community engagement strategy jeopardizes your company’s control over audience members and over any of the revenue earned from them.

Instead of giving a portion of your profits to big tech companies, there’s a better alternative for media companies: to invest in their own properties. 

That’s what led one media company, Zetland, to build a brand-new podcasting app rather than relying on a third-party podcast streamer.

“We needed it to be an experience that we control so we built apps to do this and the experience of doing the discovery, the experience of convenience, the experience of actually being in the Zetland universe when you listen to it, I think it’s quite important,” states Zetland CEO Tav Klitgaard. 

Companies can even replicate the social experiences offered by social media directly on their own properties with the help of audience development companies like Viafoura, to keep visitors on their sites and interacting for longer.

Engagement Spaces Aren’t Being Leveraged for Ads

Integrating audience engagement tools into your website or app can help your organization activate interest and conversation around your brand. However, some media companies don’t realize that there’s a significant amount of advertising revenue that can be earned from audiences through these tools.

By failing to run advertisements in these social spaces, you’re missing out on the opportunity to maximize engagement around your ads. Organizations can instead become more successful by advertising to highly engaged community members. 

“Today advertising value is measured by engagement so if publishers want to improve their ad offering, they must devise a successful engagement strategy first,” says Chris Waiting, CEO of The Conversation U.K. 

As you work to build, retain and monetize community members, keep your eyes open for warning signs that your company may be losing out on potential profit. Adjusting your business strategies accordingly will help you increase the revenue your organization can earn from its audience. 

Why People Hate Live Commenting, but Will Learn to Love It

A common misconception about live online commenting tools has taken root within the internet. Want to take a guess as to what that might be? Let us give you a bit of a hint:


Thanks to spammers, bots, trolls and toxic behavior, people now believe that commenting tools are destructive to a brand’s reputation. And with 63% of Americans convinced that incivility online results from social media, it’s no surprise that people think of all online social experiences in a negative way.

Here’s just a small taste of why people hate live commenting tools online:

In a Facebook post, one commenter responds to a video titled ‘What if Online Trolls Acted Like Trolls in Real Life:’ “This is why I think most websites should turn off their commenting sections. People say so much online that they never would face to face. And most of the political arguments would evaporate too, which would be great.”

People assume that comments — especially those posted in real time — threaten the health of a community as harassment, profanities and spam can be brought alive instantly, with the push of a button.

But while these concerns are all valid, they encourage the loss of profitable, on-site audience engagement. In reality, there’s actually a way to host real-time commenting tools on your owned and operated properties without damaging your brand.

Uses and gratifications theory holds that people actively seek out media to fulfill their needs for information, human connection, and socialization. Commenting tools can help satisfy this innate human need to socialize and connect with others, sparking healthy social interaction online.

Breaking the Chain of Misinformation Around Commenting

Media organizations that have killed or rejected commenting sections have experienced and will continue to experience a massive loss of opportunity.

Yes, there are trolls running wild online, just itching to frustrate other people. So rather than letting them — along with countless other digital trouble-makers — ruin your engagement tools, all you need to do is put one simple measure in place to tame them: comment moderation.

Comment moderation can completely flip your commenting platform from destructive to profitable in a matter of moments. In fact, communities with sophisticated moderation in place see significant on-site engagement growth, including 62% more user likes, 35% more comments per user and 34% more replies per user.

By creating a protected, social environment in which users can engage, you can begin building a loyal community that drives revenue.

Civil, live commenting platforms help to form an environment where visitors feel safe enough to participate in conversation. As they create meaningful discussions with others around content, their propensity to subscribe increases.

Comments also provide organizations with valuable audience engagement metrics.


Just as the previous post stated, these metrics can help organizations identify community behavior and content preferences, which can be used to improve editorial and subscription strategies. Take it a step further by making sure you’re getting first-party audience data from your commenting tools so you can gather actionable insights to help grow your community.

Setting Rules in Your Community

If you’re going to bring a commenting tool into your platform, you need to decide how strict your comment moderation should be.

A recent post on The Verge outlines the value of moderation in online communities. In the article, Twitch’s CEO, Emmett Shear, addresses the difference between allowing free speech and building a civil community online:

“I hope people can express themselves. I hope they can share their ideas, share their thoughts. But we’re not a platform for free speech. We are not upholding the First Amendment. That’s the government’s job. We’re a community. And communities have standards for how you have to behave inside that community. And so we think that it’s not anything goes.”

Free speech is important to society as a whole, but online, speech that disrupts a community’s overall health is toxic to a brand’s success. That’s why it’s so important to set community guidelines and enforce them across your engagement tools.

“[A community] with good, strong moderation, in many ways, is actually the place with freer speech,” says Shear. “Because it was actually the place where people could express themselves and not just get destroyed by trolls and abuse and harassment.”

Preventing toxicity on your platform creates an ideal environment for users to interact with one another.

So here’s the bottom line: moderated commenting tools are absolute necessities to engage visitors, build communities and grow revenue.

Can’t afford to spare any resources to do the moderation in-house? You may want to look into a tool that offers automatic moderation services. You can review the different types of moderation here.

5 Ways to Decrease Trolling and Improve the Quality of Your Comments


Last updated October 28th, 2019

Highlights:

  1. Reward users to encourage desired conversations
  2. Offer moderation tools to your users
  3. Use artificial intelligence in conjunction with your human efforts
  4. Quiz your users to weed out those who haven’t read the full story
  5. Stop anonymous comments

With the prevalence of online trolls, some organizations have put up their hands and given up on the comment section altogether. But doing so, even temporarily, has major drawbacks for organizations and their users.

As Carrie Lysenko, Head of Digital for The Weather Network, pointed out in an RTNDA Canada panel on engagement, turning off comments can result in a significant drop in pageviews and attention time. This echoes Viafoura’s own findings that brands with commenting can increase pageviews by 248% and attention time by 364%. This increased engagement leads to higher registrations and subscriptions, since engaged users are more likely to pay for premium services.

And while managing online communities has traditionally been cumbersome and expensive, today there are many cost-effective ways to reduce or eliminate trolling. For media companies, these new tools allow you to not only keep your comment section open, but also to capitalize on your user-generated content.

Reward Users to Promote Civil Comments

Trusted-user badge

Encourage users to submit thoughtful comments by rewarding your best commenters with a trusted-user badge. With this status, an icon will appear beside the user’s name for others to see. These trusted users are also able to publish their comments in real time without being moderated.

Editor’s pick

Another way to reward users is by giving their comment the editor’s pick status. These comments can be featured in prominent positions on your website to model the types of comments you want to receive.

This is also beneficial for SEO, because comments that are placed higher on your webpage will get indexed by Google, and the keywords in those comments may be a closer match to users’ own search terms than those used by a journalist.

Create articles from users’ comments

Many organizations today including The New York Times, The Washington Post, and the Canadian Broadcasting Corporation (CBC) are creating stories entirely from their users’ comments. These stories not only reward commenters for their insightful posts, but are cost-effective, quick to publish and receive a surprisingly high amount of attention time and comments. Some even attract more comments than the original piece from which they were taken.

To see the impact of these articles, we tracked the number of comments for eight user-generated blog posts in CBC’s Revenge of the Comment Section, comparing those to the number of comments for their original articles.

The results are depicted in the chart below:

It’s worth noting that while almost all of the original stories received more comments, the user-generated articles often weren’t far behind. And in one instance, for Story 2, there were more comments for the user-generated article (601,000) than for its original article (343,000). Readers also spent approximately 2.3x more time on the former page.

That’s pretty fascinating since these articles can be created at a fraction of the time and cost it takes a journalist to create a new article from scratch.

Offer Content Moderation Tools to Your Users and Managers

Flagging

Allow users to easily flag comments that they find offensive, using a noticeable red flag icon. When a comment receives a predetermined number of flags, it enters a queue for review by a moderator, who decides on the appropriate action.
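This flag-threshold flow fits in a few lines. The sketch below is illustrative (the threshold of three is an assumption to tune for your community, and the identifiers are hypothetical):

```python
from collections import defaultdict

FLAG_THRESHOLD = 3  # assumed value; tune per community

flag_counts: dict[str, int] = defaultdict(int)
review_queue: list[str] = []

def flag_comment(comment_id: str) -> None:
    """Each flag increments a counter; at the threshold, queue for a moderator."""
    flag_counts[comment_id] += 1
    if flag_counts[comment_id] == FLAG_THRESHOLD:
        review_queue.append(comment_id)  # enqueue exactly once

for _ in range(3):
    flag_comment("c42")
print(review_queue)  # ['c42']
```

Comparing with `==` rather than `>=` ensures a heavily flagged comment is queued only once, no matter how many further flags arrive while it waits for review.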

Timed user banning

Give short “timeouts” of as little as a few hours, or as long as several days or months, and notify users why they are being banned to help them improve the quality of their comments. Alternatively, users can be permanently banned for repeated offenses.
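An escalating timeout ladder like this is straightforward to encode. The durations below are illustrative assumptions, not a recommended policy:

```python
from datetime import datetime, timedelta
from typing import Optional

# Assumed escalation ladder: hours -> days -> a month, then permanent
ESCALATION = [timedelta(hours=6), timedelta(days=3), timedelta(days=30)]

def ban_expiry(offense_count: int, now: datetime) -> Optional[datetime]:
    """Return when this offense's ban expires, or None for a permanent ban."""
    if offense_count > len(ESCALATION):
        return None  # repeated offenses exhaust the ladder
    return now + ESCALATION[offense_count - 1]

now = datetime(2024, 1, 1)
print(ban_expiry(1, now))  # 2024-01-01 06:00:00
print(ban_expiry(4, now))  # None -> permanent ban
```

Storing an expiry timestamp (rather than a "banned" flag) also makes un-banning automatic: the posting check simply compares the expiry to the current time.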

Dislike button

The dislike button allows users to express their dislike for a comment without having to flag it (which requires a moderator’s time and resources). We found that this button can reduce flagging by 50% within as little as two weeks of implementation.

Gamification

Both The New York Times and The Guardian have created games that allow readers to try moderating content. Users are tasked with approving or rejecting comments and providing reasoning for their decisions. This is not only enjoyable for users, but eases some of the burden on moderators.

Use AI Moderation to Eliminate Online Harassment

Whether your organization employs dedicated moderators or tasks other employees with removing the “trash,” you could be saving countless hours and dollars with automated moderation.

Automated moderation uses natural-language processing and artificial intelligence to automatically categorize and eliminate trolling, spam and online harassment.

Viafoura’s Automated Moderation is programmed with over six million variations of problematic words or phrases. This means that it’s able to determine both the subject matter and the sentiment behind users’ comments, detecting and eliminating spam, foul language, abuse, personal attacks and other uncivil comments before other users can even see them.

If the system encounters a new word or sentence that it’s unsure of, it flags the instance for a moderator to review. As a moderator approves or rejects new words, through the power of machine learning, the algorithm will learn the new rules and get smarter over time.
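The routing logic in that human-in-the-loop step can be sketched as a confidence band. This is a simplified illustration, not Viafoura's implementation; the 0.3/0.7 thresholds and function names are assumptions:

```python
def route(toxicity_score: float, low: float = 0.3, high: float = 0.7) -> str:
    """Confident scores act automatically; uncertain ones go to a human."""
    if toxicity_score >= high:
        return "block"
    if toxicity_score <= low:
        return "approve"
    return "human_review"  # the moderator's decision becomes training data

print(route(0.95))  # block
print(route(0.05))  # approve
print(route(0.55))  # human_review -> feed the decision back to the model
```

Over time, as moderator decisions retrain the model, fewer comments should fall into the uncertain middle band, which is exactly the time savings described above.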

On average, our studies have found that automated moderation has a higher accuracy rate (92%) than human moderation (81%), and reduces 90% of the time and cost it takes to moderate a community manually.

Quiz Your Users

The Norwegian tech news website NRKbeta encourages thoughtful comments by asking readers to prove they read the whole story by taking a quiz. The organization believes this quiz can weed out users who haven’t read the story, while also giving users time to reflect on how they will comment instead of just typing a response to a shocking headline.

NRKbeta reporter Ståle Grut comments, “When a lot of journalists hit ‘publish’ I think that they see themselves finished with a story. But we see that you’re only halfway through with the article when you’ve published it.” Their goal is to improve articles through collaboration.

Many commenters agreed that this tactic would promote insightful comments. Here’s what they had to say:

“It WILL raise the discourse, and it will improve the journalism too. And why should some poor intern have to sit and delete all the trash? Let a computer do it.”
—Moira

“I would not object to that if it reduced the uninformed and off-topic as well as useless comments”
—Annette

End Anonymous Commenting

When users register for your website through one of their social media accounts via social login, they are less likely to post harassing comments because they can be easily identified.

The social login button also generally increases conversion rates by 20% to 40%, while giving you access to user information that can be used to create targeted messaging.

Increased Engagement = Higher Revenue

If you’re committed to improving the quality of interactions on your website, you may find that using moderators alone can be expensive and time-consuming. Luckily, today we can count on technology to encourage quality comments and eliminate the number of personal attacks. And by improving the quality of interactions on your site, you can look forward to increased engagement, improved brand loyalty and enhanced lifetime value from your users.

Need more help?

If you’re looking to drive engagement and leverage user-generated content, let’s connect.

Connect Now