Does your moderation team have the tools to spot the troll?

As any moderator knows, trolling is more than just a nuisance.

According to Leigh Adams, director of moderation services at Viafoura, ‘trolls become the vocal minority, and can quickly overtake and drown out more relevant conversation.’ When toxic behavior goes unchecked, it can lead forums in a direction that actively harms your brand. Are your moderators equipped to effectively identify and eliminate bad behavior before it causes long-term damage? Here are five important tools they’ll need.

Community Guidelines

The first step to building a safe online community is to create strong community guidelines that are clear, easy to find, and comprehensive. These serve as the first line of defense against would-be trolls.

What should guidelines contain? Experts recommend a zero-tolerance policy for personal attacks; obscene, libelous, or defamatory content; and anything abusive or profane. Beyond that, guidelines should be customized to fit your platform.

When a user behaves badly, community guidelines can help your moderators make an informed decision to flag or ban them. They also provide justification, should a banned user file a complaint. This helps to protect your moderators, which in turn helps them do their jobs with confidence.

Know the Signs

Trolls can be sorted into one of two categories: single-account and multi-account. Both types can be identified quickly and easily if moderators know what to look for. Ensure your moderation team knows how to access user data and search for warning signs.

Single-account trolls are users who cross the line, whether they know it or not. They can be identified by volume: they typically rack up a high number of flags, disabled comments, or posts. They can often be redeemed, amending their behavior after a warning or a single ban.

Multi-account trolls return repeatedly under different aliases. They're typically seeking attention and often leave hints about their identity in order to reignite toxic conversations. Moderators should check new accounts for telltale signs of a returning troll: a high disable rate, or a name, avatar image, or IP address similar to a previously banned account. Some trolls are so eager for attention that they may even post 'I'm back' or 'I was banned'.
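The warning signs above can be combined into a simple screening heuristic. The sketch below is purely illustrative — the field names, weights, and thresholds are assumptions, not part of any real moderation API — but it shows how a team might score a new account against previously banned ones using IP match, username similarity, and a reused avatar:

```python
from difflib import SequenceMatcher

def returning_troll_score(new_acct: dict, banned_accts: list[dict]) -> float:
    """Heuristic score in [0, 1] that a new account is a returning banned troll.
    Fields and weights are illustrative assumptions, not a real Viafoura API."""
    best = 0.0
    for old in banned_accts:
        score = 0.0
        # Same IP address as a banned account is a strong signal
        if new_acct.get("ip") == old.get("ip"):
            score += 0.4
        # Similar username is a classic telltale sign
        name_sim = SequenceMatcher(None, new_acct["name"].lower(),
                                   old["name"].lower()).ratio()
        score += 0.3 * name_sim
        # Reused avatar image (compared by hash here)
        if new_acct.get("avatar_hash") == old.get("avatar_hash"):
            score += 0.3
        best = max(best, score)
    return best

banned = [{"name": "TrollKing99", "ip": "203.0.113.7", "avatar_hash": "a1b2"}]
new_user = {"name": "trollking100", "ip": "203.0.113.7", "avatar_hash": "a1b2"}
```

A score near 1.0 would surface the account for human review rather than auto-ban it — the goal is to focus moderator attention, not replace it.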

Bans

When all warnings have failed to stop bad behavior, moderators should be empowered to ban problematic users. Bans help uphold your community guidelines and make a better space for the well-behaved users who wish to enjoy the platform.

There are several kinds of bans: incremental bans, ghost bans, and permanent bans. Moderators can increase their level of control by understanding the best use case for each type.

An incremental ban is used as a stop-gap or a warning, giving first-time offenders room for improvement. A ghost ban allows the troll to continue posting but removes their comments from the general feed. Permanent bans are just that: permanent. Bans are a powerful tool, so empower your moderators to use them. Provided they apply bans accurately, the confidence to do so will save both time and mental energy.
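The escalation path from incremental to permanent bans can be expressed as a simple ladder. This is a minimal sketch under assumed thresholds and durations (24-hour and 72-hour incremental bans before a ghost ban), not any platform's actual policy:

```python
from dataclasses import dataclass

# Illustrative escalation ladder: (ban_type, duration_in_hours).
# None means the ban has no expiry. Thresholds are assumptions.
BAN_LADDER = [
    ("incremental", 24),
    ("incremental", 72),
    ("ghost", None),       # troll keeps posting, but comments are hidden from the feed
    ("permanent", None),
]

@dataclass
class UserRecord:
    offenses: int = 0

def next_ban(user: UserRecord) -> tuple:
    """Record a new offense and return the appropriate (ban_type, duration)."""
    user.offenses += 1
    idx = min(user.offenses - 1, len(BAN_LADDER) - 1)
    return BAN_LADDER[idx]
```

Encoding the ladder as data rather than scattered if-statements makes the policy easy to audit against your community guidelines and easy to adjust per platform.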

AI Tools

One of the best tools you can give your human moderation team is an AI system that identifies the majority of obvious toxicity, reducing their moderation scope and allowing them to focus on more nuanced situations.

There are a number of intelligent moderation options available, but not all systems are made equal. Many services rely on a 'banned word' list, which misses most contextual issues and 'masking' violations (deliberate misspellings designed to evade filters). Instead, choose a service built on natural language processing or machine learning. These systems adapt as moderators approve or block comments, customizing the algorithm to your platform.
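The gap between a plain banned-word list and smarter filtering is easy to demonstrate. The sketch below is a toy example — the word list and character substitutions are assumptions for illustration, and a real NLP system would weigh context rather than just normalize spelling — but it shows why naive lists miss masked violations:

```python
import re

# Toy banned-word list for illustration only
BANNED = {"idiot", "scum"}

def naive_wordlist_flag(comment: str) -> bool:
    """A plain banned-word list: misses masked spellings like 'id1ot'."""
    return any(w in comment.lower().split() for w in BANNED)

# Undo common character masking: 0->o, 1->i, 3->e, $->s, @->a
LEET = str.maketrans("013$@", "oiesa")

def normalized_flag(comment: str) -> bool:
    """Slightly smarter: strip punctuation and undo common masking.
    Still no substitute for a model that understands context."""
    text = re.sub(r"[^a-z0-9@$ ]", "", comment.lower()).translate(LEET)
    return any(w in text.split() for w in BANNED)
```

Even this small normalization step catches evasions the word list misses, which is why context-aware systems that learn from moderator decisions outperform static lists.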

According to the Viafoura report 'Everyone Is A Troll', communities with advanced moderation software have been proven to see growth: 62% more user likes, 35% more comments per user, and 34% more replies per user.

Support for your moderation team

Your moderation team does the essential but difficult job of guarding your community against incoming hostility. Creating an efficient, positive, and healthy work environment will help them avoid burnout.

The first and most important aspect of a healthy workplace is open communication. Set up a channel (like Slack or Google Meet) and encourage your moderators to reach out for help. This will help your team remain neutral, identify each other's unconscious biases, and share knowledge.

Support your moderation team further by keeping the workload transparent and providing access to frequent breaks. Rest can be productive and, when you're dealing with constant negativity, essential.

At Viafoura, we believe that well-prepared moderators make for strong healthy communities. How many of these tools do your moderators employ? Equip your team with a complete range of moderation strategies and they can build a community that feels safe, supports your brand, and grows along with your platform.

Why turning off the comments is a threat, not a solution, for media companies

Trolls, spam and misinformation have given commenting spaces a bad reputation.

Websites that are flooded with offensive and untrustworthy comments can lose the respect of advertisers and users. Publishers often think that the only solution is to give up and close down their commenting tools.

But shutting off the comments isn’t a solution; it’s a catalyst for serious business problems.

The issue with dropping commenting from your website

The reality is that media companies suffer the second they get rid of their website’s social tools.

“(Without comments, companies) lose a direct connection with their audience (and just provide) passive content for readers, as opposed to creating active opportunities for feedback and opinions,” says Mark Zohar, Viafoura’s president and COO. “That feedback loop between content, publisher and author is critical for high-performing content and re-engaging audiences.”

In a nutshell, media organizations need commenting tools to get closer to their communities and create better experiences for audience members and staff.

Companies that drop their comments aren’t solving anything; they’re just allowing their worst audience members to damage their brands. 

Throw in the fact that 50% of new user registrations happen on web pages with commenting tools, and it’s easy to see why social spaces are must-have website features for all publishers hoping to grow closer to their audiences.


How to run safe and successful commenting spaces

Comment moderation is a publisher’s greatest weapon against offensive user behaviour. The importance of supporting any online social tools with advanced comment moderation services cannot be overstated — it’s what separates the safe, lucrative social spaces from those that are doomed to fail.

Media companies that pair their online commenting spaces with effective moderation give themselves the greatest chance to grow their audiences, customer loyalty and revenue without damaging their reputations.  

“People want to participate in communities where they feel safe,” Zohar explains. “We know from our data that communities and sites with active, positive moderation that’s civil generate engagement on-site.” 

Our data shows that customers protected by Viafoura’s automated moderation services have seen engaged users spend 168 times more time on-site, gain up to 2,000 new monthly registrations, and view 3.6 times more pages than media companies without commenting tools.

“Where (commenting) doesn’t happen, we see a drop-off in engagement,” adds Zohar.

Instead of ditching comments, media companies can draw on moderation to create safe environments that invite journalists, readers and commenters to communicate and connect with each other. 

Nervous that moderation might be too expensive to invest in? 

There are plenty of cost-effective AI-based and human moderation options available. You can also look for an engagement tool provider that includes moderation services directly in their commenting solution for an affordable, hassle-free experience.

Get rewarded with user data

Newsrooms don’t get much value from sending content into a void, where they never hear about it again. Moderated commenting tools give journalists the chance to have positive conversations about their content, get feedback about it from their registered readers, and use that information to make content even more compelling in the future. 

This means that as registered users leave comments on your site, you can expand your user data beyond their general profiles to include information on audience behaviours, interests, sentiments, propensity and purchase intent. 

Once you have that declarative data, you can feed it into your business model. 

“Allowing for users to communicate directly with you and (other readers) around content creates insights, (leading) to rich user profiles that evolve over time as they participate actively in the community,” says Zohar. “By understanding user behaviour on-site as well as user interest and propensity… publishers can improve things like newsletter curation, sign-ups and target users for subscriptions.”

The more first-party data you can get from commenters, the better you can group like-minded users together to personalize their experiences, send them subscription messages and show them relevant ads.  

In other words, your commenting solution has the potential to give you an edge over your competitors. So whatever you do, don’t turn off the comments!

Want to know more about our commenting and engagement solutions? Click here to check out our product suite.
