As any moderator knows, trolling is more than just a nuisance.
According to Leigh Adams, director of moderation services at Viafoura, ‘trolls become the vocal minority, and can quickly overtake and drown out more relevant conversation.’ When toxic behavior goes unchecked, it can lead forums in a direction that actively harms your brand. Are your moderators equipped to effectively identify and eliminate bad behavior before it causes long-term damage? Here are five important tools they’ll need.
Community Guidelines
The first step to building a safe online community is to create strong community guidelines that are clear, easy to find, and comprehensive. These serve as the first line of defense against would-be trolls.
What should guidelines contain? Experts recommend a zero-tolerance policy for personal attacks; obscene, libelous, or defamatory content; and anything abusive or profane. After that, guidelines should be customized to fit your platform.
When a user behaves badly, community guidelines can help your moderators make an informed decision to flag or ban them. They also provide justification, should a banned user file a complaint. This helps to protect your moderators, which in turn helps them do their jobs with confidence.
Know the Signs
Trolls can be sorted into one of two categories: single-account and multi-account. Both types can be identified quickly and easily if moderators know what to look for. Ensure your moderation team knows how to access user data and search for warning signs.
Single-account trolls are users who cross the line, whether they know it or not. They can be identified by volume: they typically boast a high number of flags, disables, or comments. They can be redeemed, and often amend their behavior after a warning or a single ban.

Multi-account trolls return repeatedly under different aliases. They’re typically seeking attention and often leave hints about their identity in order to reignite toxic conversations. Moderators should examine new accounts for the telltale signs of a returning troll: a high disable rate, or a name, avatar image, or IP address similar to a previously banned account’s. Some trolls are so eager for attention that they may even post ‘I’m back’ or ‘I was banned’.
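To make these warning signs concrete, here is a minimal sketch of how a returning-troll check might look. The `UserRecord` fields, weights, and threshold are illustrative assumptions, not Viafoura’s actual implementation; the idea is simply to combine the signals above into a single score a moderator can act on.

```python
from dataclasses import dataclass
from difflib import SequenceMatcher

@dataclass
class UserRecord:
    username: str
    avatar_hash: str     # hash of the avatar image
    ip_address: str
    disable_rate: float  # share of this user's comments disabled by moderators

def returning_troll_score(new: UserRecord, banned: UserRecord) -> float:
    """Heuristic score (0-1): how closely a new account matches a banned one."""
    score = 0.0
    # A similar username is a classic tell (e.g. 'troll_99' -> 'troll_100').
    if SequenceMatcher(None, new.username.lower(), banned.username.lower()).ratio() > 0.8:
        score += 0.3
    # A reused avatar or IP address ties the accounts together directly.
    if new.avatar_hash == banned.avatar_hash:
        score += 0.3
    if new.ip_address == banned.ip_address:
        score += 0.3
    # A brand-new account already racking up disables is suspicious on its own.
    if new.disable_rate > 0.5:
        score += 0.1
    return score

# Example: flag the new account for review if it resembles a banned account.
banned_user = UserRecord("troll_99", "abc123", "203.0.113.7", 0.9)
new_user = UserRecord("troll_100", "abc123", "203.0.113.7", 0.6)
if returning_troll_score(new_user, banned_user) >= 0.6:
    print("Possible returning troll; escalate to a moderator.")
```

In practice, a moderator would compare each new account against the full list of banned accounts and review anything that scores above the threshold.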
Banning
When all warnings have failed to stop bad behavior, moderators should be empowered to ban problematic users. Bans help uphold your community guidelines and create a better space for the well-behaved users who wish to enjoy the platform.
There are several kinds of bans: incremental, ghost, and permanent. Moderators can increase their level of control by understanding the best use case for each type of ban.
An incremental ban is used as a stop-gap or a warning, making sure first-time offenders are given room for improvement. A ghost ban allows the troll to keep posting, but removes their comments from the general feed so no one else sees them. Permanent bans are just that: permanent. Bans are a powerful tool, so empower your moderators to use them; applied accurately, the confidence to ban users saves both time and mental energy.
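As a rough illustration of how the three ban types differ in behavior, the sketch below models them as a simple enum plus two visibility checks. The structure is a hypothetical example, not any particular platform’s API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from enum import Enum

class BanType(Enum):
    INCREMENTAL = "incremental"  # temporary; a warning with room for improvement
    GHOST = "ghost"              # user can post, but nobody else sees it
    PERMANENT = "permanent"      # the account is gone for good

@dataclass
class Ban:
    ban_type: BanType
    expires_at: datetime | None = None  # only incremental bans expire

def can_post(ban: Ban | None) -> bool:
    """Ghost-banned users may still post; their comments are hidden later."""
    if ban is None or ban.ban_type is BanType.GHOST:
        return True
    if ban.ban_type is BanType.INCREMENTAL:
        return datetime.now() >= ban.expires_at
    return False  # permanent

def visible_to_others(ban: Ban | None) -> bool:
    """A ghost ban removes the comment from the general feed."""
    return ban is None or ban.ban_type not in (BanType.GHOST, BanType.PERMANENT)

# Example: a first offense earns a 24-hour incremental ban.
first_offense = Ban(BanType.INCREMENTAL, expires_at=datetime.now() + timedelta(hours=24))
print(can_post(first_offense))  # False until the ban expires
```

The key design point: a ghost ban deliberately leaves `can_post` true, so the troll keeps talking to an empty room instead of immediately creating a new account.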
AI Tools
One of the best tools you can give your human moderation team is an AI system that identifies the majority of obvious toxicity, reducing their moderation scope and allowing them to focus on more nuanced situations.
There are a number of intelligent moderation options available, but not all systems are created equal. Many services rely on a ‘banned word’ list, which misses most contextual issues and ‘masking’ violations. Instead, choose a service built on natural language processing or machine learning. These systems allow the AI to adapt as moderators approve or block comments, tailoring the algorithm to your platform.
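To show what ‘adapting as moderators approve or block comments’ can look like under the hood, here is a minimal sketch using scikit-learn’s incremental learning. It is a generic illustration with assumed names and thresholds, not Viafoura’s proprietary system; the training labels would come from your moderators’ real approve/block decisions.

```python
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

# Hashing avoids a fixed vocabulary, so new slang and masked words
# ('sh!t', 'id10t') still map to features instead of being silently dropped.
vectorizer = HashingVectorizer(n_features=2**18, alternate_sign=False)
classifier = SGDClassifier(loss="log_loss")

def learn_from_moderators(comments: list[str], blocked: list[int]) -> None:
    """Update the model each time moderators approve (0) or block (1) comments."""
    X = vectorizer.transform(comments)
    classifier.partial_fit(X, blocked, classes=[0, 1])

def needs_human_review(comment: str, threshold: float = 0.5) -> bool:
    """Route likely-toxic comments to the human queue; auto-approve the rest."""
    prob_toxic = classifier.predict_proba(vectorizer.transform([comment]))[0, 1]
    return prob_toxic >= threshold

# Example: the model adapts as moderator decisions stream in.
learn_from_moderators(
    ["great article, thanks!", "you are an id10t"],
    [0, 1],
)
print(needs_human_review("what an id10t take"))
```

A real deployment would retrain on batches of decisions and tune the threshold so human moderators only see the genuinely ambiguous cases.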
According to the Viafoura report ‘Everyone Is A Troll’, communities with advanced moderation software see measurable growth: 62% more user likes, 35% more comments per user, and 34% more replies per user.
Support for Your Moderation Team
Your moderation team does the essential but difficult job of guarding your community against incoming hostility. Creating an efficient, positive, and healthy work environment will help them avoid burnout and keep doing that job well.
The first and most important aspect of a healthy workplace is open communication. Set up a channel (like Slack or Google Meet) and encourage your moderators to reach out for help. This will help your team remain neutral, identify each other’s unconscious biases, and ensure knowledge is shared.
Support your moderation team further by keeping workloads transparent and providing access to frequent breaks. Rest is productive, and when you’re dealing with constant negativity, it’s essential.
At Viafoura, we believe that well-prepared moderators make for strong, healthy communities. How many of these tools do your moderators employ? Equip your team with a complete range of moderation strategies and they can build a community that feels safe, supports your brand, and grows along with your platform.