Does your moderation team have the tools to spot the troll?

As any moderator knows, trolling is more than just a nuisance.

According to Leigh Adams, director of moderation services at Viafoura, ‘trolls become the vocal minority, and can quickly overtake and drown out more relevant conversation.’ When toxic behavior goes unchecked, it can lead forums in a direction that actively harms your brand. Are your moderators equipped to effectively identify and eliminate bad behavior before it causes long-term damage? Here are five important tools they’ll need.

Community Guidelines

The first step to building a safe online community is to create strong community guidelines that are clear, easy to find, and comprehensive. These serve as the first line of defense against would-be trolls.

What should guidelines contain? Experts recommend a zero-tolerance policy for personal attacks; obscene, libelous, or defamatory content; and anything abusive or profane. Beyond that, guidelines should be customized to fit your platform.

When a user behaves badly, community guidelines can help your moderators make an informed decision to flag or ban them. They also provide justification, should a banned user file a complaint. This helps to protect your moderators, which in turn helps them do their jobs with confidence.

Know the Signs

Trolls fall into one of two categories: single-account and multi-account. Both types can be identified quickly and easily if moderators know what to look for. Ensure your moderation team knows how to access user data and search for warning signs.

Single-account trolls are users who cross the line, whether they know it or not. They can be identified by volume: they typically boast a high number of flags, disables, or comments. They can be redeemed, and often amend their behavior after a warning or a single ban.

Multi-account trolls return repeatedly under different aliases. They’re typically seeking attention and often leave hints about their identity in order to reignite toxic conversations. Moderators should examine new accounts for telltale signs of a returning troll: a high disable rate, or a name, avatar image, or IP address similar to a previously banned account’s. Some trolls are so eager for attention that they may even post ‘I’m back’ or ‘I was banned’.
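If your platform exposes this kind of user data through an export or API, those checks can even be scripted. The sketch below is a minimal illustration, not Viafoura’s tooling; the field names (username, avatar_hash, ip_address, disable_rate) are hypothetical stand-ins for whatever data your platform actually provides, and the weights are arbitrary.

```python
from dataclasses import dataclass
from difflib import SequenceMatcher

@dataclass
class Account:
    # Hypothetical fields; adapt to the user data your platform exposes.
    username: str
    avatar_hash: str      # hash of the profile image
    ip_address: str
    disable_rate: float   # share of the user's comments disabled by moderators

def returning_troll_score(new: Account, banned: list[Account]) -> float:
    """Heuristic score (0-1): how strongly a new account resembles a banned one."""
    score = 0.0
    if new.disable_rate > 0.5:                     # unusually high disable rate
        score += 0.25
    for old in banned:
        name_similarity = SequenceMatcher(None, new.username.lower(),
                                          old.username.lower()).ratio()
        if name_similarity > 0.8:                  # near-identical alias
            score += 0.25
        if new.avatar_hash == old.avatar_hash:     # reused profile image
            score += 0.25
        if new.ip_address == old.ip_address:       # same network as a banned user
            score += 0.25
    return min(score, 1.0)

# Example: queue suspicious new accounts for human review rather than auto-banning.
banned = [Account("troll_king", "a1b2c3", "203.0.113.7", 0.9)]
suspect = Account("troll_k1ng", "a1b2c3", "203.0.113.7", 0.6)
if returning_troll_score(suspect, banned) >= 0.5:
    print("Queue for moderator review")
```

A scoring approach like this should only surface candidates for a human decision; the final call on whether an account is a returning troll stays with your moderators.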

 

Banning

When all warnings have failed to stop bad behavior, moderators should be empowered to ban problematic users. Bans help uphold your community guidelines and make a better space for the well-behaved users that wish to enjoy the platform.

There are several kinds of bans: incremental bans, ghost bans, and permanent bans. Moderators can increase their level of control by understanding the best-use scenario for each type.

An incremental ban is used as a stop-gap or a warning, giving first-time offenders room for improvement. Ghost bans allow the troll to keep posting but remove their comments from the general feed. Permanent bans are just that: permanent. Bans are a powerful tool, so empower your moderators to use them. Provided bans are applied accurately, the confidence to issue them will save your moderators time and mental energy.
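As a rough sketch of how those three ban types might be modeled in a platform’s own tooling (this is a hypothetical illustration, not any vendor’s API), the escalation ladder and record fields below are assumptions you would tune to your own guidelines:

```python
from datetime import datetime, timedelta
from enum import Enum, auto

class BanType(Enum):
    INCREMENTAL = auto()  # temporary, with escalating duration for repeat offenses
    GHOST = auto()        # user can keep posting, but comments are hidden from the public feed
    PERMANENT = auto()    # account is blocked for good

# Hypothetical escalation ladder for incremental bans: 1 day, 7 days, 30 days.
ESCALATION = [timedelta(days=1), timedelta(days=7), timedelta(days=30)]

def apply_ban(ban_type: BanType, prior_offenses: int = 0) -> dict:
    """Return a ban record; persistence and enforcement are left to the platform."""
    record = {"type": ban_type.name, "issued_at": datetime.utcnow()}
    if ban_type is BanType.INCREMENTAL:
        step = min(prior_offenses, len(ESCALATION) - 1)
        record["expires_at"] = record["issued_at"] + ESCALATION[step]
    elif ban_type is BanType.GHOST:
        record["visible_to_author_only"] = True   # the troll still sees their own comments
    # PERMANENT needs no expiry: the account simply stays banned.
    return record

print(apply_ban(BanType.INCREMENTAL, prior_offenses=1))  # second offense: 7-day ban
```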

AI Tools

One of the best tools you can give your human moderation team is an AI system that identifies the majority of obvious toxicity, reducing their moderation workload and allowing them to focus on more nuanced situations.

There are a number of intelligent moderation options available, but not all systems are created equal. Many services rely on a ‘banned word’ list, which won’t catch most contextual issues or ‘masking’ violations. Instead, choose a service built on natural language processing or machine learning. These systems allow the AI to adapt as moderators approve or block comments, customizing the algorithm to your platform.
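To make the adapt-from-moderator-decisions idea concrete, here is a minimal sketch of a classifier that is periodically retrained on past approve/block decisions and only auto-actions confident cases. It is a toy illustration using scikit-learn, not Viafoura’s system; the thresholds and example comments are assumptions.

```python
# Toy illustration of moderation that adapts to moderator decisions (not any vendor's system).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Comments previously reviewed by human moderators: 1 = blocked, 0 = approved.
history = [
    ("Thanks for the thoughtful article!", 0),
    ("You are all idiots and should be silenced", 1),
    ("I disagree with the author's conclusion", 0),
    ("Go back where you came from", 1),
]
texts, labels = zip(*history)

# Retrain whenever new moderator decisions accumulate, so the model
# gradually reflects your platform's own guidelines rather than a static word list.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score an incoming comment; anything uncertain goes to the human queue.
prob_toxic = model.predict_proba(["you people are idiots"])[0][1]
if prob_toxic > 0.9:
    action = "auto-block"
elif prob_toxic < 0.1:
    action = "auto-approve"
else:
    action = "send to human moderator"
print(action)
```

The key design point is the middle band: the system handles the obvious cases at both ends, and everything ambiguous is routed to a person.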

According to the Viafoura report ‘Everyone Is A Troll’, communities with advanced moderation software have been shown to see growth: 62% more user likes, 35% more comments per user, and 34% more replies per user.

Support for Your Moderation Team

Your moderation team does the essential but difficult job of guarding your community against incoming hostility. Creating an efficient, positive, and healthy work environment will help them avoid burnout and maintain positivity.

The first and most important aspect of a healthy workplace is open communication. Set up a channel (such as Slack or Google Meet) and encourage your moderators to reach out for help. This will help your team remain neutral, identify each other’s unconscious biases, and ensure knowledge is shared.

Support your moderation team further by keeping the workload transparent and providing frequent breaks. Rest can be productive, and when you’re dealing with constant negativity, it’s essential.

At Viafoura, we believe that well-prepared moderators make for strong healthy communities. How many of these tools do your moderators employ? Equip your team with a complete range of moderation strategies and they can build a community that feels safe, supports your brand, and grows along with your platform.

Moderation in a Time of Unprecedented Change Panel Discussion

On August 20th, Mike Blinder (Owner & Publisher, Editor and Publisher Magazine) was joined in conversation by Leigh Adams (Director of Moderation Services, Viafoura) and Dustin Block (Audience Development Lead, Graham Media) to discuss the importance of moderation in an increasingly online world.

A Changing Landscape

Our three experts exchanged stories about how the move into a more digital world, where millions of people are working from home and looking online for news, has elevated the importance of community engagement and moderation.

“[We’ve] moved into a digital world, Covid brought millions of people home either through unemployment or job loss or through remote work…people are online more than ever before…forcing these important conversations to happen in this digital space” – Leigh Adams, Director of Moderation Services, Viafoura.

 

With hot-button issues such as coronavirus, social injustice, and political tension dominating the zeitgeist, the conditions are a perfect storm for trolls to spread misinformation and cause unrest.

 

“It makes a perfect storm for trolls because this is exactly the kind of inflammatory conversations they want to be involved in. They want to prey on the misinformed, they want to prey on people’s insecurities – make them more afraid. And media brands simply can’t afford to walk away from the comments anymore.” – Leigh Adams

 

Dealing with these digital trolls has become an essential part of running any online publication. Media brands can no longer afford to ignore these issues, as it isn’t uncommon for advertisers to walk away because they don’t want to be associated with the content found in a comments section. It’s also been shown that a more welcoming, engaging comment space is more likely to grow your loyal audience. These loyal users are looking for a place to engage with others and are far more likely to return, watch videos, or click on advertisements.

 

“From an audience development perspective… that loyal audience, these registered users, they vastly outperform our anonymous audience. I spend a lot of time creating features for this registered audience. They watch video more often, they click ads more often, they come back to the site more often…and one of the main features that this group does is comment” – Dustin Block, Audience Development Lead, Graham Media

Making Your Digital Properties Brand Safe

The panel agreed that artificial intelligence moderation can cut down a lot of spam and obviously toxic comments, but it should be supplemented with human intervention. People are clever and can find ways around scripted filters.

 

“It’s a marvel how clever people can be about getting some sort of offensive or off-topic remark to be included” – Dustin Block, Graham Media

 

Instead, the best practice is to leverage some form of automation to cut down the volume of comments, in conjunction with a moderation team following externalized guidelines that remove the risk of bias while creating a safe space where anyone would feel comfortable engaging with others.

 

“If you’ve externalized your brand guidelines, you’ve said this is what we want our comments to be, these are the type of comments that we do want, these are the type of comments we do not want. And you’ve codified them and shared them in a way that’s outside the journalist’s hands, I think that’s going to make for much more effective moderation.” – Leigh Adams, Viafoura

 

Media companies also shouldn’t fear disciplining their users. For a long time, there’s been a fear that removing comments or banning users would drive away audience members. The truth, however, is that someone who leaves hundreds of disruptive, antagonistic comments isn’t going to be a valuable member of your community and is more likely to turn others away from engaging. There’s no freedom of speech within the private sphere, and that includes web forums. Those who aren’t following your brand’s community guidelines should be dealt with swiftly.

That being said, it’s important not to penalize the misinformed. Let natural conversations happen; a healthy discussion or debate in the comments section can help expose people to new information. The healthiest online communities are the ones where loyal users respond to potential trolls and challenge their claims with facts and open dialogue.

 

“I work with our Trust Index, which is a team that works on misinformation, responding with links to articles to galvanize the community to counter bad information.” – Dustin Block, Graham Media

 

One of the challenges of human moderation the panel discussed was the need to constantly learn new language. It’s important to know what people are talking about, the common terms being used, and what each acronym means.

 

“We’re applying [these learnings] across all of our clients and we’re making sure that’s [being shared out] not just the brand we’re noticing it on but all of our teams” – Leigh Adams, Viafoura

 

Moderators should be trained not only on language but also on elevating the positive rather than just removing the negative. Viafoura’s “Editor’s Picks” tool for highlighting particularly excellent comments was mentioned as an example of how to reward positive contributions from your community. It’s also important that moderators and journalists who respond to comments know the boundaries of their community guidelines, keep an informative tone, and avoid sarcasm or dismissive language.

Making Journalism Better With User Generated Content

Part of the public service job of journalism is to educate people, and community moderation fits into that mission as well. While it can at times feel exhausting to read hate speech in the comments section, it was pointed out that most moderators realize they’re making the digital world better for others with this work. If a moderator has to read a hateful comment so that hundreds of others don’t, it helps protect marginalized people from further harm and lets them find comfort in a safer online space.

Lastly, the panel was eager to point out how good moderation can grow your community and ultimately help journalists find better stories. One excellent example was given of a commenter who pointed out that the cars involved in a road accident were actually $250,000 luxury vehicles, which put a whole new spin on the story.

 

“I love comments for many reasons. One: it’s just a place to talk to your audience. And they’ll share so much through those comments…you can make your reporting better, and you can better inform your audience just by engaging.” – Dustin Block, Graham Media

 

In general, there’s a higher quality of discussion to be found on your own website than on social platforms like Facebook and Instagram. Companies should stop relying on these larger platforms for engagement and instead bring users into their own community, giving them a positive experience where they’re more likely to subscribe, click ads, and interact with others.

Engagement is the future of journalism. Gone are the days of reporters simply talking at their audience; an open dialogue between the organization and its community is now the expected norm. In this unprecedented shift into an increasingly online landscape, quality moderation is essential for those looking to grow their communities and become a landing spot for people who want to discuss, analyze, and share online.
