Does your moderation team have the tools to spot the troll?

As any moderator knows, trolling is more than just a nuisance.

According to Leigh Adams, director of moderation services at Viafoura, ‘trolls become the vocal minority, and can quickly overtake and drown out more relevant conversation.’ When toxic behavior goes unchecked, it can lead forums in a direction that actively harms your brand. Are your moderators equipped to effectively identify and eliminate bad behavior before it causes long-term damage? Here are five important tools they’ll need.

Community Guidelines

The first step to building a safe online community is to create strong community guidelines that are clear, easy to find, and comprehensive. These serve as the first line of defense for would-be trolls.

What should guidelines contain? Experts recommend a zero-tolerance policy for personal attacks; obscene, libelous, or defamatory content; and anything abusive or profane. Beyond that, guidelines should be customized to fit your platform.

When a user behaves badly, community guidelines can help your moderators make an informed decision to flag or ban them. They also provide justification, should a banned user file a complaint. This helps to protect your moderators, which in turn helps them do their jobs with confidence.

Know the Signs

Trolls can be sorted into one of two categories: single-account and multi-account. Both types can be identified quickly and easily if moderators know what to look for. Ensure your moderation team knows how to access user data and search for warning signs.

Single-account trolls are users who cross the line, whether they know it or not. They can be identified by volume: they typically boast a high number of flags, disables, or comments. They can be redeemed, and often amend their behavior after a warning or a single ban.

Multi-account trolls will return repeatedly under different aliases. They’re typically seeking attention and often leave hints about their identity in order to reignite toxic conversations. Moderators should examine new accounts for the telltale signs of a returning troll: a high disable rate, or a name, avatar image, or IP address similar to a previously banned account. Some trolls are so eager for attention that they may even post ‘I’m back’ or ‘I was banned’.
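
For teams that review account data programmatically, these signals are straightforward to automate. Here is a minimal sketch in Python; the account fields (name, avatar_hash, ip, disable_rate) and the thresholds are hypothetical illustrations, not features of any specific moderation platform.

```python
# A minimal sketch of the warning-sign checks above. Field names and
# thresholds are hypothetical, not from any particular moderation tool.
from difflib import SequenceMatcher

def returning_troll_signs(new_account: dict, banned_accounts: list[dict],
                          name_similarity: float = 0.8,
                          max_disable_rate: float = 0.3) -> list[str]:
    """List the warning signs a new account shares with banned accounts."""
    signs = []
    if new_account.get("disable_rate", 0.0) >= max_disable_rate:
        signs.append("high disable rate")
    for banned in banned_accounts:
        if new_account["ip"] == banned["ip"]:
            signs.append(f"shares an IP with banned user {banned['name']}")
        if new_account.get("avatar_hash") == banned.get("avatar_hash"):
            signs.append(f"shares an avatar with banned user {banned['name']}")
        ratio = SequenceMatcher(None, new_account["name"].lower(),
                                banned["name"].lower()).ratio()
        if ratio >= name_similarity:
            signs.append(f"name resembles banned user {banned['name']}")
    return signs
```

Any account that returns one or more signs can be queued for a closer human look rather than banned automatically.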


Banning

When all warnings have failed to stop bad behavior, moderators should be empowered to ban problematic users. Bans help uphold your community guidelines and make a better space for the well-behaved users that wish to enjoy the platform.

There are several kinds of bans: incremental, ghost, and permanent. Moderators can increase their level of control by understanding the best use case for each type of ban.

An incremental ban is used as a stop-gap or a warning, giving first-time offenders room for improvement. Ghost bans allow the troll to continue posting but remove their comments from the general feed. Permanent bans are just that: permanent. Bans are a powerful tool, and moderators should be empowered to use them; applied accurately and with confidence, bans save both time and mental energy.
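
For platforms that implement bans in code, the three types map naturally onto a small data model. The sketch below is illustrative only; the names, fields, and escalation schedule are assumptions, not a real moderation API.

```python
# A minimal sketch of the three ban types described above. The class,
# fields, and escalation schedule are illustrative assumptions.
from datetime import datetime, timedelta
from enum import Enum

class BanType(Enum):
    INCREMENTAL = "incremental"  # temporary; escalates on repeat offenses
    GHOST = "ghost"              # user keeps posting, comments stay hidden
    PERMANENT = "permanent"      # account is blocked for good

def apply_ban(user_id: str, ban_type: BanType, prior_bans: int = 0) -> dict:
    """Return a ban record matching the semantics of each ban type."""
    if ban_type is BanType.INCREMENTAL:
        # Escalate duration with each offense: 1 day, then 7, then 30.
        duration = timedelta(days=[1, 7, 30][min(prior_bans, 2)])
        return {"user": user_id, "type": ban_type.value,
                "expires": datetime.utcnow() + duration}
    if ban_type is BanType.GHOST:
        # Posting still works; comments are excluded from public feeds.
        return {"user": user_id, "type": ban_type.value, "hidden": True}
    return {"user": user_id, "type": ban_type.value, "expires": None}
```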

AI Tools

One of the best tools you can give your human moderation team is an AI system that identifies the majority of obvious toxicity, reducing their moderation scope and allowing them to focus on more nuanced situations.

There are a number of intelligent moderation options available, but not all systems are created equal. Many services use a ‘banned word’ list, which won’t catch most contextual issues or ‘masking’ violations. Instead, choose a service built on natural language processing or machine learning. These systems allow the AI to adapt as moderators approve or block comments, customizing the algorithm to your platform.
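
In practice, that feedback loop can be as simple as folding each approve/block decision back into an online classifier. The sketch below is a minimal illustration assuming scikit-learn; the labels and wiring are assumptions, not a description of any vendor’s system.

```python
# A minimal sketch of a moderator-in-the-loop feedback cycle: each
# approve/block decision becomes a training example, so the classifier
# adapts to the platform over time. Illustrative only.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vectorizer = HashingVectorizer(n_features=2**18)
model = SGDClassifier(loss="log_loss")  # log loss enables probabilities
CLASSES = ["approve", "block"]

def record_moderator_decision(comment: str, decision: str) -> None:
    """Fold a moderator's approve/block call back into the model."""
    X = vectorizer.transform([comment])
    model.partial_fit(X, [decision], classes=CLASSES)

def needs_human_review(comment: str, threshold: float = 0.9) -> bool:
    """Route only low-confidence comments to the human queue.

    Assumes the model has already seen examples of both classes.
    """
    X = vectorizer.transform([comment])
    return model.predict_proba(X).max() < threshold
```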

According to the Viafoura report ‘Everyone Is A Troll’, communities with advanced moderation software see measurable growth: 62% more user likes, 35% more comments per user, and 34% more replies per user.

Support for Your Moderation Team

Your moderation team does the essential but difficult job of guarding your community against incoming hostility. Creating an efficient, positive, and healthy work environment will help them avoid burnout and maintain positivity.

The first and most important aspect of a healthy workplace is open communication. Set up a channel (like Slack or Google Meet) and encourage your moderators to reach out for help. This will help your team remain neutral, identify one another’s unconscious biases, and ensure knowledge is shared.

Support your moderation team further by keeping the workload transparent and providing access to frequent breaks. Rest can be productive, and when you’re dealing with constant negativity, it’s essential.

At Viafoura, we believe that well-prepared moderators make for strong healthy communities. How many of these tools do your moderators employ? Equip your team with a complete range of moderation strategies and they can build a community that feels safe, supports your brand, and grows along with your platform.

The Twitter takeover — another reason to build engaged and active communities on your owned and operated properties

On Monday, April 25, 2022, Twitter’s board accepted billionaire Elon Musk’s offer to buy the social media company and take it private. The announcement ended a weeks-long media firestorm that began when Musk offered to buy the company for $44 billion. Twitter stockholders will receive $54.20 for each share of common stock, a significant premium over the stock’s price from just months earlier.

Musk has often referred to himself as a “free speech supporter” and has been a loud critic of content moderation policies put in place by organizations, like Twitter, to stem the flow of misinformation, enforce authenticity and prevent harassment.

Musk also seems to believe that he’s advancing the free speech movement by taking over the social platform. For instance, he claims that he wants “to make Twitter better than ever by enhancing the product with new features, making the algorithms open source to increase trust, defeating the spam bots and authenticating all humans.”

Generally, the news has raised eyebrows.

Between Musk’s recent statements and the implied return of users currently banned from the platform, many believe he’s bound to run into conflict with multiple regulators. Now, Thierry Breton, the European Union’s commissioner for the internal market, has warned Elon Musk that Twitter must follow the rules on moderating illegal and harmful content online.

What does this mean for publishers dependent on social media platforms like Twitter? Musk has said he plans to have less content moderation on Twitter. This means that publishers will soon be at the mercy of his social media strategies, which will be based on his own definition of truthful or accurate news and a free-sharing audience.

The bottom line is that publishers must be in control of their community guidelines and content moderation. In other words, they need to be in a position where they can protect against misinformation and personal attacks on their journalists.

For this reason, publishers need to invest in building their communities and audience conversations away from social media. After all, there’s no better way to keep audience engagement where it belongs — directly on publisher-owned websites!

Many digital publishers have already started moving to adopt on-site engagement strategies and solutions, including real-time conversations and live Q&As, to grow audiences, gather first-party data and ultimately drive sustainable monetization. However, Elon Musk’s purchase of Twitter has highlighted the need to accelerate that strategy.

Rest assured that wherever Twitter goes from here, Viafoura will be ready to clear a path for you to build an engaged and safe online community.

4 ways to know if the comment moderation solution you need is also aligned with your editorial brand

Choosing the right moderation solution can be challenging, and many organizations find that their current moderation solution isn’t up to the standards of their brand. When your comment moderation solution is not aligned with your brand, it reflects poorly on you and alienates your user community. 

If you want to build a thriving brand, you need to offer an exceptional experience for your audience. That means not settling for mediocre moderation, and instead adopting a community engagement solution with a full suite of tools at your disposal to moderate your community, including shadow banning, IP lookup, troll management, likes, and follows.

A comment moderation solution that’s truly aligned with your brand doesn’t just seamlessly blend in with your environment; it also reflects your brand’s value and enhances your business.

Research shows that when you implement engagement solutions across your platform, anonymous users spend more time on your site and become 25.4 times more likely to convert. 

This article will examine some of the core features and attributes of an on-brand moderation solution that can protect your community, your newsroom, and your brand as you grow over the long term. 

1. Predictive analytics

Using a solution with predictive analytics is vital for gaining better insight into your community, so you understand what matters most to them. Without it, your content strategy will be based on guesswork.

Your ability to offer relevant content and experiences to users will determine the strength of your brand. If you’re a brand that offers up-to-the-minute coverage on topics that interest users, they’re going to engage with your brand more than they would if you offer them stories that are better suited to another target audience.

2. Are you working with a vendor or a partner?

If you’re looking for a solution with the capacity to evolve with your brand over the long term, you need to ensure you’re working with a partner rather than a vendor. A vendor will place ads across your digital assets to maximize online visibility and offer revenue share, but will treat you as more of a financial investment than a client.

A true partner will work alongside you on a SaaS payment model to help you innovate new strategies that drive registrations, and acquire unique user data that allows you to enhance your brand and the way you serve customers.


3. Automated moderation

When building a user community on your website, you need to have a strategy to deal with toxicity if you want to protect your users and your brand. Failure to moderate toxic comments can be extremely damaging to your organization’s reputation. 

For instance, Twitter’s inability to deal with hateful comments has damaged the organization’s brand: users have called the platform out as a haven for toxicity, with Amnesty International going as far as branding the site “a toxic place for women.”

As a result, it’s essential to have a chat room with automated moderation so you can keep the conversation free of abuse, harassment, hate, and uncivil comments in real time.

It’s important to remember that a quality moderation solution isn’t a banned word list; it’s a complete AI-driven solution with semantic moderation that can infer the intent and meaning of uncivil comments independently.
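
To see why this matters, the deliberately naive sketch below shows how a word-list filter behaves; the banned words are made up for illustration. A semantic model scores the intent of all three comments rather than matching strings.

```python
# A deliberately naive sketch showing where a banned-word list fails:
# masked spellings and insults that use no listed word slip through,
# which is exactly what semantic moderation is meant to catch.
BANNED_WORDS = {"idiot", "stupid"}

def wordlist_flags(comment: str) -> bool:
    """Flag a comment only if it contains a listed word verbatim."""
    return any(word in comment.lower().split() for word in BANNED_WORDS)

print(wordlist_flags("you idiot"))       # True: exact match caught
print(wordlist_flags("you 1d10t"))       # False: masking slips through
print(wordlist_flags("go away, clown"))  # False: intent missed entirely
```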

4. First-party data collection

Any effective community engagement and moderation solution should have the ability to gather first-party data. 

Deploying an engagement tool that can collect first-party data is vital to making sure that you can develop detailed insights into your audience, which you can use to offer personalized content recommendations and news feeds that keep them engaged. 

For example, simply offering your users a personalized news feed can help you generate 3.15 times more page views.

By collecting first-party data, you can identify which topics users are interested in and which authors they’re most likely to follow, and then recommend pieces that will both interest them and keep them engaged on your site.
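
As a simple illustration, an interest profile can be built directly from first-party engagement events and used to rank articles. The event and article shapes below (topic, weight, topics) are hypothetical, not a specific product’s schema.

```python
# Illustrative sketch: building an interest profile from first-party
# engagement events and ranking articles against it.
from collections import Counter

def build_interest_profile(events: list[dict]) -> Counter:
    """Tally topic engagements (comments, likes, follows) for one user."""
    profile = Counter()
    for event in events:
        profile[event["topic"]] += event.get("weight", 1)
    return profile

def recommend(articles: list[dict], profile: Counter, k: int = 5) -> list[dict]:
    """Rank articles by overlap between their topics and the user's profile."""
    def score(article: dict) -> int:
        return sum(profile[topic] for topic in article["topics"])
    return sorted(articles, key=score, reverse=True)[:k]
```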

Elevating your brand with comment moderation

A comment moderation solution that is aligned with your brand will elevate the user experience and make your audience trust you even more. 

Features like AI-driven predictive analytics, first-party data collection and automated moderation give you a strong foundation to start building a safe and thriving user community.

Anything less, and you run the risk of offering a poorly optimized, irrelevant, and toxic community experience for your users and your journalists. 
