Last updated October 28th, 2019
- Reward users to encourage desired conversations
- Offer moderation tools to your users
- Use artificial intelligence in conjunction with your human efforts
- Quiz your users to weed out those who haven’t read the full story
- Stop anonymous comments
With the prevalence of online trolls, some organizations have thrown up their hands and given up on the comment section altogether. But doing so, even temporarily, has major drawbacks for organizations and their users.
As Carrie Lysenko — Head of Digital for The Weather Network — pointed out in an RTNDA Canada panel on engagement, turning off comments can result in a significant drop in pageviews and attention time. This echoes Viafoura’s own findings that brands with commenting can increase pageviews by 248% and attention time by 364%. This increased engagement leads to higher registrations and subscriptions since engaged users are more likely to pay for premium services.
And while managing online communities has traditionally been cumbersome and expensive, today there are many cost-effective ways to reduce or eliminate trolling. For media companies, these new tools allow you to not only keep your comment section open, but also to capitalize on your user-generated content.
Reward Users to Promote Civil Comments
Encourage users to submit thoughtful comments by rewarding your best commenters with a trusted-user badge. With this status, an icon will appear beside the user’s name for others to see. These trusted users are also able to publish their comments in real time without being moderated.
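The trusted-user mechanism described above can be sketched in a few lines. This is an illustrative approximation only: the threshold value, field names and `route_comment` helper are assumptions for the sketch, not any vendor's actual API.

```python
from dataclasses import dataclass

# Assumed for illustration: number of approved comments needed to earn the badge.
TRUSTED_THRESHOLD = 50

@dataclass
class User:
    name: str
    approved_comments: int = 0

    @property
    def is_trusted(self) -> bool:
        # The badge is earned through a track record of approved comments.
        return self.approved_comments >= TRUSTED_THRESHOLD

def route_comment(user: User, text: str, pending_queue: list) -> str:
    """Trusted users publish in real time; everyone else waits for moderation."""
    if user.is_trusted:
        return "published"
    pending_queue.append((user.name, text))
    return "pending"
```

The key design point is that trust is earned, not granted by default, so the bypass rewards a history of civil participation rather than removing moderation.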
Another way to reward users is by giving their comment the editor’s pick status. These comments can be featured in prominent positions on your website to model the types of comments you want to receive.
This is also beneficial for SEO, because comments that are placed higher on your webpage will get indexed by Google, and the keywords in those comments may be a closer match to users’ own search terms than those used by a journalist.
Create articles from users’ comments
Many organizations today — including The New York Times, The Washington Post, and the Canadian Broadcasting Corporation (CBC) — are creating stories entirely from their users’ comments. These stories not only reward commenters for their insightful posts, but are cost-effective, quick to publish and receive a surprisingly high amount of attention time and comments. Some even attract more comments than the original piece from which they were taken.
To see the impact of these articles, we tracked the number of comments for eight user-generated blog posts in CBC’s Revenge of the Comment Section, comparing those to the number of comments for their original articles.
The results are depicted in the chart below:
It’s notable that while almost all of the original stories received more comments, the user-generated articles often weren’t far behind. And in one instance, for Story 2, there were more comments for the user-generated article (601,000) than for its original article (343,000). Readers also spent approximately 2.3x more time on the former page.
That’s remarkable, since these articles can be created at a fraction of the time and cost it takes a journalist to produce a new article from scratch.
Offer Content Moderation Tools to Your Users and Managers
Comment flagging
Allow users to easily flag comments they find offensive, using a noticeable red flag icon. When a comment receives a predetermined number of flags, it enters a queue for review by a moderator, who decides on the appropriate action.
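The threshold-and-queue flow could be sketched as follows. This is a minimal sketch, assuming a per-comment set of flaggers (so repeat flags from one user don't count twice) and an invented threshold of three; none of these names come from a specific product.

```python
# Assumed for illustration: flags required before a comment is queued for review.
FLAG_THRESHOLD = 3

class Comment:
    def __init__(self, comment_id: str, text: str):
        self.id = comment_id
        self.text = text
        self.flaggers = set()  # user ids; a set so each user counts once

def flag(comment: Comment, user_id: str, review_queue: list) -> bool:
    """Record a flag; enqueue the comment once the threshold is first reached."""
    comment.flaggers.add(user_id)
    if len(comment.flaggers) == FLAG_THRESHOLD:
        review_queue.append(comment.id)
        return True  # now awaiting a moderator's decision
    return False
```

Counting distinct flaggers rather than raw clicks is what keeps one aggrieved user from single-handedly pushing a comment into the moderation queue.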
Timed user banning
Give short “timeouts” ranging from a few hours to several days or months, and notify banned users of the reason to help them improve the quality of their comments. Alternatively, repeat offenders can be banned permanently.
Dislike button
The dislike button allows users to express their dislike for a comment without having to flag it (which requires a moderator’s time and resources). We found that this button can reduce flagging by 50% within two weeks of implementation.
Moderation games
Both The New York Times and The Guardian have created games that allow readers to try moderating content. Users are tasked with approving or rejecting comments and providing reasoning for their decisions. This is not only enjoyable for users, but also eases some of the burden on moderators.
Use AI Moderation to Eliminate Online Harassment
Whether your organization employs dedicated moderators or tasks other employees with removing the “trash,” you could be saving countless hours and dollars with automated moderation.
Automated moderation uses natural-language processing and artificial intelligence to automatically categorize and eliminate trolling, spam and online harassment.
Viafoura’s Automated Moderation is programmed with over six million variations of problematic words or phrases. This means that it’s able to determine both the subject matter and the sentiment behind users’ comments, detecting and eliminating spam, foul language, abuse, personal attacks and other uncivil comments before other users can even see them.
If the system encounters a new word or sentence that it’s unsure of, it flags the instance for a moderator to review. As moderators approve or reject new words, the algorithm learns these rules through machine learning and becomes smarter over time.
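The approve/reject/escalate loop described above can be approximated in a toy form. To be clear about assumptions: a production system uses NLP models over millions of phrase variations; here two small word lists stand in for the classifier, and a moderator's ruling expanding a list stands in for the learning step.

```python
# Toy stand-ins for a trained classifier (invented words, for illustration only).
BLOCKED = {"spamword", "slur"}
ALLOWED = {"great", "thanks", "article", "agree"}

def moderate(comment: str, review_queue: list) -> str:
    """Auto-reject known-bad content, auto-approve known-good, escalate the rest."""
    words = set(comment.lower().split())
    if words & BLOCKED:
        return "rejected"        # removed before other users see it
    unknown = words - ALLOWED
    if unknown:
        review_queue.append(comment)  # unsure: hand off to a human moderator
        return "pending"
    return "approved"

def moderator_decision(word: str, approve: bool) -> None:
    """A human ruling teaches the system the new word for next time."""
    (ALLOWED if approve else BLOCKED).add(word.lower())
```

The economics follow from the structure: humans only ever see the "pending" slice, which shrinks as each ruling is fed back into the system.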
On average, our studies have found that automated moderation has a higher accuracy rate (92%) than human moderation (81%), and reduces 90% of the time and cost it takes to moderate a community manually.
Quiz Your Users
The Norwegian tech news website NRKbeta encourages thoughtful comments by asking its readers to prove they read the whole story by taking a quiz. The organization believes the quiz weeds out users who haven’t read the story, while also giving readers time to reflect on how they will comment instead of simply firing off a response to a shocking headline.
Its reporter, Ståle Grut, explains: “When a lot of journalists hit ‘publish’ I think that they see themselves finished with a story. But we see that you’re only halfway through with the article when you’ve published it.” The goal is to improve articles through collaboration.
Many commenters agreed that this tactic would promote insightful comments. Here’s what they had to say:
“It WILL raise the discourse, and it will improve the journalism too. And why should some poor intern have to sit and delete all the trash? Let a computer do it.”
“I would not object to that if it reduced the uninformed and off-topic as well as useless comments”
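NRKbeta's quiz gate boils down to a simple check: the comment form unlocks only when every answer about the article is correct. A hedged sketch, using two questions drawn from figures cited earlier in this piece (the question wording and the `can_open_comments` helper are invented for illustration):

```python
# Hypothetical per-article quiz: question -> expected answer.
QUIZ = {
    "By how much can commenting increase pageviews?": "248%",
    "By how much can commenting increase attention time?": "364%",
}

def can_open_comments(answers: dict) -> bool:
    """Unlock the comment form only when every quiz answer matches."""
    return all(
        answers.get(question, "").strip().lower() == expected.lower()
        for question, expected in QUIZ.items()
    )
```

Beyond filtering out non-readers, the few seconds the quiz takes act as a built-in cooling-off period before the user can type a reply.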
End Anonymous Commenting
When users register for your website through one of their social media accounts via social login, they are less likely to post harassing comments because they can be easily identified.
The social login button also generally increases conversion rates by 20% to 40%, while giving you access to user information that can be used to create targeted messaging.
Increased Engagement = Higher Revenue
If you’re committed to improving the quality of interactions on your website, you may find that relying on human moderators alone is expensive and time-consuming. Luckily, today’s technology can encourage quality comments and reduce the number of personal attacks. And by improving the quality of interactions on your site, you can look forward to increased engagement, improved brand loyalty and greater lifetime value from your users.