Rebuilding Trust: Convincing Skeptical News Readers to Convert

Spotlight: 

  • Trust in media is dwindling, with more than half of news consumers (58%) skeptical of media companies. 
  • Although 50% of people lose interest in content near toxic behavior, you can moderate on-site social spaces to build trust around content.
  • Since news sites are 25% more trusted than social media, news organizations should focus on building relationships with audiences on their own properties.
  • Subject matter experts can participate in the content creation process to boost brand authority over specific topics. 
  • Media companies can integrate first-party data into their content strategies to convince news readers to trust their brands for relevant content and experiences.
  • Community engagement tools, like Viafoura’s Engagement Suite, paired with trust-building strategies can help companies establish authority and increase conversions.

Misinformation spread by global leaders and across social media has severely damaged people’s trust in content online. As a result, a large portion of the public is deeply critical of content created by media organizations. 

Edelman’s 2021 Trust Barometer report reveals that more than half of consumers no longer trust the media. 

In fact, Edelman’s study highlights that 58% of people think that “most news organizations are more concerned with supporting an ideology or political position than with informing the public.”

But trust is an essential ingredient that media companies need if they want to develop long-lasting relationships with audiences. 

To convince people to become registered users, community members and paying subscribers, media companies must rebuild their reputations as trustworthy resources. 

Uncover a few practical ways you can encourage skeptical audiences to trust your company for news content and safe digital experiences below.

Moderate Your Company’s Discussion Spaces

Social spaces help media companies engage and convert people around their content, but without constant moderation these areas can also become overrun by trolls, misinformation and offensive comments. 

And unfortunately, a toxic environment can undermine any trustworthy content or experiences hosted within it.

If you want to protect your audience members and reinforce the safe, trusted nature of your business, comment moderation is a necessity for your on-site discussion spaces, especially since 50% of people who experience toxic behavior online will lose interest in nearby content. 

Implementing an effective comment moderation solution on your company’s website or app will show audiences that they can trust your organization for positive content experiences.

Reduce Your Dependence on Social Media

While most media companies use social media to reach and engage audiences, a significant number of people no longer trust social media. This means any content your organization posts on social media won’t be seen as reliable information by default.

A report from the Reuters Institute outlines how sources on social media “[seem] interchangeable, making it difficult to discern where stories originated online and even [undermining] trust in the information environment more generally.”

It’s no surprise that, on average, people trust information on news sites 25% more than on social media. 

The bottom line is that social media has earned itself a reputation for data leaks, algorithms that reward misinformation and a failure to moderate offensive behavior. And the only way to disassociate yourself from social media’s negative reputation and establish trust with readers is to engage them right on your own properties. 

Consider implementing on-site commenting tools and providing personalized reader experiences on your website or app to build trust and nurture audience relationships.

Feature Subject Matter Experts in Content

There’s nothing that can build credibility around a particular topic quite as well as a qualified subject matter expert (SME) can. So to develop your organization’s authority over a specific subject, you can interview and quote SMEs in your content or even commission them to write a piece for your company.

Scientific American, for instance, enlists the help of practitioners to create about half of its content to help educate people as much as possible. 

“[Scientific] topics, financial advice, legal advice, tax advice, advice pages on topics such as home remodeling… or advice on parenting issues… should also come from ‘expert’ or experienced sources that users can trust,” explains Jonathan Hedger, the marketing director for an app that connects people to freelance SMEs. 

Since SMEs have dedicated their careers and lives to learning about a particular subject, their voice often carries more authority and trust than an average journalist’s.

Create Content and Experiences Based on Unique Reader Interests

What’s your company doing to ensure readers can trust it for content that’s captivating and relevant? 

Give readers the content and experiences they want by using your first-party data to understand their preferences. From there, you can adjust your company’s content and on-site experiences to align with reader interests. 

“If you understand who your audience is… it allows you to [optimize] your offering around them,” says Robbie Kellman Baxter, a consultant and expert on subscriptions. “It allows you to keep [their interest] for a long time, build loyalty and trust, and expand the relationship over time.” 

You can partner with Viafoura to host engaging experiences for readers across your digital properties and collect critical first-party engagement and behavioral data. 

At the end of the day, trust is easy to lose and challenging to regain. 

But with the proper support and trust-building strategies in place, you can convert skeptical news readers into loyal, paying community members.

Civilize & Monetize: How to Highlight Constructive vs. Toxic Comments

Spotlight:

  1. A moderation strategy keeps your digital spaces safe, but should also include how to drive positive conversations
  2. Positive, active communities are lucrative for publishers
    1. 80% of all user registrations are triggered on pages that feature onsite engagement tools and user-generated content.
    2. Registered users spend 225% more time consuming media content per week.*
      1. *Viafoura study data collected Jan 2019 – May 2019, sampled 14 unique media brands, sampled 85M unique non-registered and 2.5M registered users
  3. Help spark the conversation by posing thoughtful questions
  4. Highlight positive community members: not only does it encourage more engagement, it also shows the kind of engagement you’re looking for
  5. Community guidelines must be clear and easily accessible to your community

Online social spaces exist to build value for your community members, generating engagement and loyalty toward your brand. And these tools are most effective when media organizations support them with the right strategies that maximize user activity. 

A moderation strategy, for example, is necessary for businesses to keep their digital social spaces civil. But media companies must go beyond reducing offensive comments and trolls in their social spaces to facilitate ideal behavior from audience members.

“Just as it’s important to have a strategy in place to protect the quality of your on-site social spaces, it’s also crucial to have a strategy to drive positive conversation around your content,” says Leigh Adams, director of moderation solutions at Viafoura. 

In other words, if you want to drive positive behavior from your community members, you’ll need to take action by encouraging users to participate in meaningful, on-topic discussions.

The Value of Activating Productive Discussion

In the digital world, active communities are incredibly lucrative for media companies. 

You can draw on your users’ engagement and behavioral data to personalize their experience and increase your website or app’s appeal to advertisers. Plus, you can assess what people post about to determine the types of content that will resonate the most with your online community. 

Keep in mind that digital conversation tools also connect people, forming long-lasting relationships that are tied to your brand. 

But you can’t assume that people will become active, model community members unless they understand what kind of discussions and behaviors are expected of them. 

Media organizations that take the time to outline and promote what positive behavior looks like for their communities will be well-positioned to grow attention levels, memberships and various revenue streams. 

“The most thriving and profitable online communities are often the ones where positive behavior is encouraged, demonstrated and rewarded by the community host,” Adams explains. “No matter how intuitive and engaging your moderation tools are, you need to develop a battle plan if you want to activate your community effectively.”

Ultimately, media organizations can take a few simple steps to ensure their communities are overflowing with activity:

Pose Questions for Users To Answer

If you want to increase activity from your digital community, you can spark discussions and debates by showcasing thought-provoking questions.

“One of the best ways to keep discussions in your commenting section buzzing and on-topic is to give your readers a prompt through a question that relates to your content,” Adams states. “Not everyone will understand how you want them to behave if you don’t give them some level of guidance to follow.” 

Consider highlighting questions for community members at the bottom of your content piece. Or, you can pin them as posts within your commenting widget.

Interacting with your community members, even by just getting a conversation started, will give users the direction they need to post comments that are on-topic and positive.

Reward and Highlight Model Behavior

Your community members play a significant role in the success of your business. So how are you taking the time to reward your most active users? 

“By rewarding commenters for positive and productive contributions, you’re incentivizing your most loyal supporters to continue participating in your digital community,” says Adams.

There are several ways you can show readers that you appreciate good behavior. 

You can give top contributors badges based on their participation levels, pin their comments to the top of your commenting widget, invite them to help you moderate live events and send them an email to thank them.

Set Up Clear, Accessible Community Guidelines

Your community guidelines essentially act as a blueprint for acceptable and positive behavior on your website or app. 

According to Adams, you should be direct in your community guidelines — tell your audience exactly what’s expected of their behavior on your digital properties. That includes outlining what kind of behavior isn’t appropriate. 

Adams suggests “your guidelines should make it clear that comments that appear legally objectionable or [encourage/condone] a criminal offense or any form of violence or harassment will NOT be tolerated.”

Crafting clear community guidelines posted in an easy-to-access spot on your digital property will pave the way for acceptable behavior and positive conversations.

Media organizations must implement strategies to outline, reward, highlight and facilitate positive behavior in their online communities. By doing so, companies can benefit from a closely connected, active and growing audience that can be monetized continuously.

For more information on how to build positive behavior in your community, view a list of community-building best practices here.

Do You Have What It Takes To Build a Thriving Digital Community?

Every digital audience is flowing with revenue-generating power. Whether that power remains untapped or is harnessed to grow your company depends on how well you can transform your audience into a thriving digital community.

Future plc, for example, grew its online audience by 56% in one year by nurturing its digital communities with worthwhile content and experiences.

But building and sustaining a profitable community online isn’t a simple walk in the park — it requires attention, effort and a carefully crafted engagement strategy.

If you’re interested in securing loyal brand supporters and additional revenue-generation opportunities, we’ve created a checklist below that outlines everything you need to build a thriving digital community.

1. Social Tools for Your Owned and Operated Properties

2. Re-Engagement and Retention Techniques

3. A Process for Understanding Your Audience

4. Personalized User Experiences Based on First-Party Data

5. Comment Moderation

1. Social Tools for Your Owned and Operated Properties

You can’t form an active digital community if people don’t feel connected to your brand. For this reason, media companies must encourage their audience members to forge strong relationships with one another right on their websites or apps.  

Adopting conversation-based engagement tools like commenting widgets, live chat tools, and live blogs will allow your company to establish meaningful social connections between your audience members and brand.

A recent analysis of Viafoura data even revealed that people who interact with social tools online have a 20-40% higher retention rate after six months of visiting a site compared to those who do not.

2. Re-Engagement and Retention Techniques

With an abundance of media companies and services competing for your subscribers’ dollars, preventing churn is a constant struggle.

Take video streamers, for example. 

According to a survey conducted by Deloitte, consumers paid for around five streaming services in 2020. And yet, nearly half of the consumers surveyed canceled at least one of them that same year.

It’s important to have a strategy in place to re-engage your community members when their engagement levels begin to drop. That way, you can keep people away from your competitors by gently nudging their focus back toward your brand. 

Consider working with your engagement tool provider to find out when a user becomes unengaged. Once you identify your inactive subscribers or registrants, you can send out targeted offers and content to re-engage them.
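
As a rough sketch of what that flagging might look like in practice (the data export and field names here are hypothetical, not any particular provider’s API), you could scan your registrants for a window of inactivity and hand the results to your email or CRM tooling:

```python
from datetime import datetime, timedelta

# Hypothetical export of registered users and their last recorded activity;
# real field names will depend on your engagement provider.
users = [
    {"email": "reader1@example.com", "last_active": datetime(2021, 1, 4)},
    {"email": "reader2@example.com", "last_active": datetime(2021, 2, 20)},
]

INACTIVITY_WINDOW = timedelta(days=30)

def find_unengaged(users, now):
    """Return users whose last activity falls outside the inactivity window."""
    return [u for u in users if now - u["last_active"] > INACTIVITY_WINDOW]

for user in find_unengaged(users, now=datetime(2021, 3, 1)):
    # Hand off to your email or CRM tool to send a targeted win-back offer.
    print(f"Re-engage {user['email']}")
```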

3. A Process for Understanding Your Audience

As is true with any connection in the physical world, the relationship between your company and its community members shouldn’t be one-sided. After all, you can’t expect your audience to give you their loyalty, data or money without getting something valuable in return. 

And how could you possibly know what your audience wants if you don’t collect their first-party data, monitor their comments and speak with them to discover their interests?

To fully understand your community members and meet their needs, you’ll also need to turn anonymous visitors into known, registered visitors.

In fact, Piano, a subscription service provider, reports that, on average, registered users are 10x more likely to convert than anonymous visitors. 

Organizations that use a proper identity management system will have a clear, 360-degree view of audience members and their interests.

4. Personalized User Experiences Based on First-Party Data

Once you have a steady stream of first-party audience data coming in, you can draw actionable insights to personalize the on-site experience for your community members.

“This way, audience fragments become super-served niches and loyal viewers become VIP members — who will stick around and pay off in the long term,” explains Rande Price, research director at the Digital Content Next trade organization. 

It’s also worth noting that people are hungry for personalized experiences. 

According to a research expert on Statista, 90% of U.S. consumers perceive content personalization in marketing to be appealing.

Producing customized experiences around your audience’s behavior will, therefore, keep them coming back to your digital properties for relevant content time and time again.
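
To make the idea concrete, here is a minimal sketch, with entirely hypothetical data and topic names, of how first-party engagement events could drive a simple form of content personalization:

```python
from collections import Counter

# Hypothetical first-party engagement events: (user_id, content_topic).
events = [
    ("u1", "local-politics"), ("u1", "local-politics"), ("u1", "sports"),
    ("u2", "business"), ("u2", "business"), ("u2", "local-politics"),
]

articles_by_topic = {
    "local-politics": ["City council weighs new budget"],
    "business": ["Main Street shops reopen"],
}

def top_topic(user_id):
    """Infer a reader's dominant interest from their engagement history."""
    topics = Counter(topic for uid, topic in events if uid == user_id)
    return topics.most_common(1)[0][0]

def recommend(user_id):
    """Surface articles matching the reader's dominant interest."""
    return articles_by_topic.get(top_topic(user_id), [])

print(recommend("u1"))  # ['City council weighs new budget']
```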

5. Comment Moderation

Your social tools are critical for forming strong connections between your community members; however, not every internet user will leave positive and productive comments. And unfortunately, toxic comments can damage your digital community.

The Pew Research Center states that one in every ten people will abandon an online service if they see nearby offensive behavior.

Safeguard your brand’s integrity and keep your social spaces inviting by enforcing your community guidelines through an effective moderation system.

We would recommend selecting a moderation system that can immediately detect all 6.5 million variations of each word. It should also be able to evolve alongside your community and understand sentence context for maximum protection.

Whether your end goal is to achieve sustainable revenue growth or simply to serve your audience members better, your success depends on the state of your digital community. The more engaged and connected your community members are, the more valuable they’ll find your membership program or subscription package to be.

The truth is that anyone can create a thriving digital community. All it takes is connecting your business to the right engagement, data-collection and personalization strategies.

The Greatest Challenges in Media From 2020, Unpacked

For many, 2020 was a low point — especially with the pandemic, political turmoil and social injustices raging across the globe. These recent events have also sparked a set of ongoing business challenges within the media industry.

Thankfully, organizations are determined to stand strong and be a trusted resource for community members no matter what’s thrown their way. 

“After a year where everything was confusing, and the goalposts were always moving, the best we can do as [media] organizations is to be useful and supportive to our communities,” states Mandy Jenkins, general manager of The Compass Experiment at McClatchy.

So to help media companies become more resilient and build better relationships with their audiences and staff, we unpacked crucial takeaways from some of the greatest industry challenges of 2020. Organizations that keep these takeaways in mind will set themselves up for long-term growth and success.

Limitations on In-Person Experiences Reinforce the Need for Digital Social Experiences

The pandemic forced a large number of print media products, including newspapers and magazines, as well as in-person events to shut down practically overnight. 

While media companies are now unable to build relationships with audiences in person due to safety restrictions, brand relationships are thriving virtually. 

“This year, the sense of isolation caused by lockdown has pushed a lot of people toward online communities to fill the void left by the lack of social interactions,” states Francesco Zaffarano, the editor-in-chief of Will Media. “Although that isolation will eventually end, engaging with communities will still be the key to success in the post-pandemic world.”

By delighting audiences with online social experiences, media companies can encourage connections to form around their brands. These brand relationships will then lead to greater reader loyalty and digital revenue.

The Explosion of Misinformation Calls for Moderation

Misinformation and fake news have been circulating online for as long as the internet has existed. However, the monumental events from 2020 have amplified the reach and impact of misinformation, destroying trust and endangering safety.

A recent study of 200 million pandemic-specific social media posts even revealed that 40% of them were unreliable. 

In a world where people no longer know what information to trust, providing reliable news and building close relationships with audiences must be a priority.

That’s why it has become vital for media organizations to invest in making their owned and operated properties safe, trusted spaces for news and related conversation. 

Consider producing a trusted environment by tightening moderation on your website’s (or app’s) social spaces to prevent offensive behavior and misinformation. 

As Anna Nirmala, VP of the American Journalism Project, stresses, “having a relevant and trusted brand is linked to building relationships and engaging with the community.”

The Loss of Third-Party Cookies Means a Shift to First-Party Data

Despite shrinking company budgets and the global pandemic, 2020 threw another curveball at media companies: the end of third-party cookies.

Most media leaders quickly realized that they would have to reconsider their audience data-collection strategies to survive beyond 2022, when Chrome phases out third-party cookies entirely.

Little by little, organizations are shifting focus from third-party to first-party data strategies to future-proof their businesses. After all, first-party data offers insight into what audiences find interesting and how companies can better meet their needs. 

“You have to triple down on data – not in the crude sense of chasing page views, but in the sense of infusing the [organization] with a visceral sense of who audiences are, why you matter and how you can matter more,” explains Lucy Kueng, a professor and senior research fellow at the Reuters Institute.

In other words, first-party audience data is essential for creating highly relevant content and experiences to boost the appeal of your company’s services.

Changing Work Environments Put Greater Emphasis on Improving Mental Health in the Newsroom

In recent years, the state of employee mental health at media companies has been under scrutiny. 2020 then unleashed a mass migration to remote workspaces along with an alarming number of job cuts across the media industry, adding new pressure on newsroom workers. 

“We are at an interesting point where newsroom cultures are changing very quickly and the pandemic has accelerated that,” states Reuters Global Managing Editor Simon Robinson. “The challenge now is to keep that momentum going as new remote workers are joining news [organizations] and relationship-building is getting harder.”

Business leaders must now take steps to improve mental health in the newsroom to maintain a positive, productive work environment with satisfied employees.

All throughout 2020, media companies were forced to tackle one challenge after another. Fortunately, we’ve finally entered the beginning of a healing period, where businesses can learn from the past to become more resilient and profitable moving forward.

What the End of Parler Means for Media Companies

Ready your content moderators: 2.3 million active users, many of whom are eager to encourage violence, racism, antisemitism, antifeminism and conspiracy theories, have lost their home base on Parler. 

As a social platform that encourages free speech with practically no moderation or fact-checking, Parler has gained a massive user base of people with radical views.

At least, that was until Apple and Google booted Parler from their app stores in response to how it was used to organize the January 6th attack on the U.S. Capitol. Even Amazon Web Services (AWS), which hosted Parler, has abandoned the company, pushing the platform mostly offline. 

“We’ve seen a steady increase in this violent content on [Parler’s] website, all of which violates our terms,” reads a letter that AWS sent to Parler’s chief policy officer. “It’s clear that Parler does not have an effective process to comply with the AWS terms of service.”

Though a bare-bones version of Parler has recently popped up on a Russian-hosted site, the platform will likely continue to be banned from mobile app stores, which account for most of its users.

With Parler practically scrubbed from the internet, its extreme users will be searching for other media platforms they can use to amplify their radical perspectives. Digital media companies and online publishers will need to prepare for a possible frenzy of visitors with loud, destructive voices, who believe content moderation is a threat to free speech.

Leaving your digital properties vulnerable to these toxic commenters can scare away your loyal community members and damage positive conversations. 

Instead, here’s what you can do to prevent ex-Parler users, or any other radical and offensive voices, from wreaking havoc on your digital social spaces:

Make Sure You Have Clear, Easy-to-Access Community Guidelines

Your team may have an intuitive sense of what is or isn’t allowed in comment content. But creating a clear, unassailable description in your community guidelines can help prevent initial violations and give your moderators a reference point that clearly defines unacceptable content.

Examples of content to explicitly define as unacceptable include:

  • Personal attacks
  • Vulgar or obscene content
  • Libelous or defamatory statements
  • Anything that can be described as threatening, abusive, pornographic, profane, indecent or otherwise objectionable

Be sure to post your guidelines in a visible area of your website so that your digital visitors can access them with ease.

On-Site Engagement Actions

Not all registered users offer the same amount of value to media organizations. 

“Some users register to a website in order to use social tools, and others may register just to access content,” Liang explains. “Those who register to participate in a conversation — whether that be through comments, likes, replies or dislikes — contribute to a media company’s community with meaningful interactions.”

Have an Escalation Plan

In the case of an emergency — like the threat of an active shooter at your headquarters — your team must have a clear procedure in place. There are a few crucial questions you can ask your team to help them prepare for these types of threats: 

  • Is there a clear chain of command in an emergency? 
  • When do you alert the police versus the organization you’re protecting?

Distinguish between different types of non-urgent, semi-urgent, general and specific threats and outline how moderators should react to each of them.

Update Your Banned Word List/Moderation Algorithm

Did you know that users within a community can develop new phrases to spread offensive and dangerous messages?

This was the case for one publisher when Viafoura’s moderators noticed that trolls were posting a recurring phrase in community social spaces: “BOB crime.” Our moderators quickly realized that this phrase was being used in offensive contexts, and after investigating, found out that it stood for “Black-on-Black crime,” which challenges the Black Lives Matter movement.

The moderation algorithm was quickly adjusted to prevent relevant comments from being posted within that publisher’s community. However, this is just a single example of many where new phrases are created within a community to maneuver around basic moderation systems.

The bottom line is that language evolves. 

To reinforce community standards successfully, it’s essential that moderation algorithms and banned word lists are updated quickly as new, offensive language is discovered. 
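
As an illustration of the general technique (a simple keyword filter, not Viafoura’s actual algorithm), adding a newly discovered phrase to a banned-phrase list might look like this:

```python
import re

# Illustrative banned-phrase list; production systems maintain far larger,
# continuously updated lists per community.
banned_phrases = {"bob crime"}

def add_banned_phrase(phrase):
    """Register a newly discovered offensive phrase (case-insensitive)."""
    banned_phrases.add(phrase.lower())

def violates_ban_list(comment):
    """Match each banned phrase against the comment as whole words."""
    text = comment.lower()
    return any(re.search(rf"\b{re.escape(p)}\b", text) for p in banned_phrases)

add_banned_phrase("new coded phrase")  # a moderator spots new slang
print(violates_ban_list("Enough with the BOB crime talk"))  # True
```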

Be Prepared to Block IP Addresses

In the digital world, the general belief is that the more eyeballs a piece of content can get, the better. The end goal for media executives is typically to gain and engage more site visitors to maximize subscriptions. However, visitor quantity isn’t always better than quality.

“Don’t be afraid to ban users,” says Leigh Adams, director of moderation solutions at Viafoura. “A lot of newspapers are afraid to ban users because they want the audience, but when you allow trolls and other toxic users to take over, you’re actually scaring away more valuable visitors.”

A handful of quality commenters offers more value to brands than a crowd of commenters who destroy the safety and trust between an organization and its loyal followers.
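
If you do decide to block repeat offenders at the network level, a minimal sketch of an IP blocklist check might look like the following (the addresses and function names are illustrative; note that shared and dynamic IPs make this a blunt instrument, best combined with account-level bans):

```python
# Illustrative blocklist; in production this would live in a datastore
# shared by your commenting infrastructure.
BLOCKED_IPS = {"203.0.113.42"}

def is_blocked(ip_address):
    """Reject traffic from addresses tied to repeat offenders."""
    return ip_address in BLOCKED_IPS

def handle_comment(ip_address, comment):
    if is_blocked(ip_address):
        return "rejected: commenting disabled for this address"
    return "accepted: queued for moderation review"

print(handle_comment("203.0.113.42", "toxic spam"))  # rejected
```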

Ultimately, you are in control of your online community.

Just remind users in your community guidelines that you reserve the right to remove or edit comments and permanently block any user in violation of your terms and conditions. This umbrella statement gives you broad control over the content your community produces and helps keep discourse positive and productive.

At the moment, we are living in a time of unpredictable change and misinformation. Whether or not any of Parler’s users make their way onto your website or app, it’s important to be prepared to handle and discourage any toxic behavior. Maintaining positive and productive social spaces will help to strengthen engagement around your brand while protecting its reputation. 

Need help identifying and stopping trolls? Check out our guide written by our head of moderation services on troll hunting.

Panel Discussion Breakdown: The New Rules of Moderation

2020 was undoubtedly the intersection of major health, political and social justice-related events. These life-changing crises activated wave after wave of misinformation and trolls, which, as many media professionals found, can damage the quality of human conversation within digital communities. 

Recently, executives from Editor and Publisher, USA Today, Graham Media Group and Viafoura gathered together to address these concerns in a recent panel discussion on the new rules of moderation. 

“Moderation’s been a big topic this year,” says Mike Blinder, the publisher of Editor and Publisher Magazine. “People are spouting off at their dinner tables, they’re spouting off at their mobile phones and they’re obviously spouting off on [media] platforms but we need to go beyond this.”

Not only do media companies need to eliminate trolls and build trust with community members, but they also need to expand their loyal audiences and maximize revenue. 

Gear up with insights and best practices on content moderation from the panel discussion to keep your media organization’s social spaces brand-safe, productive and profitable.

Setting the Stage for Ideal, Productive Discussion

In a perfect world, digital social spaces would be filled with an endless stream of engaging comments coming from multiple voices. 

But not every comment is valuable, and not every commenting thread will thrive. At least not without the proper support.

It’s up to moderators to ensure that only positive comments are being surfaced. Meanwhile, media staff have the power to amplify engagement from community members. With an effective moderation system and community engagement strategy in place, media companies can begin building communities by facilitating on-topic, positive conversations. 

“Once you take care of moderation well, once you engage, once you create a space where people want to come and talk about whatever they want to talk about, that’s where you’re doing your community-building,” explains Viafoura Director of Moderation Leigh Adams.

Community-Building Best Practices

Many media industry professionals still view commenting sections as spaces for toxic behavior and misinformation to take root. However, commenting spaces can be extremely useful community-building tools when managed properly.

“Just because you have a comment thread doesn’t mean you just have to hand it over to your audience and let them do whatever they want on it,” states Dustin Block, audience development lead at Graham Media. “You get to make decisions of what you’re going to allow people to share, particularly around your stories.”

Review some essential best practices for using social tools from the panel discussion to begin refining your community-building strategy:

  • Don’t moderate your platform’s comments yourself; offload that work to free up your staff to focus on creating valuable interactions with visitors.
  • Tighten community guidelines to help audience members focus on producing brand-safe and on-topic conversations.
  • Leverage subject matter experts, including content producers, to answer questions and encourage positive discussion from audience members.
  • Anonymize names of commenters to prevent women, minority groups and people with unique names from getting harassed. 
  • Invite readers to participate in the content production process so they feel heard and valued.
  • Invest in building audiences on your owned and operated properties instead of social media, where you have little control over data, audience relationships and revenue. 
  • Correct misinformation on your digital properties whenever possible to position your brand as a trustworthy resource. 
  • Embed comment sections around content that is likely to lead to productive social exchanges. 
  • Elevate model behavior in the community by highlighting positive comments, rewarding top commenters with badges, and asking specific questions you’d like community members to answer.
  • Encourage participation in conversations by adding additional, exclusive story details in comment sections.

The Value of Moderated Comments

Comment spaces backed by an efficient moderation system can unleash multiple benefits for media companies.

“For every time someone posts a comment, you might have 50 people reading it… that’s where the value is,” Adams highlights. 

Civil discussions can entice visitors to stay on pages longer as they read the comments, which increases the likelihood that they’ll register and interact on your website or app, including by watching or clicking on advertisements. 

The bottom line is that media companies can build stronger relationships with their visitors through moderated commenting tools, resulting in more behavioral data and increased revenue.  

According to Michelle Maltais, the managing editor of consumer news at USA Today, “if we make it a worthwhile experience (and we can; we have to put staffing toward it and we have to put attention toward it), then there’s value.”

For more information, you can access the complete recording of the discussion panel here.

Moderation in a Time of Unprecedented Change Panel Discussion

On August 20th, Mike Blinder (Owner & Publisher, Editor and Publisher Magazine) was joined in conversation by Leigh Adams (Director of Moderation Services, Viafoura) and Dustin Block (Audience Development Lead, Graham Media) to discuss the importance of moderation in an increasingly online world.

A Changing Landscape

Our three experts exchanged stories about how the move into a more digital world, where millions of people are working from home and looking online for news, has elevated the importance of community engagement and moderation.

“[We’ve] moved into a digital world. Covid brought millions of people home, either through unemployment or job loss or through remote work… people are online more than ever before… forcing these important conversations to happen in this digital space.” – Leigh Adams, Director of Moderation Services, Viafoura

With hot-button issues such as the coronavirus, social injustices and political tension all dominating the zeitgeist, the conditions are a perfect storm for trolls to spread misinformation and cause unrest.

“It makes a perfect storm for trolls because this is exactly the kind of inflammatory conversations they want to be involved in. They want to prey on the misinformed, they want to prey on people’s insecurities – make them more afraid. And media brands simply can’t afford to walk away from the comments anymore.” – Leigh Adams

Dealing with these digital trolls has become an essential part of running any online publication. Media brands can no longer afford to ignore these issues, since advertisers will often walk away rather than be associated with the content found in a comments section. It’s also been shown that a more welcoming, engaging comment space is more likely to grow your loyal audience. These loyal users are looking for a place to engage with others and are far more likely to return, watch videos or click on advertisements.

“From an audience development perspective… that loyal audience, these registered users, they vastly outperform our anonymous audience. I spend a lot of time creating features for this registered audience. They watch video more often, they click ads more often, they come back to the site more often… and one of the main things this group does is comment” – Dustin Block, Audience Development Lead, Graham Media

Making Your Digital Properties Brand Safe

The panel agreed that artificial intelligence moderation can cut down a lot of spam and obviously toxic comments; however, it should be supplemented with some human intervention. People are clever and can find ways around scripted filters.

“It’s a marvel how clever people can be about getting some sort of offensive or off-topic remark to be included” – Dustin Block, Graham Media

Instead, the best practice is to leverage some form of automation to cut down the volume of comments, in conjunction with a moderation team that follows externalized guidelines, removing the risk of bias while creating a safe space where anyone would feel comfortable engaging with others.
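
A minimal sketch of that hybrid approach, assuming a stand-in toxicity scorer rather than any real classifier, might route comments like this: auto-handle the clear cases and queue the ambiguous middle for human moderators.

```python
from queue import Queue

# Thresholds are illustrative; tune them against your own community data.
AUTO_REMOVE_SCORE = 0.9
AUTO_APPROVE_SCORE = 0.2

human_review_queue = Queue()

def score_toxicity(comment):
    """Stand-in for whatever automated classifier you use."""
    trigger_words = {"spam", "scam"}
    hits = sum(word in comment.lower() for word in trigger_words)
    return min(1.0, hits * 0.5)

def triage(comment):
    """Auto-handle the clear cases; send the ambiguous middle to humans."""
    score = score_toxicity(comment)
    if score >= AUTO_REMOVE_SCORE:
        return "removed automatically"
    if score <= AUTO_APPROVE_SCORE:
        return "approved automatically"
    human_review_queue.put(comment)
    return "sent to human moderators"

print(triage("Total spam scam, click here"))  # removed automatically
print(triage("Great reporting, thank you"))   # approved automatically
print(triage("This sounds like spam to me"))  # sent to human moderators
```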

“If you’ve externalized your brand guidelines, you’ve said this is what we want our comments to be, these are the type of comments that we do want, these are the type of comments we do not want. And you’ve codified them and shared them in a way that’s outside the journalist’s hands, I think that’s going to make for much more effective moderation.” – Leigh Adams, Viafoura

Media companies also shouldn’t fear disciplining their users. For a long time, there’s been a fear that removing comments or banning users would hurt audience numbers. The truth, however, is that someone who leaves hundreds of disruptive, antagonistic comments isn’t going to be a valuable member of your community and is more likely to turn others away from engaging. There’s no right to free speech within the private sphere, and that includes web forums. Those who aren’t following your brand’s community guidelines should be swiftly dealt with.

That being said, it’s important not to penalize the misinformed. Let natural conversations happen: a healthy discussion or debate in the comments section can help expose people to new information. The healthiest online communities are the ones where loyal users respond to potential trolls and challenge their claims with facts and open dialogue.

“I work with our Trust Index, which is a team that works on misinformation, responding with links to articles to galvanize the community to counter bad information.” – Dustin Block, Graham Media

One of the challenges of human moderation that the panel discussed was constantly learning new language. It’s important to know what people are talking about, the common terms being used, and what each acronym means.

“We’re applying [these learnings] across all of our clients and we’re making sure that’s [being shared out], not just to the brand we’re noticing it on but to all of our teams.” – Leigh Adams, Viafoura

Moderators should be trained not only on language but also on elevating the positive rather than just removing the negative. Viafoura’s “Editor’s Picks” tool of highlighting particularly excellent comments was mentioned as an example of how to reward positive contributions from your community. It’s also important that moderators and journalists who are responding to comments know the boundaries of their community guidelines and can keep an informative tone and avoid sarcasm or dismissive language while responding.

Making Journalism Better With User Generated Content

Part of the public service job of journalism is to educate people, and community moderation fits into that mission as well. While it can at times feel exhausting to read hate speech in the comments section, it was pointed out that most moderators realize they’re making the digital world better for others with this work. If a moderator has to read a hateful comment so that hundreds of others don’t, it helps protect marginalized people from further harm and lets them find comfort in a safer online space.

Lastly, the panel was eager to point out how good moderation can grow your community and ultimately help journalists find better stories. One excellent example was of a commenter who pointed out that the cars involved in a road accident were actually $250,000 luxury vehicles, which put a whole new spin on the story.

“I love comments for many reasons. One: it’s just a place to talk to your audience. And they’ll share so much through those comments… you can make your reporting better, and you can better inform your audience just by engaging.” – Dustin Block, Graham Media

In general, there’s a higher quality of discussion to be found on your own website over social platforms like Facebook and Instagram. Companies should look to stop relying on these larger platforms for engagement and instead look to bring users into their own community and give them a positive experience where they’re more likely to subscribe, click ads, and interact with others.

Engagement is the future of journalism. Gone are the days of reporters simply talking at their audience, and now an open dialogue between the organization and its community is the expected norm. In this unprecedented shift into an increasingly online landscape, quality moderation is essential for those looking to grow their communities and become a landing spot for those looking to discuss, analyze, and share online.

Attention Publishers: There’s More to Moderation Than Toxicity

Toxic content like spam, misinformation and posts from trolls can be damaging to media companies for a host of reasons. As a result, publishers are gradually beginning to recognize the importance of moderating their digital properties. 

Google even has an API that’s now being widely used by moderation providers to assess toxic content. 

But here’s the problem: just assessing a platform for toxicity isn’t enough. Not when over 40% of people claim they’ve directly experienced online harassment.

Detecting incivility in your digital community is undoubtedly a necessary step in the right direction. However, every publisher that hopes to have a civil and profitable online community requires a moderation system that can also accomplish the following tasks:

Reinforce Community Guidelines

Truly effective moderation systems should be trained to support a media company’s community guidelines. There are different kinds of communities, after all. While some are designed to spark heated debates, like in sports or gaming, others are geared toward setting a peaceful environment. 

Be sure to check whether your company’s automatic moderation platform can mold itself around the nuances of your community, because what works for one media company may not work for yours. 

“Rather than moderating the notion of toxicity, you can moderate around the guidelines that the publisher has actually set for their communities,” says Dan Seaman, VP of product at Viafoura. 

At Viafoura, our moderation experts take an existing algorithm that best represents a publisher’s audience and then adapt it to fit their community standards.

Detect All Offensive Words, No Matter Their Form

Trolls are intelligent and will do everything in their power to outsmart a moderation system. While some may write offensive words with spaces between the letters, others may disguise their words with numbers or symbols. 

Like Google’s API, most moderation systems focus on finding common patterns in toxic posts rather than the variations of jumbled words. Google even advises against using its API for automated moderation.

“The problem with using a basic toxicity rating is that it isn’t going to detect specific terminology,” Seaman explains. “If you can obfuscate words efficiently, you can get around toxic ratings.”

And each word can be obfuscated 6.5 million times. So no matter what automatic moderation system you use, make sure it’s capable of understanding the 6.5 million variations of each word. 
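
As a rough illustration of how a system can see through such disguises (a generic normalization trick, not Viafoura’s actual method), obfuscated characters and separators can be collapsed back to root words before matching:

```python
import re

# Common substitutions trolls use to disguise letters; illustrative only.
CHAR_MAP = str.maketrans({"0": "o", "1": "i", "3": "e", "4": "a",
                          "5": "s", "7": "t", "@": "a", "$": "s"})

def normalize(text):
    """Collapse symbol substitutions and separators back to root letters."""
    text = text.lower().translate(CHAR_MAP)
    # Strip spaces, dots, dashes and other separators slipped between letters.
    return re.sub(r"[\s.\-_*]+", "", text)

print(normalize("t 0 x-1 c"))  # -> "toxic"
```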

Publishers with proper moderation systems in place experience thriving communities, resulting in 62% more likes and 35% more comments from users. 

At the end of the day, analyzing root words in user comments can make the difference between a successful and unsuccessful moderation system.

Manage Evolving Language in a Community

Using a general toxicity rating or detection system isn’t effective enough to enforce civil conversation within each unique community. Especially not when the trolls within a community begin developing new ways to spread offensive messages.

This was the case for one publisher when Viafoura’s moderators noticed that trolls were posting a recurring phrase in community social spaces: “BOB crime.” Our moderators quickly realized that this phrase was being used in offensive contexts, and after investigating, found out that it stood for “Black-on-Black crime,” which challenges the Black Lives Matter movement.

The moderation algorithm was quickly adjusted to prevent relevant comments from being posted within that publisher’s community. However, this is just a single example of many where new phrases are created within a community to maneuver around basic moderation systems.

The bottom line is that language evolves. 

Companies can reinforce their community guidelines by ensuring their moderation strategies can detect toxicity as language evolves. To reinforce community standards successfully, it’s also essential that algorithms are updated quickly as new, offensive language is discovered.

Unfortunately, not all moderation companies can provide this service successfully. This is because they focus mainly on disabling toxic patterns or character sets, not on context or changing language.

To support a publisher’s online environment, moderation must go beyond addressing toxicity.

Although assessing incivility is an essential part of moderation, the nuances of each community and word must be addressed, and guidelines need to be enforced. The overall health and engagement of your digital community depend on it.

All the Reasons Why Trolls Need to Be Moderated

This article was originally posted on INMA.

Not all internet users have good intentions. Through misinformation and harassment, there are countless trolls online that cause the quality of conversations and content to decay.

The destructive power of trolls has only been intensified by the ongoing pandemic.

“We are swimming in a cesspool of misinformation,” states Jevin West, a professor at the University of Washington’s Information School. “The pandemic likely makes it worse because increased levels of uncertainty creates the kinds of environments that trolls take advantage of.” 

Even on your company’s digital properties, far away from social media, trolls are completely toxic to the environment. 

But you shouldn’t have to choose between letting trolls run rampant on your company’s properties and getting rid of social tools altogether. After all, providing social experiences online for your audiences is essential to building a loyal community.

If you want to protect your company’s content and audience against trolls, moderation is your best line of defense. Simply ignoring trolls isn’t enough and leaves your properties exposed to the following risks:

Harming Your Brand’s Reputation

Trying to position yourself as a trustworthy space for news or a prime destination for entertainment? 

No matter how interesting and reliable your content is, trolls have the power to ruin the way consumers see your brand. 

In fact, Pew Research Center reports that “one-in-ten (13%) say they have stopped using an online service after witnessing other users engage in harassing behaviors.”

This means that allowing trolls to exist on your digital properties causes consumers to perceive your content as toxic and avoid your properties altogether. 

Moderation can give your company the power to align comments with your community guidelines, which will help solidify your company’s position as a brand worth engaging with and supporting.

Ruining On-Site Engagement Tools and Events

When left unchecked, trolls can effortlessly turn your engagement tools and digital events into dangerous territory for visitors. 

According to Leigh Adams, product manager of moderation services at Viafoura, “trolls quickly become the vocal minority, and can quickly overtake and drown out more relevant conversation.”

And this comes with some detrimental consequences. More specifically, it can lower the effectiveness of any on-site audience engagement features your company has put in place. Irrelevant conversations block your engagement tools from their primary purpose: to build meaningful audience connections around content that matters. 

While trolls manipulate and harass others for a multitude of reasons, they all have one thing in common: they demand attention. As a result, unless trolls are effectively dealt with, they can sabotage the overall consumer experience on your digital properties.

Setting a Bad Example for New Users

Allowing trolls to do whatever they want can attract a significant amount of hostile and offensive behavior from other users. 

Failing to deal with toxic behavior shows users that they can be as aggressive and obnoxious as they want without facing any ramifications. So without putting a trustworthy moderation solution in place, you’re practically inviting community members and visitors to showcase their bad behavior. 

“What may be grotesquely cathartic at the individual level simultaneously blooms into a toxic form of expression that ultimately erodes collective good will,” explains Kent Bausman, a sociology professor at Maryville University. 

In other words, consumers can’t be perfectly separated into trolls and good community members — it’s easy for anyone to become a troll when hidden behind a screen. 

Bausman also suggests that trolling can often behave like a contagion, infecting other people if you don’t immediately isolate it.

Breaking Apart Your Existing Engaged Audience

Over one-quarter of Americans avoid contributing to conversations online after seeing toxicity in digital social spaces. 

Letting even a single troll run wild will discourage your active audience, as well as new visitors, from engaging with others around your content. 

This poses a major problem: the less your audience engages on your properties, the less they’ll rely on your site or app for day-to-day interactions and human connection.

Keep your audience loyal, comfortable and profitable by ensuring trolls aren’t wreaking havoc on your community. 

You don’t need to choose between interactive tools and a safe environment for your consumers — you simply need to hunt down trolls that threaten your social spaces. And it’s never too late to get started.
