New Study Finds Publisher-Posted First Comment Steers Engaging Conversation

Audience engagement managers know that a healthy comment section is a core component of building a thriving online community. With contentious political debates happening more frequently and more people turning to the internet to vent frustration, maintaining a civil yet lively comment section has become more important than ever.

Thankfully, one simple technique can help publishers and moderators guide conversations with increased peace of mind. According to a new study from Viafoura, media outlets see a significant increase in traffic, civility and engagement when publishers post the first comment.

Posting the first comment has long been a technique used by social media managers and influencers in order to control the tone of the conversation and create engagement. The first comment operates as an ‘ice-breaker’ and helps set the standard for the conversation. It also invites responses from readers, which can increase time spent on site and enhance brand loyalty.


What happens when publishers post first? 

The study took place between September 28th and December 15th, 2021, in 15 newsrooms across Canada covering a diverse selection of geographical regions and various ends of the political spectrum. Each newsroom posted the first comment on a number of articles across their publications within the first hour of the articles going live. This test group was then compared to a baseline group in order to provide context for the results.

One element of the study that differed from platform to platform was the type of comment posted beneath the article. Tactics varied from newsroom to newsroom. Some editors offered users assistance, responding to particular points in the article or answering questions. Others asked specific, directed questions about readers’ responses to the article. Writers had the freedom to bring their voice and creativity to the comment section!

Over the course of the study, Viafoura collected data on volume of comments, time spent commenting and conversion before and after engagement – three crucial metrics used to evaluate the success of an online community space.


Posting first means increased engagement, conversion & peace of mind

The study found that activity increased dramatically across all metrics, with a 45% increase in time spent in comments, a 380% increase in total average comments and a 347% increase in average likes. This dramatic increase indicates that controlling the first comment sets a tone and leads to more civil discourse.

With a standard of behavior in place, moderation needs decreased. While the baseline group was required to flag 6.8% of posts and disable 9.1% of users, the test group saw significantly reduced numbers, with 4.8% flagged and 7.4% disabled. Simply by modeling good behavior, publishers reduced the number of community guideline violations and user bans.

Perhaps the most surprising result was the significant increase in registrations. 

Previous research has shown that almost 50% of members end up removing themselves from a platform when exposed to trolling. In contrast, when publishers posted the first comment, each article saw a 55% increase in registrations, with a 9% increase in users who attempted to interact with commenting before signing up.

The positive impact didn’t stop at conversion. Viafoura also saw a 21% increase in users who interacted with commenting after signup. This indicates that users participated in the discourse, felt positive enough to register, and then continued to feel engaged and loyal past the point of conversion. 

This data supports the notion that posting the first comment can be a significant step in reaching target conversion goals. When publishers interact directly with their community, they help maintain a sense of safety and attention that can lead to a direct increase in engagement.
 

What can we conclude?

Strong customer loyalty is essential for the wellness and longevity of any online brand. Simple moderation tricks can be the difference between a contentious online conversation and a thriving online community.

By setting the tone for conversation, publishers can direct content, invite polite discourse and even tailor their engagement to suit the needs and interests of their target audience. In each case, first comments have proven to be an essential step in the process of protecting and growing digital communities. 

Why Comment Moderation Vendors Need to go Above and Beyond to Protect Their Partners

Media companies, like all brands, are looking to build recognition and trust by publishing user-generated content. However, publishing this content isn’t risk-free: organizations need to ensure that users aren’t publishing offensive or threatening posts on their websites or apps. This is where content moderation comes into the picture.

In today’s environment, organizations are doing everything possible to ensure that civility exists on their digital properties while, at the same time, promoting free speech and opinionated conversations. Many of them have implemented moderation solutions that use live moderators or run automated algorithms to solve this challenge.

The general population has also become more aware of moderation, especially what it does and why it’s being used. So what happens when your moderation partner becomes more than just another ordinary technology vendor?

Recently, a Viafoura customer, one of the largest publishers in the UK, discovered why the Viafoura moderation team is so much more than an ordinary vendor.

The publisher uses both the AI (artificial intelligence) and Live Moderation solutions from Viafoura. Because the AI solution learns and enforces the community guidelines set forth by the publisher, 85% to 90% of all comments are easily moderated by the AI engine. The remaining “questionable” comments are sent to a live moderator for a judgment call.
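To make that split concrete, here is a minimal sketch of how a confidence-threshold router might divide comments between automatic decisions and a human review queue. The thresholds and the score_comment function are illustrative assumptions, not Viafoura’s actual engine.

    # Hypothetical sketch of AI-first moderation with human escalation.
    # score_comment stands in for any model that returns a toxicity
    # probability between 0.0 (clearly fine) and 1.0 (clearly abusive).

    APPROVE_BELOW = 0.2   # confident the comment is fine
    BLOCK_ABOVE = 0.9     # confident the comment violates guidelines

    def route_comment(comment: str, score_comment) -> str:
        """Return 'approve', 'block', or 'escalate' for a single comment."""
        score = score_comment(comment)
        if score < APPROVE_BELOW:
            return "approve"
        if score > BLOCK_ABOVE:
            return "block"
        # The ambiguous middle band is the "questionable" 10-15% of
        # comments that go to a live moderator for a judgment call.
        return "escalate"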

Earlier this year, one of those comments was sent to a live moderator at Viafoura. A user had made a threat against a nursery in the commenting section; the comment was, of course, flagged and sent to the moderation queue.

Instead of just blocking the comment and banning the user, a member of Viafoura’s moderation team contacted the publisher’s team to explain the situation. The publisher’s employees immediately raised the situation with local law enforcement.

Within half an hour of the comment being posted, the police had taken action.

Thanks to the quick thinking of Viafoura and the publisher’s employees, who went the extra mile, a potentially terrible situation was entirely avoided.

“Moderation is much more than a judgment call of ensuring user-generated content upholds platform-specific guidelines and rules to establish the suitability of the content for publishing,” says Leigh Adams, director of moderation services at Viafoura. “Yes, we are all about maintaining our customers’ standards, but it’s also about recognizing when a comment has to be escalated. We pride ourselves on having staff that know what to do when and go the extra mile to reach out to our customers because we have the relationship to do that easily.”

Overcoming News Avoidance And Winning Back Your Audience

If nothing else, one positive element that emerged from the pandemic is a renewed focus on mental health and wellness. From one week to the next, people worldwide became shut-ins whether they wanted to or not. They were forced to sit at home and, after burning through all that Netflix had to offer, think. Think, reflect, and become aware of their mental health in ways that had perhaps been easier to avoid in the before times. 

With this time for reflection, it’s no wonder people began to notice the correlation between their moods and mental health and the non-stop emotional rollercoaster of the news cycle throughout the pandemic. In one sitting, viewers would be subjected to an inspiring video of Italians singing from their balconies in quarantine, followed by horrifying stories of people trapped in their homes with deceased loved ones – all while a chyron at the bottom of the screen provided an ever-updating death counter.

While the news cycle is not known for being a constant source of uplifting content, the pandemic brought to light the impact that bad news has on our mental well-being. It’s no wonder that new audience behaviors emerged: ones that, to the detriment of publishers everywhere, would have us sooner look away and avoid the news than tune in to have our days ruined by yet another article about the latest existential threat.

Mental health’s effect on news avoidance trends

News avoidance: the active or intentional resistance or rejection of news

Though this behaviour is still in its early days, studies have indicated that people the world over have become more selective about the content they consume. It is a means of mitigating the negative feelings that come hand in hand with a news cycle that seems to skew ever more negative, concerning, and depressing.

In the early days of the pandemic, according to data compiled by Nielsen, publishers tracked a 60% increase in news content consumption, globally. What were the headlines during that period? Stories related to the pandemic, as well as political crises occurring around the world, with more than a few notable mentions belonging to the United States.

As time went on and the headlines became ever more tragic, an overwhelming sense of burnout amongst audiences was being fueled by the news. An annual Reuters survey of over 90,000 participants in 46 different markets found that 43% of people said the non-stop barrage of COVID-19 or political news triggered their decision to embrace selective news avoidance. Additionally, 36% of those same respondents said their moods were negatively affected by the predominantly depressing nature of the news cycle.

Publishers have since found themselves in an impossible position: report honestly on the grim nature of our world’s current events and suffer decreased views, report sensationally and lose credibility, or report on benign topics like celebrity divorces and scandals to keep people entertained but uninformed.

Negativity crushes trust, increasing news avoidance

Not only is news avoidance a tricky situation for editorial and content teams, it has also made it difficult to build communities of passionate and engaged followers. It’s even more difficult when the news itself is deemed untrustworthy by misguided or misinformed consumers. The United States, in particular, has had to grapple with this growing trend: only one quarter of US respondents say they trust their nation’s news media.

Audiences will always have thoughts and opinions, particularly when it comes to larger than life concepts like the spread of a pandemic or an insurrection to overthrow democracy. It’s natural to want to share those thoughts and open up a discussion about those ideas, something that the comment section of an article is quite literally made for.

However, nearly one out of five respondents to the Reuters study said they skew towards news avoidance because sharing their opinions leads to arguments they’d rather avoid. This goes right to the heart of the challenge that publishers face as they attempt to come up with solutions for their waning engagement and subscription rates. If people don’t feel comfortable expressing their viewpoints, not only will they avoid engaging in open discourse around enticing subject matter, they are also likely to avoid the content altogether.

How to overcome news avoidance and win over audiences

So what can publishers do to overcome news avoidance and build thriving communities of passionate readers? Answer: an audience-first, data-informed growth strategy.

By putting the interests of your audience first and creating content that aligns with your organization’s values and the goals of your editorial and publishing teams, you’re in good shape to start diminishing the risk of news avoidance. If you’re able to position yourself as a publisher who delivers high-quality content and makes space for community and healthy discourse, you’re on track to winning back your audience and gaining access to valuable first-party data that will further inform your efforts.

Behavioural insights are essential in the current digital publishing landscape. That data can be difficult to acquire without an analytics team, but turn-key solutions do exist.

Shadow banning against community violators

Platforms built by moderators to help other moderators maintain a positive community are available to you and your teams.

One valuable tool for community moderation is time-based shadow banning. These “timeouts” can be handed out to people who frequently disobey community guidelines and spread toxicity. 
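As a rough illustration of how such a timeout might be modelled (a minimal sketch with hypothetical names, not any particular product’s API), the idea is to track an expiry time per user and hide their posts from everyone but themselves until it passes:

    # Hypothetical sketch of a time-based shadow ban ("timeout").
    # A shadow-banned user can still post, but their comments are only
    # visible to themselves until the timeout expires.

    from datetime import datetime, timedelta, timezone

    shadow_bans: dict[str, datetime] = {}  # user_id -> timeout expiry

    def apply_timeout(user_id: str, hours: int = 24) -> None:
        shadow_bans[user_id] = datetime.now(timezone.utc) + timedelta(hours=hours)

    def is_visible_to_community(author_id: str, viewer_id: str) -> bool:
        expiry = shadow_bans.get(author_id)
        if expiry is None or datetime.now(timezone.utc) >= expiry:
            return True                 # no active timeout
        return viewer_id == author_id   # the author still sees their own posts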

Labeling comments can help reinforce those guidelines further by highlighting ones that are aligned with guidelines, ones that are veering off topic into more random postings, and even ones flagged as outright attacks on authors or other community members. Through careful and considerate moderation you’ll be better able to promote cooperative and respectful dialogue among readers. By making the space for discussion safer, you’ve created an inviting opportunity for potential users who may have been avoiding your content as a means of dodging unwanted conflict and toxicity.

IP lookups to restrict or block suspected trolls

Publishers, obviously, need to grow their audiences to stay afloat. A healthy, sizeable viewership is essential to revenue and data-informed learning opportunities, not to mention extremely appealing to advertisers and affiliates eager to spend money to connect with those readers.

Unfortunately, if trolls or extremists harass other community members to the point of pushing them towards news avoidance, the quality of the viewership is greatly diminished. Quantity is not better than quality, even when views and shares are important metrics to help boost subscriptions.

Instead, you can use platforms with built-in IP address lookup capabilities to find these bad actors and moderate their posts so they can no longer disrupt the rest of the community. This will also help you avoid inadvertently violating your affiliates’ publishing guidelines and risking the loss of vital business, a hard lesson learned by Parler following January 6.
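The sketch below illustrates one simple way an IP lookup might feed into moderation: if several banned accounts trace back to the same address, new posts from that address are held for review. The threshold and helper names are assumptions for illustration, not a specific platform’s feature set.

    # Hypothetical sketch of using IP lookups to restrict repeat offenders.
    from collections import defaultdict

    banned_accounts_by_ip: dict[str, set[str]] = defaultdict(set)

    def record_ban(user_id: str, ip_address: str) -> None:
        """Remember which IP address a banned account was posting from."""
        banned_accounts_by_ip[ip_address].add(user_id)

    def should_hold_for_review(ip_address: str, threshold: int = 2) -> bool:
        """Hold new posts from an IP already tied to multiple banned accounts."""
        return len(banned_accounts_by_ip[ip_address]) >= threshold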

Moderate conversations, live events, community chats, and reviews

Finally, use your moderation console to encourage healthy dialogue across all digital streams affiliated with your publication. This can range from conversations in the comments section of an article to interactions during live events and community chats. You can even influence the tone of ratings and reviews about your publication to stop misleading negativity from spreading.

The console plugs directly into each of these forums, allowing your entire editorial team to work out of the same space and enforce consistent guidelines across each outlet. Not only will this increase the efficiency and productivity of your team, but you’ll set a standard for your audience about what kind of community they can expect from your publication. This is how you set the stage to build trust and authenticity, two absolutely necessary traits to grow your audience.

While the world is ever-changing and readers adjust the way they consume content, publishers need to be mindful of how to create spaces that can be informative, safe and encouraging for their readers.

All the Different Types of Moderation for Your Digital Properties

Updated June 22, 2020.

With trolls and toxicity running wild all over the internet, moderation has become practically mandatory for any publisher hoping to build and engage a profitable community.

Nearly 50% of Americans who have experienced incivility online completely remove themselves from the situation. That means that if you allow toxicity to go unchecked on your properties, half of your audience is likely to abandon your platform if they see anything offensive.

Not to mention that Google is starting to ban media companies with toxic comments around their content from its Ads platform. ZeroHedge, for instance, was recently banned for allowing offensive and false information to exist on its website. The Federalist also received a warning that it would be banned for the same reason if protective actions weren’t taken.

In a joint statement with other big tech companies, Google explained that it will be focusing on “helping millions of people stay connected while also jointly combating fraud and misinformation about [COVID-19], elevating authoritative content… around the world.” 

Google’s global move to tighten moderation restrictions on media content comes after several countries, like France and Australia, began challenging the tech giant’s dominance over media.

If you want to keep your environment protected, moderating user interactions will help your visitors feel safe enough to engage in conversations and build relationships on your platform.

But while many engagement tool vendors claim to use fancy moderation algorithms, most of them are nothing more than ban-word lists. To protect the environment on your platform so users actually want to return, you’ll need to go beyond a simple ban-word list.

Instead, sift through the different forms of moderation so you can select a tool that will best support your community guidelines.

From more traditional user self-policing and manual moderation, to fully automated and full-service approaches, we’ve broken down the different types of moderation to help you understand what will work best for your business.

User-to-User Moderation

You know how on social media you have the option to flag or report anything that distresses you so moderators can review it? Well, that’s what user-to-user moderation is: users monitoring and reporting other users for bad behavior.

As an extra layer of protection, consider giving your community the power to flag offensive or concerning posts that may slip through the cracks. Even Facebook’s moderation has flaws and gaps, which is why you need a complete, specialized platform like Viafoura.

To minimize the amount of user-to-user moderation needed on your property, it’s important to have strong community guidelines. Read the best practices for creating community guidelines here.

Human Moderation

Moderation is an extremely complex topic that can be broken down into different forms. But here’s the problem: many publishers think that human moderation, where people manually go through users’ posts and block any offensive ones, is the only kind of moderation.

Human moderation is incredibly time-consuming and expensive on its own. Publishers who can’t afford to hire external human moderators and don’t have the time for it in-house tend to give up on building an engaged community altogether.

Sound familiar?

Consider this: when paired with an automatic solution, the need for human moderation is minimized, reducing associated time and cost investments.

The great part about human moderation is that humans are able to catch harassment or incivility that can’t always be picked up by existing algorithms as they learn. So instead of having humans do all of the work, reduce their moderation scope to focus on what technology can’t catch or understand on its own. 

Read more about human vs machine moderation here: Human vs. Machine: The Moderation Wars

Automatic Moderation

Forget simple ban-word lists. A truly effective automated moderation solution should prevent the majority of toxic posts from appearing the very second a user submits a comment.

Intelligent automatic moderation lightens the workload for human moderators by minimizing the number of comments pushed to humans, allowing them to focus on truly questionable posts, or those that are flagged by other users.

Quick tip: Many service providers claim to have AI or automatic moderation, but don’t actually leverage natural language processing or machine learning to understand variations of words and sentence structures. Check in with your moderation provider to make sure your tool can learn as moderators approve or block comments, further training the algorithm, which should be customized to your guidelines. 
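To illustrate what learning from moderator decisions can look like in practice, here is a minimal sketch built on a generic text classifier (scikit-learn in this example). It is not Viafoura’s algorithm, just the general feedback-loop idea: every approve or block decision becomes a labelled training example.

    # Hypothetical sketch of a moderation model that learns from moderator
    # decisions. Each approve/block becomes a training example, and the
    # classifier is refit on the accumulated feedback.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    comments: list[str] = []
    labels: list[int] = []  # 1 = blocked by a moderator, 0 = approved

    model = make_pipeline(TfidfVectorizer(), LogisticRegression())

    def record_decision(comment: str, blocked: bool) -> None:
        """Store a moderator decision and refit on all feedback so far."""
        comments.append(comment)
        labels.append(1 if blocked else 0)
        if len(set(labels)) > 1:        # need examples of both classes
            model.fit(comments, labels)

    def predicted_block_probability(comment: str) -> float:
        """Probability that the trained model would block this comment."""
        return float(model.predict_proba([comment])[0][1])

Refitting from scratch on every decision is only workable at toy scale; the point is the loop itself: human decisions continuously become training data, so the automated layer gets closer to the publisher’s own guidelines over time.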


Full-Service Moderation

What’s a publisher to do when they just don’t have enough time on their hands to worry about moderation?

Answer: Outsource the complete solution to a service provider.

Choose a vendor that can bundle a cost-effective package for you, which should include human, sophisticated automatic and user-to-user moderation services to get the best bang for your buck.

Remember, folks: a protected community is a happy community. And that translates to revenue growth.

Hungry for more knowledge on moderation? Check out seven tips that will help your moderation team survive a national election.

Viafoura Automated Moderation Changes the Game for Community Moderation


Last updated June 14th, 2018

Don't sacrifice the flowers for the weeds

Have you ever had the pleasure of digging through the comments that pollute the web? If you have, then you are no stranger to the spam and hostility that overwhelm comment boxes and make them a huge effort for teams to manage.

While spamming and trolling are challenges faced by many organizations, top media companies and brands know that community is everything, and that it’s crucial to be able to listen to and engage with customers online in real time. Unfortunately, that means constantly sifting through the many hateful comments in order to nurture a healthy online community.

Community Growth

It’s not just frontline digital teams that want to foster a healthy online environment – it’s important to their audiences and customers as well. In fact, when the quality of conversations increases, so does their audience’s engagement.

  • 35% increase in comments per user
  • 34% increase in replies per user
  • 62% increase in likes per user
  • 22% increase in likes per comment

*Analyzed data gathered from 600+ media organizations, compiled both before and after the introduction of Viafoura Automated Moderation.

Cutting Through the Noise

With smart technologies like Viafoura Automated Moderation, content producers can manage, moderate and listen to their communities, with the protection of pre-moderation in real time.

Automated Moderation eliminates up to 90% of the time and effort spent moderating communities, analyzing comments and responding to customers.

How does it work? Our linguists teamed up with our engineers to build an engine that automatically looks for patterns in language. It determines a comment’s topic, how the person felt when they wrote it, and its context. They did this by programming in 6.5 million variations of words across English, Spanish, Portuguese and French, with more languages on the horizon.

This engine is then used to moderate and listen across all owned and third-party social networks to manage engagement, provide insights into urgent customer complaints, and display data and insights in one dashboard. It immediately removes comments outside of your community guidelines and sends suspect comments to a queue for resolution in real time.
That means that community managers don’t need to spend their resources looking over each comment or manually monitoring social networks. When a moderator logs in, they can easily review what needs their attention, focusing quickly on issues that really matter. By cutting through the clutter and allowing the most important comments to get addressed in real time, it allows you to deliver the best customer experience.

Creating Meaningful Relationships

By flagging and removing inappropriate comments, Viafoura Automated Moderation allows authors, community managers and social media managers to spend their time addressing important inquiries quickly and creating meaningful conversations with their audiences.
And when your teams are empowered to engage with audiences in a timely and meaningful way, it leads to the best customer experience, higher engagement and ultimately a higher lifetime value for each customer.

Interested in learning more?

Connect with us today to learn how Viafoura can help you build, manage and monetize your audience.

Connect Now

CBC and The Weather Network Discuss Online Commenting

The Importance of Commenting from RTDNA 2017 Conference

In the RTDNA session, Commentary, Commenting and Diversifying Your Voices, our Head of Marketing, Allison Munro, moderated a conversation with news media executives from the Canadian Broadcasting Corporation (CBC) and The Weather Network (Pelmorex Media). The two panelists included Jack Nagler, the Director of Journalistic Public Accountability and Engagement at CBC, and Carrie Lysenko, the Head of Digital at Pelmorex Media. Their discussion explored the pros and cons of online commenting and how news media organizations can overcome the challenges.

How Important is Commenting in News Media?

For the Canadian Broadcasting Corporation (CBC), commenting is not just a value add; it’s critically important for their brand strategy. One of their goals is to provide Canadians with a place to explore their diverse opinions, and commenting supports this vision. Nagler states that commenting has helped them become a better newsroom because their readers improve the stories being told.

At The Weather Network, Lysenko stated that commenting is important because nature-enthusiasts want a forum to share their opinions, photos and videos. Lysenko also noted that when they turned off comments, there was a significant drop in pageviews and attention time.

This echoes our findings that brands with commenting can increase their pageviews by 248% and attention time by 364%. Researchers for the MIT Sloan Management Review also confirm that users’ willingness to pay for subscriptions increases with their growing online social activity.

“Only an engaged user will become a long-term subscriber.”
—Tobias Henning, GM of BILD

A majority of website visitors would also agree that website commenting is valuable. In a recent survey of their audience, CBC found that 70% of respondents said that comments were important to them. Furthermore, they saw that 70% of website visitors spend at least 15% of their time onsite just reading comments.

Using Comments to Create New Stories

CBC receives story tips and article corrections within their comment section from their loyal readers and watchers. Nagler asserts that audience contributions add a lot of value to their articles as they spur further discussions and stories.

He gave an example about an article on a wedding party that fell ill during their stay at a resort. After reading the story, another reader commented that she too got sick at the same place. From there, an investigative story was born, providing valuable information to other travellers.

CBC now takes their top comments and creates stories from them in the Revenge of the Comment Section. As these stories are made from comments, they offer a quick and cost-effective way for publishers to post new content.

Similarly, users share their photos and videos with The Weather Network, which drives further engagement and new content. Lysenko described how The Weather Network connected one of their website contributors with Canada Post to create an official stamp. After viewing the photo he submitted, they made arrangements to create the stamp and tracked his story on their website.

 

Three SEO Benefits of Online Commenting

User-generated content, such as comments, can be indexed by Google if it’s placed higher on the webpage. For example, editors can choose their favorite comments and place those quotes within the body of an article.

Furthermore, pages with active content updates, such as new comments, can trigger additional reindexing and improve the recency and relevance of the page in search results.

Your audience may also use keywords around a topic that differ from what journalists write, and can provide closer matches to search terms.
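As a simple illustration of the first point (a hypothetical sketch, not tied to any particular CMS), a publishing backend could render a few editor-picked comments directly into the article HTML so crawlers see them as part of the page rather than as content loaded later by JavaScript:

    # Hypothetical sketch: render editor-picked comments into the article
    # HTML so they are part of the server-delivered page and indexable.

    from html import escape

    def render_highlighted_comments(comments: list[dict]) -> str:
        """comments: a list of {"author": ..., "text": ...} dictionaries."""
        items = "\n".join(
            f'  <blockquote class="highlighted-comment">'
            f'<p>{escape(c["text"])}</p>'
            f'<cite>{escape(c["author"])}</cite></blockquote>'
            for c in comments
        )
        return f'<section class="reader-highlights">\n{items}\n</section>'

    # Example usage:
    print(render_highlighted_comments(
        [{"author": "A reader", "text": "Great point about local weather data."}]
    ))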

The Truth Behind Facebook Commenting

While your Facebook page may be a hotspot for online commenting, it can’t take the place of commenting on your website. And it’s not only because your direct website visitors are more loyal than your Facebook readers, but also because Facebook doesn’t give publishers all their first-party audience data from commenters. (Similarly, Facebook’s free commenting platform for websites also keeps your invaluable data.)

Both CBC and The Weather Network recognize that publishers should focus on getting readers to comment on their websites and collecting their audience data. That doesn’t mean Facebook or its tools shouldn’t be used at all; in fact, Social Login is an extremely valuable tool for news media websites.

When users are able to register for news websites through their social media accounts, this greatly reduces friction when signing up. It can even increase conversion rates by 20% to 40%. Lysenko adds that if you have the capability to import data from users’ social accounts into their user profiles on your website, then you’re taking advantage of Facebook login without giving away your data.

“Direct visitors are more loyal than Facebook visitors.”
—Terri Walter, CMO of Chartbeat

Moderation is the #1 Challenge for Community Management

Both panelists say that the greatest challenge to commenting is moderating online discussions in real time. With so many trolls online, moderation is vital for publishers who want to provide a safe space for their users. And according to the Engaging News Project, users’ interest in returning to a website almost doubles if they know the discussion will be civil.

CBC found difficulties with both pre-moderation and post-moderation. With the former method, moderators review comments before they get published. But this time-consuming task doesn’t allow for real-time discussions, which are so important for timely news and weather events. With the latter method, users are able to post comments without review, and inappropriate comments only get removed if they are flagged by the community and reviewed by a moderator. While this avenue is much less time-consuming, brands risk having content on their website that doesn’t align with their guidelines.

Like some media companies, CBC has even opted out of commenting altogether on certain stories that may trigger heated arguments. Similarly, The Weather Network chose to disable commenting on stories about climate change, finding too many undesirable comments between advocates and deniers.

Since then, The Weather Network has decided to employ automated moderation to manage their online communities. Automated moderation uses artificial intelligence to automatically detect and delete offensive comments. This allows conversations to unfold in real time while maintaining a brand’s community guidelines.

  • Human Moderation: 81% accuracy
  • Automated Moderation: 92% accuracy

They have also decided to offer self-moderation tools that allow users to personalize their online experience. These include the ability to mute other users and to dislike and flag comments.

Save Time and Resources with Automated Moderation

Website commenting has been an important feature for both the CBC and The Weather Network, helping them increase brand loyalty.

It’s also been invaluable to their audiences, who enjoy reading the comment section and sharing their content with others. However, many users are deterred from engaging if the discussions aren’t civil and respectful.

Automated moderation is the latest solution to this problem, giving media brands a cost-effective way to moderate their communities. Media organizations have also shown that automated moderation drives further engagement, by increasing comments, likes and registered users, while significantly reducing flagging and the time and effort needed by moderators.

Interested in learning more about Automated Moderation?

Connect with us today to learn how Viafoura can help you build, manage and monetize your audience.

Connect Now

RTDNA 2017: Fake News, Trolls and Diverse Commenting



RTDNA 2017 Conference in Toronto

For news broadcasters, the story doesn’t end when it’s published or aired. It’s just the beginning for their audiences, who are looking to discuss their diverse opinions around a shared reality.

That’s just one idea that will be explored at the Radio Television Digital News Association (RTDNA) 2017 National Conference. Taking place from May 26 to 27 at the Sheraton Centre in Toronto, the conference offers a forum for open discussion on the issues that impact Canadian newsrooms. It’s also a great opportunity for career development and connecting with leaders in news media.

This year’s topics include:

  • Connecting with highly skeptical and mistrustful audiences
  • Combating the fake news epidemic
  • The responsibility of journalists to reflect diversity in the newsroom
  • Encouraging audiences to constructively debate their diverse opinions
  • Knowing your audience and monetizing it
  • Using investigative journalism to grow audiences
  • The future of news radio

Encouraging Diverse Opinions with Civil Commenting

If you’re interested in driving audience engagement and civil comments on your website, we encourage you to attend Commentary, Commenting and Diversifying Your Voices on Friday, May 26 at 1:45 PM.

The panel discussion features leaders from Canada’s top news media organizations—Canadian Broadcasting Corporation (CBC), The Weather Network (Pelmorex Media), Global News and Corus Radio. Our very own Head of Marketing, Allison Munro, will also be there moderating the discussion.

Together, they will be exploring the best way to encourage diverse commentary in an age of trolling and online attacks. Attendees will learn the value of commenting for their journalistic approach, audience relations, and their bottom line. In addition, they will hear how these industry leaders moderate and protect their online communities without draining their resources.

“Canada is rich not only in the abundance of our resources and the magnificence of our land, but also in the diversity and the character of our people.”
The Will of a Nation: Awakening the Canadian Spirit—George Radwanski & Julia Luttrell

Not attending RTDNA? Don’t miss out on the learnings—download our white paper, How Audience Engagement Drives Retention, Loyalty and Revenue.

Download Guide