Media companies, like all brands, are looking to build recognition and trust by publishing user-generated content. However, publishing this content isn't risk-free: organizations need to ensure that users aren't posting offensive or threatening comments on their websites or apps. This is where content moderation comes into the picture.
Organizations today are doing everything they can to maintain civility on their digital properties while still encouraging free speech and opinionated conversation. Many have implemented moderation solutions that rely on live moderators or automated algorithms to meet this challenge.
The general public has also become aware of moderation: what it does and why it's used. So what happens when your moderation partner becomes more than just another technology vendor?
Recently, a Viafoura customer, one of the largest publishers in the UK, discovered why the Viafoura moderation team is so much more than a typical partner.
The publisher uses both Viafoura's AI (artificial intelligence) and Live Moderation solutions. Because the AI engine learns and enforces the community guidelines set by the publisher, it handles 85% to 90% of all comments automatically. The remaining "questionable comments" are routed to a live moderator for a judgment call.
Earlier this year, one of those comments reached a live moderator at Viafoura: a user had posted a threat against a nursery in the commenting section, and the comment was flagged and sent to the moderation queue.
Instead of simply blocking the comment and banning the user, the Viafoura moderator contacted the publisher's team to explain what had happened. The publisher's staff immediately brought the threat to local law enforcement.
Within half an hour of the comment being posted, the police had taken action.
Thanks to the quick thinking of the Viafoura and publisher employees who went the extra mile, a potentially terrible situation was averted.
“Moderation is much more than a judgment call of ensuring user-generated content upholds platform-specific guidelines and rules to establish the suitability of the content for publishing,” says Leigh Adams, director of moderation services at Viafoura. “Yes, we are all about maintaining our customers’ standards, but it’s also about recognizing when a comment has to be escalated. We pride ourselves on having staff that know what to do when — and go the extra mile to reach out to our customers — because we have the relationship to do that easily.”