Q&A: Best Practices in Social Media Community Moderation
by Adam Torkildson

Social Media Explorer’s Adam Torkildson and StockTwits COO Justin Paterno discuss flaws in community moderation practices and what social networks need to do to drive transparency and regain credibility. You can also view this interview on StockTwits’ blog.

Q: What is community moderation, and why is it important for social media?

StockTwits COO Justin Paterno

A: Community moderation is the curation of content posted on social media; curation that is ideally performed by people, not software alone. It’s an extremely important part of the management and growth of an online community. Essentially, community moderation is the glue that holds the community together and prevents anarchy. Plus, it ensures real people are participating on the platform – not robots with phony profiles.

Without enough quality moderation, there is no way for a community to evolve and improve as it grows. With that said, moderation that is too heavy-handed can turn into a total dictatorship with no room for creative expression.

The key to building and growing an online community or social network is finding a balance between too little moderation and too much. When social networks are moderated well, it ensures quality, truthful discussions without spamming or trolling.

Q: Twitter and Facebook have been under fire as of late for their moderation practices. What are they doing wrong?

A: Twitter by design doesn’t lend itself to moderation. It’s essentially a giant chat room that gets noisier and more chaotic as it scales. To keep pace as more users sign up, the company would have to continually hire more moderators – but that’s not what’s happening. As a result, a lot of ads, misinformation and hate speech pass through. Overall, Twitter can be a negative experience and foster a very combative user base. That chat-room design also pits the company against the community whenever it does step in, which makes moderation even harder.

Facebook is the first media platform to mix advertisements and user-generated content in the same place, on the same feed. Whereas TV separates ads from content, Facebook does not, and it has become the most lucrative ad platform in the world. Native advertising can mislead by appearing as regular content with little or no labeling, which makes it hard to differentiate between fact and fiction. Facebook has given up a lot, including quality and credibility, just to grow its user base. Better moderation, including separating user content from ads, could make a big difference.

Q: How do these practices impact the user experience?

A: Facebook, Twitter and other social networks should optimize for truth and quality, not just engagement. Their algorithms don’t scale well enough to create a safe, truthful community or stream of content. And because there isn’t enough careful moderation, users get besieged with everything from ads and fake news to posts from fake accounts and outright hate speech. Instead of genuinely engaging with other users, they’re being lured into clicking on things, which compromises the whole experience. What they’re getting is not authentic, and it’s not really “social” in the way social networks are meant to be. Because of these problems, a lot of social networks don’t feel like communities at all.

Facebook and Twitter are starting to take steps to remedy the situation, but their current stance on truth is opaque and thus worrying. A network composed of lies cannot be successful in the long term.

Q: What are effective moderation tips that all social media networks should follow?

A: Now that we’ve all seen what happens when social networks experience unchecked growth without good moderation, it’s time to rewrite the playbook for how to create and scale a healthy online community. A newly launched social network should set and enforce general, common-sense rules from the beginning; that’s how you establish the norms for the first 1,000 users.

But what works for 1,000 users might not work well for a million or more. Companies need to be ready to adapt their moderation practices as the network grows. It always helps to keep core values top of mind. As they grow, social networks must remember where they came from and what ideas drove the creation of the network in the first place.

For example, we at StockTwits respond to our growing network by refining moderation practices and offering users enhanced tools to help them connect with other investors in the community. We recently did this with Rooms, a tool that empowers users to find and engage with like-minded investors by creating and curating their own chat rooms.  

Although moderation becomes more complex as social networks grow, there’s one simple principle to follow: there’s no substitute for the human touch. Artificial intelligence and other technologies can go a long way toward good moderation, but it really takes painstaking work by committed people to ensure that online communities remain a positive experience for users.

Social networks need to prioritize the safety and well-being of their community over profits and unchecked growth. This means hiring more people to moderate content as the network adds users, and it can also mean empowering users to moderate their own communities.

Moderation requires a lot of time and effort, and ideally a lot of people too. It’s worth it, however, because community moderation is the key to offering users an authentic and positive experience. Without it, it’s not a community at all.


About the Author

Adam
Adam is an owner at Nanohydr8. He really loves comedy and satire, and the written word in general.
