
No Matter Who’s In Charge, Communities Need Rules

ModSquad

The old saying goes, “Rules were made to be broken.” As far as we digital community managers are concerned, “Rules were made to ensure the health and longevity of communities.” It doesn’t roll off the tongue as easily, but it certainly speaks to the true value of guidelines in building and sustaining digital community spaces.

Establishing and enforcing rules for how members of a community, whether real-world or digital, should conduct themselves is essential to maintaining order and keeping people happy and safe. In many cases, these communities are “official”: a forum created by a video game publisher, say, or a brand’s Facebook page.

User-run communities on Reddit

Thanks in large part to social media, practically anyone can create a community space for others to join. Reddit is a prime example of this, with hundreds of thousands of subreddit communities, all created by users (aside from a few subreddits used to share official Reddit-related announcements). The people who create subreddits don’t necessarily have any professional background in community management, and while subreddits run the gamut in terms of topics, one common thread runs through them: They each have rules for what is and is not acceptable in terms of content and behavior.

Take /r/crochet, a subreddit where crocheters share their work, discuss projects, and ask for advice. Its rules focus on maintaining a friendly community, keeping visitors of all ages safe, and protecting pattern creators/sellers. While the /r/crochet subreddit is not prone to disruption or heated arguments, the rules give the community and its moderators peace of mind.

/r/AdviceAnimals, on the other hand, boasts over four million subscribers who share an appreciation for funny memes. With thousands of people visiting /r/AdviceAnimals at any given moment, and hundreds of new posts every day, this subreddit’s rules focus primarily on two things: guiding user behavior (particularly toward one another) and post content.

To help comb through the activity on their subreddits, moderators can leverage AutoModerator, a bot that can be customized to identify certain types of content and automatically perform an action, such as replying to it or removing it. As seen in the countless comments posted by AutoModerator across Reddit, many subreddits use it to remind users of the rules and provide an easy way to reach the subreddit’s human moderators.
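AutoModerator itself is configured through rules that moderators write into a subreddit wiki page, but as a rough illustration of the kind of automation it provides, here is a minimal Python sketch, using the PRAW library, of a bot that removes comments containing certain phrases and replies with a pointer to the rules. The credentials, subreddit name, phrases, and reminder text are hypothetical placeholders, not part of any real subreddit’s setup.

    # Minimal sketch of a rules-reminder bot built with the PRAW library.
    # All credentials, names, phrases, and messages are hypothetical placeholders.
    import praw

    reddit = praw.Reddit(
        client_id="YOUR_CLIENT_ID",
        client_secret="YOUR_CLIENT_SECRET",
        username="YourModBot",
        password="YOUR_PASSWORD",
        user_agent="rules-reminder-bot/0.1",
    )

    BANNED_PHRASES = ["buy followers", "click my link"]  # hypothetical examples
    REMINDER = (
        "Your comment was removed for breaking the community rules. "
        "Please review the pinned rules post or message the moderators with questions."
    )

    # Watch new comments in the subreddit; the bot's account must be a moderator there.
    subreddit = reddit.subreddit("YourSubreddit")
    for comment in subreddit.stream.comments(skip_existing=True):
        if any(phrase in comment.body.lower() for phrase in BANNED_PHRASES):
            comment.reply(REMINDER)   # explain the action, pointing users to the rules
            comment.mod.remove()      # then remove the offending comment

In practice, most subreddits get this behavior simply by adding a rule to their AutoModerator configuration, with no separate bot to write or host.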

User-run Facebook Groups

There are millions of active Facebook groups of all sizes and purposes, and like subreddits, these are created by everyday users. The ones that see regular activity and keep users coming back have rules in place to guide discussions and behavior.

As an example, the Reactive Dogs group has gathered over 12,500 members since it was established in October 2012 by dog trainer and enthusiast Anne Springer. It’s up to Anne’s team of only five group admins (including herself) to keep discussions respectful and on topic. The group’s rules, and the way they are disseminated, play an essential role in maintaining a supportive atmosphere.

Anne told us that she set up the group’s rules right from the start and has tweaked them several times since, to better suit the group as it has evolved. To ensure visibility of the rules, Anne pinned a post to the top of the group that explains the rules through text and video.

When a post or comment violates the rules in place, Anne and her team are there to moderate, turning off commenting on a thread or removing content entirely. We asked Anne how her small team handles such a growing and active group:

It’s not easy, but it’s a labor of love. [The admins] are great at sharing responsibility, and I support their autonomy very strongly. There’s not a lot of necessity for hand-holding.

A large contributor to the group’s supportive environment is the way Anne and her team moderate, taking the extra step to explain why a discussion has been closed or removed. This educational step is something our Mods understand very well, as it is a best practice for moderation. Not only does it help change the offender’s behavior in the future, it also helps the community self-moderate, with members respectfully informing one another when they’ve violated the group’s rules. Anne’s philosophy is the same:

In a group of more than 12,000 members, we have members all the way from rank beginners to veterinarian behaviorists, and we hope we are speaking to all our audiences effectively. Do people get miffed occasionally? Yes, but it’s so much less likely if people observe the rules and perhaps lurk a bit before posting (a good idea in any social media setting, I think).

User-run communities don’t run free, and neither should branded ones

Founders and moderators of subreddits, Facebook groups, and other user-run digital communities are essentially volunteers. Their personal passion for the topic or community fuels a desire to dedicate their own free time to helping maintain the space. For many of these communities, volunteer moderation is all that’s needed. That said, volunteer moderation is not always suitable, especially when it comes to official or branded communities.

Because of their personal connection to the topic at the heart of a community, volunteer moderators sometimes let personal bias or opinion play a role in their decisions. Additionally, volunteers may not be available at all the times you need them; you can’t guarantee a volunteer will be ready to handle an emergency that springs up at 3 a.m. This can be very risky for official communities, especially ones with a diverse, global user base; these situations call for hiring experienced moderators. While it’ll cost you more than free volunteers, professional moderation allows for unbiased monitoring and actioning of content, availability when you want it, and unwavering dedication to the health of your community. And when you hire the right moderators, you don’t have to give up the passion you’d find in volunteers! Ideally, you’ll want to staff projects with moderators who are as enthusiastic about your brand as you are.

Rules and moderators aren’t there to restrict users, but rather to promote meaningful discussion and keep communities strong and safe.

 

“rules” by Steve Johnson is licensed under CC BY 2.0.