
Effective Facebook Comment Moderation: How to Facilitate Discussion While Reducing Harm

One of the consequences of growing your social media audience is that you’ll inevitably start to draw the attention of people who disagree with you, want to challenge your organization or audience, or simply want to use your content as an excuse to provoke. As your audience grows, so will the number of trolls in it.


For many organizations in the progressive movement, oppositional and purposefully antagonistic online comments are a common occurrence. This opposition can range from merely annoying to blatantly problematic. Some comments warrant obvious, immediate removal from your page, while others may seem innocuous.

When does having an opposing opinion cross the line into trolling? When does opposition become problematic, and comment moderation a matter of harm reduction? When does deletion become censorship?

For marginalized populations, some content might be viewed as trauma-inducing, violent, or offensive in ways it might not be for non-marginalized populations.

The term “harm reduction” is often used in the progressive movement to describe the steps we take to ensure content does not cause psychological, emotional, physical, or intellectual harm to viewers, especially people with marginalized identities. How do we in the progressive movement decide how to engage with oppositional comments online, and where does censorship end and the moral responsibility to do right by the most marginalized in our community begin?


Deleting a problematic comment can be harm reduction. If a comment is clearly directed at a specific population, especially a marginalized community, in a negative, demeaning or bigoted, triggering, trauma-inducing, or violent manner, deleting it is harm reduction. Instances that aren’t as clear-cut should be considered on a case-by-case basis, in a way that is consistent with your organization’s values.


Should you moderate that comment? Questions to consider:


1. Is it relevant or necessary? Sometimes it’s best moderation practice to let people engage with one another, especially if none of the comments are particularly problematic. Comments — even negative ones — give your content a boost in the Facebook news feed. If you want your work or perspective to be seen by more people, your bias should be toward not deleting comments.

Image: Facebook commenters engaging in a positive, relevant, policy-focused comment discussion.

But if a comment is bigoted, violent, targeted, trauma-inducing, potentially triggering for survivors, or otherwise problematic, it is harm reduction to step in and moderate. If a comment is merely irritating or pugnacious, but doesn’t meet the standards listed above, it could be deemed unhelpful censorship to delete or “hide” the comment in question. If a comment is problematic and has nothing to do with the topic at hand, hide it.


2. Is it vulgar? Is it appropriate for your organization to allow swearing or vulgar terms on your page and in your comment thread? If a comment is not explicitly problematic or targeting a specific group, but uses swear words, is it appropriate for your audience? If not, consider automating some of your daily moderation by setting up a Profanity Filter. This Facebook feature allows you to implement a language filter, blocking specific words or phrases. (A scripted alternative is sketched after this list.)


3. Could it cause harm to someone who reads it? For instance, seemingly off-topic comments of “Trump 2020” or “MAGA” may be distressing, harmful, offensive, or triggering for some marginalized populations, given the racist, xenophobic, homophobic, and ableist rhetoric tied to these phrases. Is it appropriate for commenters on your page to evoke tangentially problematic imagery, phrases, or public figures? If your original post is a positive graphic about organized workers winning a raise, “MAGA” as a comment is irrelevant, doesn’t add value to the conversation, and isn’t meant to do anything but provoke. Hide it.
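If your team wants to go beyond Facebook’s built-in Page tools and script some of this screening, the sketch below shows one way it could look, using the Graph API to pull a post’s comments and hide any that match a keyword list. The access token, post ID, keyword list, and API version are placeholders, and the keyword match is deliberately blunt: treat it as a starting point for a human moderator, not a replacement for one.

```python
# A minimal sketch of keyword-based comment screening via the Facebook Graph API.
# Assumptions: a Page access token with moderation permissions and a post ID to
# review. The token, post ID, keyword list, and API version are placeholders.
import requests

GRAPH = "https://graph.facebook.com/v19.0"
PAGE_ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"   # placeholder
POST_ID = "PAGEID_POSTID"                      # placeholder

# Words or phrases your organization has decided are off-topic provocations.
FLAGGED_TERMS = {"maga", "trump 2020"}         # example list only

def fetch_comments(post_id):
    """Return the comments on a post (first page of results only, for brevity)."""
    resp = requests.get(
        f"{GRAPH}/{post_id}/comments",
        params={"access_token": PAGE_ACCESS_TOKEN, "fields": "id,message"},
    )
    resp.raise_for_status()
    return resp.json().get("data", [])

def hide_comment(comment_id):
    """Hide a comment so most page visitors no longer see it."""
    resp = requests.post(
        f"{GRAPH}/{comment_id}",
        params={"access_token": PAGE_ACCESS_TOKEN},
        data={"is_hidden": "true"},
    )
    resp.raise_for_status()

for comment in fetch_comments(POST_ID):
    text = comment.get("message", "").lower()
    if any(term in text for term in FLAGGED_TERMS):
        hide_comment(comment["id"])   # flagged comments are hidden, not deleted
```

Note that the sketch hides rather than deletes, which keeps the commenter unaware that they’ve been moderated and lowers the odds of escalation, as discussed in the next section.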


Deleting vs. hiding comments

For especially egregious comments, deleting might be the best option; this erases the comment completely so that no one can view it. “Hiding” a comment prevents it from being seen by most page viewers, but the comment remains visible to the Facebook user who posted it and to their friends, which makes escalation and charges of “censorship” less likely. Whether to delete or hide a comment should be decided case by case. Remember: to the author, a hidden comment looks unmoderated, and their Facebook friends can still see it.
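For teams that moderate through the Graph API rather than the page interface, hiding is the is_hidden update shown in the earlier sketch, while deleting is a separate, irreversible call. A minimal sketch, again with a placeholder token and API version:

```python
# Deleting a comment through the Graph API removes it for everyone and cannot
# be undone; reserve it for the most egregious cases. Token and ID are placeholders.
import requests

GRAPH = "https://graph.facebook.com/v19.0"
PAGE_ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"   # placeholder

def delete_comment(comment_id):
    """Permanently remove a comment from a post on your page."""
    resp = requests.delete(
        f"{GRAPH}/{comment_id}",
        params={"access_token": PAGE_ACCESS_TOKEN},
    )
    resp.raise_for_status()
```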


Instilling community standards

Depending on the needs and audience of your social media presence, it might be a good idea to establish online community guidelines: guiding principles by which every community participant must abide. Guidelines can be as simple as, “If we are forced to delete your comments three times for offensive language or content at odds with our organization’s mission, your account will be blocked from viewing and engaging with this page.” Community guidelines or standards can be included in the “general information” section of your Facebook page. When someone breaks the rules, don’t waste more of your time moderating their comments than you need to.
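If you want to enforce a rule like the three-strikes example above without relying on memory, a small tracker can do it. The sketch below keeps a per-commenter count of deletions and blocks an account once it reaches the limit, using the Page’s /blocked edge. The token, Page ID, and the exact parameter name for the block call are assumptions worth checking against Facebook’s current Graph API documentation, and the in-memory dictionary would need to be replaced with real storage in practice.

```python
# A minimal three-strikes tracker, assuming the example guideline quoted above.
# The strike store is an in-memory dict here; a real setup would persist it.
import requests
from collections import defaultdict

GRAPH = "https://graph.facebook.com/v19.0"
PAGE_ACCESS_TOKEN = "YOUR_PAGE_ACCESS_TOKEN"   # placeholder
PAGE_ID = "YOUR_PAGE_ID"                       # placeholder
STRIKE_LIMIT = 3                               # matches the example guideline

strikes = defaultdict(int)                     # commenter ID -> deletions so far

def block_user(user_id):
    """Block a user from viewing and engaging with the page."""
    resp = requests.post(
        f"{GRAPH}/{PAGE_ID}/blocked",
        params={"access_token": PAGE_ACCESS_TOKEN},
        data={"user": user_id},                # parameter name is an assumption
    )
    resp.raise_for_status()

def record_deletion(user_id):
    """Call this each time a moderator deletes one of the user's comments."""
    strikes[user_id] += 1
    if strikes[user_id] >= STRIKE_LIMIT:
        block_user(user_id)
```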


From a harm-reduction standpoint, it is best to err on the side of caution. Gauging the need for harm reduction requires nuance and can often be contextual. If a comment is bigoted, violent, targeted, or otherwise problematic, it is harm reduction to step in and moderate. If a comment is merely pugnacious, but doesn’t threaten, use bigoted language, induce trauma, or target a specific group of people — especially marginalized people — deleting or “hiding” it could be seen as censorship. Effective social media moderation that reduces harm and supports your organization’s mission isn’t censorship — it creates a community where more people feel comfortable participating.



Need more specific help? Not sure where to start? Schedule a time to talk with us about your digital organizing and communication goals. To reserve a call, visit acmstrategies.com/schedule-a-call.
