/r/ModerationTheory

A forum for the discussion of moderation theory in reddit's subreddits. Message the mods for submission privileges.

/r/ModerationTheory is a subreddit for the discussion of moderation, moderation policies, moderation case studies, and general moderation philosophies on reddit.

We welcome both general discussions and questions, and tactful references to specific subreddit moderation policies and moderation tools.

This is a space for assuming and exerting good faith towards, and on behalf of, everyone on reddit. Discussions take place under the assumption that moderators make internal compromises and are the only users privy to all of the information behind specific moderation decisions in the subreddits they moderate. As onlookers, we may respectfully disagree, keeping that in mind.

/r/ModerationTheory is a largely self-moderated space where civil, on-topic discussion using friendly and respectful language is expected of all participants.

Moderators may step in if needed, such as in cases of insults or assuming bad faith.

/r/ModerationTheory

518 Subscribers

0

Making more than 2 posts on your first day, or 3 posts in one day subsequently, seems to get you shadow banned on Reddit. Is that right? I know it's not exactly right.

I have OCD and autism and post at an alarming rate because some things just pickle my special interests. Rather than getting shadow banned time after time, could someone break down the posting speed limits?

New account, hence you won't see that on this one.

1 Comment
2023/09/19
09:09 UTC

5

Does banning users for posting in other subreddits work? How common is it?

A few days ago I was banned from posting to r/LeopardsAteMyFace because I posted to r/LockdownSkepticism. Is this common? How would users know which subreddits are verboten? And, most importantly, is it an effective type of moderation? I've never posted to r/LeopardsAteMyFace, and never expect to, but I do study online communities and find this interesting.

As I wrote the mods:

Hello, I appreciate your concern with misinformation and my post on /r/LockdownSkepticism spoke to the ease of getting vaccinated. Consequently, while I expected to be downvoted there, I'm surprised to be banned in /r/LeopardsAteMyFace. This concerns me as a user -- and interests me as a researcher who studies online communities:

  • Is this policy of yours stated anywhere?
    • What other subreddits are included?
    • How long has this been your policy?
  • And, as a researcher, do you have evidence that the policy is in some way effective?
    • Does it somehow lighten your moderation load (e.g., preventing any participant there from brigading here)?
    • Do you believe it limits misinformation? (I suspect not, as people can easily use multiple accounts and this action could prompt a backfire effect.)

They responded, "Thou shalt not sealion."

3 Comments
2022/01/14
18:03 UTC

2

Research paper: When Power Goes Wild Online: How Did a Voluntary Moderator’s Abuse of Power Affect an Online Community? (It’s a reddit community.)

Link is as follows:
https://cdr.lib.unc.edu/downloads/cz30pz987?locale=en

Edit: I haven't participated in that community.

1 Comment
2021/11/18
17:36 UTC

0

A moderator who violated reddit's content rules is allowed to remain a moderator and hence is still able to ban users! I think this is absurd/unjust! Do you agree?

Reddit's admins removed a moderator's comment from the sub where he is one of the mods. I think that either the admins or the other moderators of the sub should have removed him from that sub's moderation team. Asking for your comments!

Edit: I wonder if the admins informed the other moderators of the sub that his comment was removed.

UPDATE, 8 months later: In the meantime, the moderator's account has been suspended! I don't know anything about the circumstances of the suspension, nor when it happened.

5 Comments
2021/11/17
22:15 UTC

2

Do moderators of a subreddit decide together if a user should be banned/unbanned?

Many subreddits have a moderation team.
Who decides, or should decide, if a user should be banned/unbanned? Do they decide together? Does every moderator decide independently? Does the lead moderator decide?

3 Comments
2021/11/17
20:57 UTC

1

What to do against a mod supporting white terrorism?

Should I hold Steve Huffman responsible for that?

1 Comment
2021/11/10
23:01 UTC

4

27 days ago "User Shadowban List" became "User Bot Ban List" in the AutoModerator Library of Common Rules

Reddit has been moving away from shadowbanning humans for years now, and today I noticed they changed "User Shadowban List" to "User Bot Ban List" in the AutoModerator Library of Common Rules.

3 Comments
2020/07/04
16:34 UTC

3

If you find a spammer, should you preemptively ban?

in my opinion, yes, if you find someone who

  1. has negative karma (if you count comments)
  2. has an account that's only a month old
  3. has 4 posts
  4. has tons of comments that are just "subscribe to PewDiePie" spam

then ya ban them
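For anyone who wants to script the triage step of that checklist (the ban itself should stay a human decision), here is a minimal sketch of those four heuristics using PRAW. The credentials, thresholds, and the `looks_like_spammer` helper are illustrative assumptions, not an established tool or the poster's actual process.

```python
# Rough sketch of the four heuristics above, using PRAW.
# Credentials, thresholds, and account/subreddit names are placeholders.
import time

import praw

reddit = praw.Reddit(client_id="...", client_secret="...",
                     user_agent="spam-triage-sketch",
                     username="...", password="...")

def looks_like_spammer(username: str) -> bool:
    user = reddit.redditor(username)
    total_karma = user.link_karma + user.comment_karma       # 1. negative karma
    age_days = (time.time() - user.created_utc) / 86400      # 2. account about a month old
    post_count = len(list(user.submissions.new(limit=10)))   # 3. only a handful of posts
    comments = list(user.comments.new(limit=50))              # 4. copy-paste spam comments
    if not comments:
        return False
    spam_ratio = sum("subscribe to pewdiepie" in c.body.lower()
                     for c in comments) / len(comments)
    return (total_karma < 0
            and age_days <= 31
            and post_count <= 4
            and spam_ratio > 0.5)

# Example use; the ban stays a manual, reviewable step:
# if looks_like_spammer("suspect_account"):
#     reddit.subreddit("mysub").banned.add("suspect_account", ban_reason="spam")
```

Keeping the actual `banned.add` call out of the automated path keeps a pre-emptive ban a deliberate choice rather than a side effect of a script.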

2 Comments
2019/04/14
07:34 UTC

4

Analysis of a mod bot I've been using

3 Comments
2019/01/12
23:56 UTC

8

Using AutoModerator on Reddit

Hi r/ModerationTheory,

I’m a graduate student at the Oxford Internet Institute researching AutoModerator and currently in the process of conducting short interviews with moderators who use the bot.

If you’re experienced with AutoModerator, I’d be interested in learning more about how you started using it and how it has changed your day-to-day as a moderator. If you’re less experienced, I’d still be interested in talking, especially to learn more about any barriers to entry that you’ve encountered.

My preference is for phone interviews, but I’ve included some questions below that you can answer in a private message. If you’re up for a phone call, message me and we’ll schedule a time. You can learn more about the project and how interviews are conducted here: https://www.dropbox.com/s/1k5ziop83qji1dh/AutoMod_info_form.pdf?dl=0.

Thanks in advance!

Questions:

How did you first learn about AutoModerator and why did you adopt it? Was there a specific incident that drove you to use it? What did you hope to accomplish with it?

What kinds of rules have you used it to enforce (e.g., spam, formatting, shadow banning, etc.) and where have you found it to be most impactful? Where has it failed? Did you borrow rules from other subs or develop them on your own?

How has it changed your day to day as a moderator? Has it changed the ways in which you interact with users in your subs?

Have you noticed a difference in the ways users respond to AutoModerator as opposed to you or other human moderators?

0 Comments
2018/06/28
17:44 UTC

7

The Grammar Bots

I run a dyslexia subreddit. I keep blocking bots; Reddit needs to stop this mess. When did they become English teachers?

4 Comments
2017/08/06
22:23 UTC

8

When is enough enough?

Morally speaking, I'm well aware that banning everyone who regularly participates in a particular subreddit from your subreddit is wrong. But allow me a moment to describe a situation that I can see no other way to resolve, and then you can tell me if you have a viable path through this quandary.

I moderate /r/alcohol, a decently growing subreddit with a good community forming.

Like all subreddits, we have a basic set of rules. And like all subreddits, we get a fair amount of people who willfully ignore those rules, resulting in (hopefully temporary) bans.

A few months back, we started getting a rather heavy influx of people blatantly posting in violation of two specific rules we have. Namely, "shitposting" and "anti-alcohol rhetoric." In some cases, going so far as to call out those specific rules in their posts.

All these banned posters have one thing in common: they're all frequent posters in a certain subreddit dedicated to a specific illicit drug. No, I'm not naming this other subreddit, but observant readers can probably figure it out.

In fact, looking over the ban logs for the past week, I find a total of 37 bans. One of those was a spambot. The other 36 are all very frequent posters/commenters in this other subreddit.

I considered approaching the moderators of said subreddit, until I noticed that out of their six moderators, four are already on my subreddit's ban list, all for anti-alcohol rhetoric.

So simply put, what solution do we have? Pre-emptive banning is abhorrent, as well as being a logistical nightmare. Report my findings to the Admins? It's unlikely that they would even care to respond, let alone offer a solution.

I'm well aware that this is /r/ModerationTheory, so I'm not expecting a viable solution, but I do hope it raises some interesting thoughts on the subject.

7 Comments
2017/04/19
12:41 UTC

4

Having the automoderator make subjective rulings on content, because real people such as subscribers can't be trusted to think for themselves.

16 Comments
2017/03/02
13:43 UTC

5

Mods who have banned or restricted users/mods that did not know your sub existed, much less posted to your sub: /r/offmychest

This is to mods who have banned or restricted users/mods that did not know your sub existed, much less posted to your sub. If there is one thing that irritates me, it is this. My sub has an advocacy component, and dyslexics are everywhere: we come in all colors, political backgrounds, genders and languages. I heard rumors that certain subs are restricting or banning based on users' posts to other subs; I was wondering, is this true? I only ask because I came across /r/offmychest.

If this is true... learning disabilities are a multifaceted issue for people. This means I need to go on different subs, and yes, these subs could be experiencing conflicts with one another. Nevertheless, it's still important to talk and get to know people. If this is indeed happening, I would like to know how they perform this task: what algorithms are they using, and does this involve volunteers?

However, I refuse to go on subs that promote child abuse, animal abuse, etc. The incel subs are as far as I have gone, and from my observations the abuse there is coming from trolls, a few predators, and unethical mods.

11 Comments
2016/12/26
08:28 UTC

6

What happens if a mod becomes corrupt?

I know mods hold all the power in their subs, but what would happen if a mod started banning people for bad or nonexistent reasons?

6 Comments
2016/12/01
02:35 UTC

1

How to change subreddit name? Is it possible or do I have to create a whole new subreddit?

Also, a non-mod question, sorry: how do I change a submitted thread's title after submitting the post? I know how to change the description but not the title. Cheers and thanks.

3 Comments
2016/10/13
16:11 UTC

8

Are mods allowed to silently (shadow) remove a comment?

I'm sorry, I just don't know which subreddit to ask this question in, so I'll start here. I just found out that my comment isn't showing up where I posted it, even though it is in my history, while some other comments are getting upvotes/downvotes and replies. So the question is: can a mod remove a comment without a trace, not even showing up as [removed] or [deleted]?

17 Comments
2016/07/19
12:17 UTC

12

Is it just me or are the mods of /r/The_Donald a wee bit hypocritical?

5 Comments
2016/06/05
15:58 UTC

8

The ban on accusations of shilling in r/politics is a win for Correct the Record and other such groups

So, Correct the Record gets to send paid shills to Reddit to post for pay, and we can't even point out that it's happening? This seems to be entirely a victory for any organization or group that wants to send paid shills to Reddit.

You can't even point out that the existence of paid shills means that all posters who share that opinion are being discredited, because how can the rest of us know who's a shill and who isn't?

And there is a huge difference between an honest debate with someone whose opinion differs from yours and debating with someone who is just posting because he or she is paid to. Arguing with a shill takes all the meaning out of a debate, it's like arguing with a tape recording.

Putting all the penalties on honest Reddit posters rather than shills goes against every principle Reddit supposedly stands for. If you are going to give shills free rein on Reddit, how can you forbid others from posting on Reddit purely for commercial reasons?

I realize that some people are going to accuse others of being shills as a simple strategy for discrediting them. But in the case of Correct the Record, or any group that announces that it will be sending shills to Reddit, the discrediting has already been done. Other posters are merely pointing it out. I suggest that it should be allowable to accuse someone of being a shill if it is public knowledge that an organization is sending paid shills out to online media on behalf of a given viewpoint.

The current policy is altogether a bad policy in my opinion, one that should be changed. It discredits Reddit, by making it appear to be entirely on the side of organizations that sponsor paid shills on social media.

6 Comments
2016/05/23
23:29 UTC

5

Incentives to Help Build Trust: Managing Trust Building Options and their Drawbacks [x-post /r/TheoryOfReddit] by /u/BuckeyeSundae, head mod of /r/Leagueoflegends

0 Comments
2015/12/15
10:54 UTC

5

I'm currently running a documentary-style series of questions for moderators on /r/AskModerators. I'd like to do the same thing here, but instead dive deeper into the theory and philosophy of moderation.

In the interest of transparency: I'm creating a platform for building communities which I hope will bring something unique to the table. That, coupled with a longstanding love for online communities, has inspired this series. P.S.: Much of the background for this first post was taken from my series over at /r/AskModerators; you can find that post here.

Welcome to the first part of a series designed to spur discussion about the theory, philosophies and practical applications of moderation! I'm hoping that over the course of the next week I can ask you all questions that you find interesting, engaging, thought provoking, and fun.

So without further ado, the topic of my first post: incentives for user behavior. Many community platforms have built systems to influence user behavior, and these incentives have had a huge effect on the culture and community of those sites. Reddit has karma given through a democratic voting system, a system that can be manipulated (e.g., by vote brigades) for various reasons. Stack Overflow grants users greater power if they consistently make specific kinds of contributions, power that is occasionally abused in interesting ways. What incentives would you like to see built into a platform (reddit, forums, Q&A sites, others)? Would you like to see more rewards for users policing themselves? Is it possible to have a voting system that rewards long-form content instead of image macros (without significant moderation intervention, like /r/AskHistorians)? Is there a now-defunct service that had an incentive system you long for?

Thanks for your time, looking forward to some really fascinating discussion!

27 Comments
2015/07/15
22:03 UTC

3

Ban Bargaining

The Ban Bargain is a technique to temporarily ban users, to stop them from complaining about being banned, and to curb their unwanted behaviour. If a user makes a comment worthy of a temporary ban but not a permanent ban, initially give the user a permanent ban. The user will then beg to be unbanned in modmail. Tell them you are willing to shorten their ban if they are willing to never do whatever they were banned for again. They will happily agree and think they were given a second chance, when in reality you were only ever going to temporarily ban them anyway. Make sure to temp-ban them for long enough that they will remember the ban the next time they go to make the same type of comment, but not for so long that they completely forget about the subreddit.
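Purely to illustrate the mechanics described above (not to endorse them), here is a minimal sketch, assuming PRAW and placeholder credentials and subreddit names, of issuing the initial permanent ban and later converting it into a fixed-length temporary one after the modmail exchange.

```python
# Sketch of the two-step "ban bargain" flow with PRAW; all names are placeholders.
import praw

reddit = praw.Reddit(client_id="...", client_secret="...",
                     user_agent="ban-bargain-sketch",
                     username="...", password="...")
sub = reddit.subreddit("mysubreddit")

def open_bargain(username: str, reason: str) -> None:
    # Step 1: issue a permanent ban, even though only a temp ban is intended.
    sub.banned.add(username, ban_reason=reason,
                   ban_message=f"You have been permanently banned for: {reason}")

def close_bargain(username: str, days: int = 14) -> None:
    # Step 2: once the user agrees in modmail, replace the permanent ban with
    # a temporary one long enough to be remembered, but not forever.
    sub.banned.remove(username)
    sub.banned.add(username, duration=days,
                   ban_reason="Reduced to a temp ban after modmail agreement")
```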

2 Comments
2015/06/30
04:46 UTC

3

Ban art/essays: Thoughts on the idea?

If you haven't heard of the concept before, some subreddits (In my case, /r/imgoingtohellforthis, though I've heard of it from others, including /r/askreddit.) will allow for users to have a ban shortened or removed if the user produces a specified bit of content. In /r/imgoingtohellforthis's case, we store ours publicly at /r/TalesOfIGTHFT^NSFW and have asked people for a variety of content, including erotic fanfiction and terrible MSPaint pictures. In other cases, I've heard of essays relating to the offense committed, or just art.

What are your thoughts on the idea?

If you practice it, have you measured recidivism at all and has it made a dent in it? Is the offer a regular/semi-regular one or is it a rare occasion kind of thing? If it is a regular/semi-regular offer, is the knowledge of its existence what you'd consider to be common?

4 Comments
2015/06/14
06:27 UTC

11

Why mods moderate

A particularly desperate user, who was trying to get their cop-shot-a-dog post reinstated on /r/pics after a rule violation, offered to buy gold and help bring reddit more traffic. When I told them that this doesn't affect us because we're not paid, they asked "so why be a moderator?"

I said it was like owning a Harley Davidson: if you don't know, you wouldn't understand.

Each time something controversial happens, I also see mods saying things such as "I want to improve the community/quality of discussion/etc."

I'm not so sure about that anymore, I think that we like to think this, but the real reason is much more basic and instinctual.

If you've seen an indoor cat get the "zoomies" then you've seen an animal getting a natural urge out of its system. Konrad Lorenz wrote about something similar in On Aggression, where a pet starling would track an imaginary fly and then leap out to snatch it from the air. Each animal had the need to satisfy an innate compulsion, even if there was no other reason.

I've noticed that part of the human instinct to form organised groups and societies includes the urge to take on a necessary labor, and you get a lot of satisfaction from that work—no matter how trivial—because it exercises that urge until you no longer feel it.

I get uncomfortable at work when there's nothing for me to do. Why am I being paid? What if someone sees me doing nothing? Well, I'm not so sure the paranoia is really the reason why I volunteer for tasks outside my job description. I don't think it's because I'm afraid of being fired for slacking, but it is a very accessible reason to think of when anyone asks "why do you volunteer?"

Reasons like those, "I just want to improve the community", etc. are post hoc.

The cat, if able to answer "why did you just zoom around the house like bonkers for ten minutes?" might say it was because she thought it would be good exercise. A nice, rational, well-thought reason. But the real reason is because predator/prey chasing and fleeing have been baked into her nature over millions of years and scream to be expressed.

I think mods moderate because we need to feel useful and productive: the wanting to be cleaning comes before the wanting to see things clean. Some feel this more than others; there's a lot of variety in people.

16 Comments
2015/06/13
10:17 UTC

3

Advice: copy-cat sub set up to karma-farm an existing sub.

Hello all,

I am a moderator of a NSFW sub that generates short-form original content daily to be viewed by our 15,000 subscribers.

I have been made aware of a new sub with a similar name and the same premise that contains 100% reposts from our sub, all posted by one user.

This user has made a new sub with the same premise, then gone through a few months of our backlog to fill it up, and he continues to re-post daily.

My question is: what should be done about this, and how should I go about it?

3 Comments
2015/05/04
05:43 UTC

1

Release: Moderator Toolbox v3.0 'Illuminati Ibis'

0 Comments
2014/12/22
18:23 UTC

4

On recruiting new mods for a large sub

Have you had any luck recruiting new mods via r/needamod or some other means?

Do you give new mods specific tasks and guidelines?

How do you determine that they'll be a good fit?

Any other suggestions?

4 Comments
2014/12/17
03:36 UTC
