by tomkwok on 5/22/16, 2:46 PM with 58 comments
by Pengwin on 5/23/16, 1:58 AM
1. Outline rules for a community, make them clear and understandable
2. Enforce those, and only those, rules.
3. If something objectionable happens and the rules don't cover it but you think they should, change the rules as you see fit.
4. Handle the fallout of the rules changing.
Steps 3 and 4 are where things mostly fall apart in communities. Personally, any time something happens that leads me to change a rule, the change is reactive and reflects the voice of the majority (not the loudest voices, the actual majority). Most people are happy that way.
Banning disagreement is never a rule I'd have. I'd only make sure that people either have the tools to self-censor what they don't want to see, or tell them they might be happier going elsewhere.
by CookieMon on 5/23/16, 2:58 AM
In my experience it's hardly ever the community showing someone the door; it's always a corporation getting cold feet over potential controversy, or some narrow-minded zealot abusing their janitorial moderator powers.
Last time this happened, it was the community that overwhelmingly wanted to show the zealot the door (a vote was held while the mod was away), but they couldn't: they were nice, normal people, not the kind of weasels who dedicate hours of janitorial work for the chance to hold veto power over others on an internet forum.
Moderation and censorship can be separated: readers can be given the option to circumvent the moderation when/if they choose, or they can have the option to "unsubscribe" from the actions of moderators they disagree with. To create genuine public spaces, we are going to have to find a way that doesn't assume the incorruptible integrity of moderators (the role attracts weasels). Private websites can certainly behave however they want, but they are currently standing in for our public spaces - and presenting themselves as such - which means we have none.
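A minimal sketch of what that separation could look like (the structures and names here are hypothetical, not any real forum's API): moderator actions are stored as per-moderator overlays rather than deletions, and each reader decides whose overlays to apply:

    # Hypothetical sketch: moderation as a per-reader overlay, not deletion.
    # Posts are never removed from storage; each reader applies only the
    # hide-actions of moderators they choose to subscribe to.
    from dataclasses import dataclass, field

    @dataclass
    class Post:
        post_id: int
        author: str
        text: str

    @dataclass
    class Forum:
        posts: list = field(default_factory=list)
        # moderator name -> ids of posts that moderator has hidden
        removals: dict = field(default_factory=dict)

        def hide(self, moderator, post_id):
            self.removals.setdefault(moderator, set()).add(post_id)

        def view(self, subscribed_mods):
            hidden = set()
            for mod in subscribed_mods:  # only mods this reader trusts
                hidden |= self.removals.get(mod, set())
            return [p for p in self.posts if p.post_id not in hidden]

A reader who "unsubscribes" from a moderator simply drops that name from subscribed_mods and sees the unfiltered thread again; nothing is ever destroyed, so the moderation stops being censorship.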
by paulddraper on 5/23/16, 3:57 AM
Censorship: the practice of officially examining books, movies, etc., and suppressing unacceptable parts
Moderation is censorship.
To say otherwise is to make a distinction without a difference.
Rephrase your question to have meaning. Perhaps "When is censorship justifiable?"
by dredmorbius on 5/23/16, 3:38 AM
The first question is: what is a given community for, and how should it accomplish those goals? Individual moderation, collaborative filtering, individual killfiles, expert ratings systems, etc., are all tools toward these ends. They're non-trivial.
https://www.reddit.com/r/dredmorbius/comments/28jfk4/content...
Community behavior is very strongly dependent on BOTH scale AND founder cohorts.
A group with one person (blog, winking in the dark) is different from one with two, or a small set of people engaged in discussion (say 3-30), or a larger group discussing common topics (say 30-100), etc. Part of this can be thought of as a cost function in which the positive contribution of members falls with scale, while the cost of each additional participant is rather more constant. Eventually, adding more participants makes the experience worse for all.
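As a rough illustration of that cost function (the functional forms below are my assumption, not anything from the comment itself): if the marginal benefit of the n-th member decays while the cost each member imposes stays flat, net value peaks at a small group size and then declines.

    # Toy model, assumed forms: the n-th member's marginal benefit decays
    # as 1/n (total ~ ln n), while each member adds a constant cost.
    def net_value(n, cost_per_member=0.15):
        benefit = sum(1.0 / k for k in range(1, n + 1))
        return benefit - cost_per_member * n

    best = max(range(1, 200), key=net_value)
    print(best, round(net_value(best), 2))  # peaks at n = 6 for this cost

With these parameters the value peaks at six members; past the peak, each additional participant makes the experience worse for everyone, matching the claim above.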
It's really difficult for any one conversation to have more than a few key participants. Two and a moderator, or perhaps 5-6 participants who know each other well and get along.
If you're trying to arrive at some truth or understanding, it's really difficult not to have a truth-based moderation criterion.
Individual killfiles are somewhat useful, except that the person running the killfile tends to see a great many one-sided discussions (others interacting with those they've filtered). Unless the system blots out both sides, this accomplishes little.
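One way to implement that stronger "blot out both sides" variant (hypothetical structures, just to make the idea concrete): hide a filtered author's posts and, transitively, the entire reply subtree beneath them, so no half-conversations leak through.

    # Hypothetical sketch: a killfile that removes both sides of an exchange.
    # Hiding an author also hides every reply descended from their posts.
    from dataclasses import dataclass

    @dataclass
    class Comment:
        comment_id: int
        parent_id: int | None  # None for top-level comments
        author: str
        text: str

    def visible(comments, killfile):
        hidden = set()
        for c in comments:  # assumes parents are listed before children
            if c.author in killfile or c.parent_id in hidden:
                hidden.add(c.comment_id)
        return [c for c in comments if c.comment_id not in hidden]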
I've started looking more at discussion tools which foster both smaller and larger groups. "Warrens" and "plazas". The idea of creating persistent communities well below Dunbar's number (say 50-300 people) and promoting up material from those has some appeal (you'd also want to allow for lateral movement of individuals). A small set of tiers would easily accommodate the entire global population. (And yes, there's a social network set up on this basis though the name escapes me.)
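The arithmetic behind "a small set of tiers" checks out (my back-of-the-envelope reading, taking 150 as the community cap): if each tier's communities are built from 150 units of the tier below, capacity grows as 150 to the power of the tier count, and five tiers already exceed the world's population.

    # Back-of-the-envelope: communities of <= 150 members, each tier
    # promoting into a community of 150 units from the tier below.
    for tiers in range(1, 6):
        print(tiers, format(150 ** tiers, ","))
    # 5 tiers -> 75,937,500,000, well over the global population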
A huge problem with allowing noisy participants is that they draw the oxygen out of the room, and tend to very strongly discourage high-quality participants. There's a curiously persistent asymmetry between an individual's proclivity to participate in a group discussion and the interest of others in their doing so. Good designs balance this mismatch.
by green_lunch on 5/22/16, 3:29 PM
Moderation should be about removing the trolls, not what it has become, which is censorship.
It feels like many people enjoy down-moderating disagreement because they don't get any actual power in their own lives.
It's one of the reasons I stay away from most online discussions these days: nobody can be open and honest. It really makes me wonder if this is why secret groups like the Masons were created. Back in those days, instead of getting moderated down, you were attacked or killed.