
I think moderation only works when individuals have the agency to choose for themselves what content/posts they see. Mastodon/fediverse sets a good example here - there are "general safety and theme" guards at the instance level, but whether you see "uspol" in your timeline or just posts of cat pics is entirely up to you.

Contrast this to the "media" like Threads, Bluesky, etc - moderation becomes impossible just because of the sheer scale of it all. Somehow everyone feels compelled to "correct someone who is wrong" or voice an opinion even when the context does not invite one. This is just a recipe for "perpetual engagement", not an actual platform for social interaction (networking).



As someone who worked on a fedi platform, I really appreciate those words.

Some UX decisions even attempt to "passively moderate" content, which unfortunately also deters some people from actively using the platform, as they don't get as much of a feel for the "crowd". For example, not showing the number of "likes" on a post unless you interact with said post goes a long way in preventing mob-like behaviour.
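The hidden-like-count mechanic above can be sketched as a small pure function. This is illustrative only - the names and data shapes are assumptions, not any real platform's API:

```python
from dataclasses import dataclass, field
from typing import Optional

# Sketch of the "passive moderation" UX: a post's like count is only
# revealed once the viewer has interacted with the post, so a big
# number can't trigger pile-on behaviour before any engagement.

@dataclass
class Post:
    post_id: str
    likes: int
    interacted_by: set = field(default_factory=set)  # viewers who engaged

def visible_like_count(post: Post, viewer: str) -> Optional[int]:
    """Return the like count only if the viewer has already interacted."""
    return post.likes if viewer in post.interacted_by else None

def interact(post: Post, viewer: str) -> None:
    """Record an interaction (reply, like, etc.) by the viewer."""
    post.interacted_by.add(viewer)

post = Post("p1", likes=1312)
assert visible_like_count(post, "alice") is None   # hidden before interaction
interact(post, "alice")
assert visible_like_count(post, "alice") == 1312   # revealed afterwards
```

The point of keeping this a pure lookup is that the server never needs to leak the raw count to clients that haven't engaged yet.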

Little stuff like this adds up. But it is hard to sell...


I suspect that moderation is something that “AI” may eventually be quite good at.

But human or machine, open or closed, moderation will always be biased. Each community will have its culture, and moderation will reflect that.

A community of Nazis will moderate out stuff that many places would welcome. Some communities would moderate anything not cats on Roombas.

HN is one of the best-moderated communities I’ve ever seen, yet, it has its biases. Organic moderation reflects cultural biases. I’m not always happy about what gets put on the chopping block, but I usually can’t argue with it, even if I don’t like it, or am cynical about why it’s nuked. I stick with HN, because, for the most part, I don’t mind what gets moderated. The showdead thing lets me see what gets nuked, but I usually like to leave it off. I’m not really one for gazing into the toilet.

The main thing an open fediverse can bring is transparent moderation: folks will know when stuff is being blocked, and can use that knowledge to decide whether to remain in the community or advocate for change.


> Contrast this to the “medias” like Threads, Bluesky, etc - moderation becomes impossible just because of the sheer scale of it all.

Wut? Moderation at Bluesky is fantastic: users build their block lists and share them for others to subscribe to - moderation à la carte... Power to the users!
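The "moderation à la carte" model described here boils down to: published lists of accounts, per-user subscriptions, and a timeline filter. A minimal sketch, assuming plain handle sets rather than the actual Bluesky/atproto list format:

```python
# Illustrative only: published block lists are sets of account handles,
# and each user subscribes to whichever lists they trust. Handles and
# list names are made up, not real accounts or atproto records.

PUBLISHED_LISTS = {
    "spam-bots":  {"bot1.example", "bot2.example"},
    "harassment": {"troll.example"},
}

def blocked_accounts(subscriptions):
    """Union of every block list the user has subscribed to."""
    blocked = set()
    for name in subscriptions:
        blocked |= PUBLISHED_LISTS.get(name, set())
    return blocked

def filter_timeline(posts, subscriptions):
    """Drop posts authored by anyone on a subscribed block list."""
    blocked = blocked_accounts(subscriptions)
    return [p for p in posts if p["author"] not in blocked]

timeline = [
    {"author": "alice.example", "text": "cat pics"},
    {"author": "bot1.example",  "text": "buy now!"},
]
assert filter_timeline(timeline, ["spam-bots"]) == [
    {"author": "alice.example", "text": "cat pics"},
]
```

Because the lists are data that anyone can publish and anyone can subscribe to, the moderation decision moves from the platform to the user.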


More like hermetic narrative security at scale.

That would change for the better if BlueSky itself only managed legal prohibitions and let Everything Else be an optional layer.

While the hermetic narrative security would still be there to split people, it would only split by optional layers.


I had two accounts banned from BlueSky. One was parodying Donald Trump, so fair enough if they don't want content like that, and they told me it was banned for impersonating Donald Trump. The other, I have no idea about at all, because I don't think I even posted anything very controversial, and the email was just a very generic "you violated the terms of service". My third account was not banned, but I don't use BlueSky any more. It wasn't a ban-evasion ban either: all three accounts are logged in together in the same web browser, with the account-switching menu active, and yet the third account was never touched.

My point in sharing this is that BlueSky is not a user-driven moderation system. It arbitrarily and centrally bans accounts, just like Twitter.


You're right, Bluesky moderation is centralized. Unless content is served p2p, some moderation has to be centralized. At the end of the day, there's a server serving content and that server operator is legally obligated to remove illegal material.

Hopefully, atproto plus the community will provide alternative moderation services. Work is being done on this; we'll see what we end up with. I feel that a competitive ecosystem of moderation services is probably the best answer we can hope for to an inherently messy problem.
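One way to picture such an ecosystem: independent moderation services each emit labels for posts, and the client decides which services it trusts and what to do per label. This is a conceptual sketch, not the real atproto labeler protocol - the service names, labels, and policy shape are all assumptions:

```python
# Two hypothetical moderation services, each mapping post ids to labels.
LABELER_A = {"post1": ["spam"]}
LABELER_B = {"post2": ["nsfw"], "post3": ["spam"]}

def labels_for(post_id, trusted_labelers):
    """Collect labels from every service the user trusts."""
    labels = set()
    for labeler in trusted_labelers:
        labels.update(labeler.get(post_id, []))
    return labels

def moderate(post_ids, trusted_labelers, policy):
    """policy maps a label to 'hide' or 'warn'; unlabeled posts pass through."""
    decisions = {}
    for pid in post_ids:
        actions = {policy[lb] for lb in labels_for(pid, trusted_labelers) if lb in policy}
        if "hide" in actions:
            decisions[pid] = "hide"
        elif "warn" in actions:
            decisions[pid] = "warn"
        else:
            decisions[pid] = "show"
    return decisions

policy = {"spam": "hide", "nsfw": "warn"}
out = moderate(["post1", "post2", "post4"], [LABELER_A, LABELER_B], policy)
assert out == {"post1": "hide", "post2": "warn", "post4": "show"}
```

The competition happens at the `trusted_labelers` list: a bad labeler simply gets dropped by users, without anyone needing to leave the network.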



