> That if you make a 'free speech' drive/repository place that's widely available, it will host the absolute worst of the human race.
That's only due to selection effects. If being open were the default then they'd be diluted among all the other people. ISPs themselves, (older) reddit, 4chan all serve as examples that the people you don't want to talk to can be mostly siloed off to some corner and you can have your own corner where you can have fun. Things only get problematic once you add amplification mechanisms like twitter and facebook feeds or reddit's frontpage.
> For one thing, it's easy to say 'well we'd only take down illegal content'. But in practice there isn't such a bright line, there's lots of borderline stuff, authorities could rule something posted on your site illegal after the fact- lots of these situations are up to a prosecutor's judgement call. Would you risk jail to push the boundaries?
I don't see how that's an issue. "They send a court order, you take down the content" is a perfectly reasonable default procedure. For some categories of content there already exist specific laws requiring takedown on notification, without a court order; which categories exactly depends on the jurisdiction, of course, but in most places that would at least include copyright takedowns and child porn.
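The default procedure described above can be sketched as a simple decision rule. This is purely illustrative; the function name and category labels are invented, and which categories allow notification-based takedown varies by jurisdiction:

```python
# Hypothetical sketch of the takedown procedure described above:
# a court order always triggers takedown, while for certain
# categories a mere notification suffices. All names here are
# invented for illustration only.

# Jurisdiction-dependent: categories where notification alone suffices.
NOTIFICATION_SUFFICES = {"copyright", "csam"}

def should_take_down(request_type: str, category: str) -> bool:
    """Decide whether a takedown request requires action."""
    if request_type == "court_order":
        return True  # always comply with a court order
    if request_type == "notification":
        return category in NOTIFICATION_SUFFICES
    return False  # everything else stays up by default

print(should_take_down("court_order", "defamation"))   # True
print(should_take_down("notification", "copyright"))   # True
print(should_take_down("notification", "defamation"))  # False
```

The point is that the host doesn't need to make borderline legal judgment calls itself; it just reacts to the type of request it receives.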
> Pretty soon the FBI & CIA start contacting you about some of the actual or borderline illegal content being hosted on Free Speech Drive. Do you want to deal with that?
That's pretty much what telcos have to deal with, for example. Supposedly 4chan also gets requests from the FBI every now and then. It may be a nuisance, but it's not an insurmountable obstacle. For big players this shouldn't be an issue, and smaller ones will fly under the radar most of the time anyway.
Also, having stricter policies doesn't make those problems go away. People will still post illegal content, but now in addition to dealing with the FBI you also need to deal with moderation policies, psychiatrists for your traumatized moderators (whom you're forcing to view that content), and end users complaining that your policy covers X but not Y, or that it's inconsistently enforced, or whatever.
> ISPs themselves, (older) reddit, 4chan all serve as examples that the people you don't want to talk to can be mostly siloed off to some corner and you can have your own corner where you can have fun. Things only get problematic once you add amplification mechanisms like twitter and facebook feeds or reddit's frontpage.
This isn't true at all, and the reddit report following their ban wave is pretty clear about it: once areas that had actively established violent or racist discourse as an acceptable standard were banned, the volume of objectionable material across the whole site dropped.
4chan had a similar situation, where the culture on /b/, intentionally left as an explicitly unmoderated segment of the site, a silo, actively invaded other boards with violent, racist content.
It isn't that people sit in silos and do nothing otherwise - it's that the silos themselves cause people to believe their content is acceptable, then spread that shit everywhere.
I wrote "mostly siloed", not "perfectly siloed". This is no different from real life, where your social sphere is not perfectly insulated from other social spheres. Perfect siloing would also mean filter bubbles.
I think this is a really good point, and I think that anyone really committed to promoting a free-speech-maximalist approach to the web should be focused on building tools that make it easier for people to host and distribute their own content without relying on a centralized service.
Any business with the technical ability to censor what they host is going to be tempted (and likely pressured by other actors) to take down content that people find objectionable. Removing these "chokepoints" where a small number of people have the ability to engage in mass censorship is key if you want to promote more diverse speech on the web. (Not everyone has this goal!)