
I’m not so sure. It’s a layman’s interpretation, but I think any “forum” would be multi-risk.

That means you need to do CSAM scanning if you accept images, CSAM URL scanning if you accept links, and there’s a lot more than that to parse here.
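For concreteness, the URL side of that usually means checking submitted links against a curated blocklist (the IWF URL list is the best-known example). A rough sketch of the mechanics in Python; the blocklist file format, extract_urls and quarantine are hypothetical, and the real lists are licensed and distributed under strict terms:

    import hashlib
    from urllib.parse import urlsplit

    def load_blocklist(path: str) -> set[str]:
        # Hypothetical format: SHA-256 hashes of normalised URLs, one per line.
        with open(path) as f:
            return {line.strip() for line in f if line.strip()}

    def normalise(url: str) -> str:
        # Crude canonicalisation: lowercase scheme/host, drop query and fragment.
        parts = urlsplit(url)
        return f"{parts.scheme.lower()}://{parts.netloc.lower()}{parts.path}"

    def is_blocked(url: str, blocklist: set[str]) -> bool:
        return hashlib.sha256(normalise(url).encode()).hexdigest() in blocklist

    # Usage: hold any post containing a blocked link for review.
    # if any(is_blocked(u, blocklist) for u in extract_urls(post)):
    #     quarantine(post)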



I doubt it. While it's always a bit of a gray area, the example given for "medium risk" is a site with 8M monthly users who share images, which doesn't have proactive scanning and has been warned by multiple major organisations that it has been used a few times to share CSAM.

The cases where they assume you should declare "medium risk" without evidence of it actually happening are when you've got several major risk factors:

> (a) child users; (b) social media services; (c) messaging services; (d) discussion forums and chat rooms; (e) user groups; (f) direct messaging; (g) encrypted messaging.

Also, before someone comes along with a specific subset and argues that those particular factors are benign:

> This is intended as an overall guide, but rather than focusing purely on the number of risk factors, you should consider the combined effect of the risk factors to make an overall judgement about the level of risk on your service

And frankly if you have image sharing, groups, direct messaging, encrypted messaging, child users, a decent volume and no automated processes for checking content you probably do have CSAM and grooming on your service or there clearly is a risk of it happening.


The problem is the following: if you don't have basic moderation, your forum will be abused for those various illegal purposes.

Having a modicum of rule enforcement and basic abuse protections (let's say: new users can't upload files) goes a long way.
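A gate like that is only a few lines of code. A minimal sketch, where the user object and its fields are hypothetical stand-ins for whatever your forum software tracks:

    from datetime import datetime, timedelta, timezone

    MIN_ACCOUNT_AGE = timedelta(days=7)
    MIN_APPROVED_POSTS = 10

    def may_upload_files(user) -> bool:
        # Only established accounts may attach files; new users post text only.
        # Assumes user.created_at is a timezone-aware datetime.
        age = datetime.now(timezone.utc) - user.created_at
        return age >= MIN_ACCOUNT_AGE and user.approved_posts >= MIN_APPROVED_POSTS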


That scanning requirement only applies if your site is:

• A "large service" (more than 7 million monthly active UK users) that is at a medium or high risk of image-based CSAM, or

• A service that is at a high risk of image-based CSAM and either has more than 700,000 monthly active UK users or is a file-storage and file-sharing service.
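Put as code, that gating looks roughly like this (thresholds and categories paraphrased from this comment, not quoted from Ofcom's guidance):

    def must_scan_for_csam(monthly_uk_users: int,
                           risk: str,  # "low" | "medium" | "high"
                           is_file_sharing: bool) -> bool:
        large = monthly_uk_users > 7_000_000
        if large and risk in ("medium", "high"):
            return True
        if risk == "high" and (monthly_uk_users > 700_000 or is_file_sharing):
            return True
        return False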


> do CSAM scanning if you accept images, CSAM URL scanning if you accept links

Which really should be happening anyway.

I would strongly prefer that forums I visit not expose me to child pornography.


You cannot get access to the tech without being a certain size; access is restricted so that people can't test modified images against the filter until they find ones that evade it.
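The tech in question is perceptual hashing (PhotoDNA and similar): unlike a cryptographic hash, visually similar images produce nearby hashes, so small edits don't break the match. A toy average-hash sketch of the general idea, assuming Pillow is installed; PhotoDNA's actual algorithm is proprietary and far more robust:

    from PIL import Image  # pip install Pillow

    def average_hash(path: str, size: int = 8) -> int:
        # Toy perceptual hash: shrink to 8x8 grayscale, threshold each pixel
        # against the mean, and pack the bits into an integer.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming(a: int, b: int) -> int:
        return bin(a ^ b).count("1")

    # Visually similar images land within a small Hamming distance, so an
    # upload matching a reference hash set gets flagged (names hypothetical):
    # if hamming(average_hash("upload.jpg"), known_hash) <= 5: flag_for_review()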


Cloudflare has a free CSAM scanning tool available for everyone:

https://developers.cloudflare.com/cache/reference/csam-scann...


Oh great, so you centralize even harder and that will fix everything?


So what's your alternative to market forces?

Government regulation - "good" centralisation?


Not the person you are asking but alternatives I can think of are:

- Configure forums with ranks so that new users can post, but nobody sees their posts until a moderator approves them or other members vouch for them (see the sketch after this list). Some forums already have this capability. It's high maintenance though, and shady people will still try to warm up accounts, just like they do here at HN.

- Small communities can make their sites invite-only and password-protect the web interface. This is also already a thing, but those communities usually stay quite small. Some prefer small communities: quality over quantity, or real friends over the bloated "friends" lists common on big platforms.

- Move to Tor onion sites, so that one has more time to respond to a flagged post. Non-Tor sites get abused by people running scripts that upload CSAM, snapshot it (despite being the ones who uploaded it), and automatically submit reports to registrars, server and CDN providers so that the domains and rented infrastructure get cancelled. This pushes everyone onto big centralized sites, and I would not be surprised if some of the people doing it had a vested interest in exactly that.

Not really great options, but they do exist. Some use them to stay off the radar, making themselves less likely to attract unstable people or lazy agents trying to inflate their numbers. I suppose now we can add to the list government agencies trying to profit off this new law. Gamification of the legal system, as if weaponization of it were not bad enough.
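A minimal sketch of the first option, a hold-for-review queue keyed on user rank (all field names hypothetical):

    TRUSTED_RANK = 2      # rank at which posts go live immediately
    REQUIRED_VOUCHES = 3  # established members needed to vouch for a new post

    def visible_to_public(post, author) -> bool:
        # New users' posts stay hidden until a moderator approves them
        # or enough established members vouch for them.
        if author.rank >= TRUSTED_RANK:
            return True
        return post.approved_by_moderator or len(post.vouches) >= REQUIRED_VOUCHES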


> I would strongly prefer that forums I visit not expose me to child pornography.

While almost everybody, including me, shares this preference, maybe it should be something that browsers could do? After all, why put the burden on countless various websites if you can implement it in a single piece of software?

This could also make it easier to go after people who are sources of such material because it wouldn't immediately disappear from the network often without a trace.


> While almost everybody, including me, shares this preference, maybe it should be something that browsers could do? After all, why put the burden on countless various websites if you can implement it in a single piece of software?

If I recall correctly, Apple tried to do that and it (rightly) elicited howls of outrage. What you're asking for is for people's own computers to spy on them on behalf of the authorities. It's like having people install CCTV cameras in their own homes so the police can make sure they're not doing anything illegal. It's literally Big Brother stuff. Maybe it would only be used for sympathetic purposes at first, but once the infrastructure is built, it would be a tempting thing for the authorities to abuse (or just use for goals that are not universally accepted, like banning all pornography).


Apple tried to do that in a mandatory way, taking control away from the user. Which of course is a terrible idea.

I don't want my browser to report me if I encounter illegal materials. I want the browser to, at most, anonymously report the website where they are found, and even that only if I don't disable reporting.

People do install CCTV cameras in their homes, but they are (or at least believe themselves to be) in control of what happens with the footage.


So basically you want your browser to be controlled by the government, and to remove one's ability to use the browser of their choice?

All this because a negligible number of web users upload CSAM?


No, I want my browser to be controlled by me. It should just be more capable, so that I'm not getting exposed to materials I don't want to see, and can easily report them if I choose. Like adblock, but for illegal or undesirable online materials.

> All this because a negligible amount of web user upload CSAM?

Still, it's better to fix it in the browser than to keep policing the entirety of the internet ever more heavily to keep it negligible.


Love to never be able to see photos of my child at the beach because Google Chrome tells me I'm a criminal.


Unless you tell Google Chrome it's ok and you actually want to see photos of naked children in some whitelisted contexts.



