I don’t know if you need this info, but I was pretty disturbed to see unexpected child pornography on a casual community. Thankfully it didn’t take place on SLRPNK.net directly, but if anyone has any advice besides leaving the community in question, let me know. And I wanted to sound an alarm to make sure we have measures in place to guard against this.
removed by mod
Mods and admins care, but we’re not all online all the time.
You're right on point! We all do this in our free time, and we are looking for admins who are available in a timezone we don't have covered yet.
If anyone is interested in assisting us, just send us an email with some details about yourself and when you can be active on lemmy.world.
That’s pretty shocking.
What tools are available to us to manage this?
The best tool currently available is lemmy-safety, an AI image scanner that can be configured to check images on upload, or to regularly scan the storage and remove likely CSAM images.
It's a bit tricky to set up, as it requires a GPU in the server and works best with object storage, but I plan to complete the setup for SLRPNK sometime this year.
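For anyone curious what the "regularly scan the storage" mode amounts to, here's a rough sketch of that loop. This is not lemmy-safety's actual code; `classify_image` is a hypothetical stand-in for whatever GPU model the real scanner runs:

```python
import os

def classify_image(path):
    """Hypothetical stand-in for the ML classifier.

    The real scanner runs a GPU model here and returns a probability
    that the image is problematic; this stub flags by filename purely
    so the surrounding removal logic can be demonstrated.
    """
    return 0.99 if "bad" in os.path.basename(path) else 0.01

def scan_storage(root, threshold=0.9):
    """Walk the image store and remove files the classifier flags."""
    removed = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if classify_image(path) >= threshold:
                os.remove(path)
                removed.append(path)
    return removed
```

The check-on-upload mode is the same idea, just run once per incoming image before it's accepted instead of over the whole store.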
This is probably the best option; in a world where people use ML tools to generate CSAM, you can’t depend on visual hashes of known-problematic images anymore.