Also yes, I am alive. Sorry for being partially dead and not super engaging this past week. I’ve actually been very sick IRL and have mostly just been taking backups and making sure things don’t explode while I fight off whatever illness I’ve been dealing with. Not sure if it was COVID, but at least I have not succumbed to pestilence and I have no plans to in the future.
But yeah, we got approved for NCMEC’s hash scanning system, which hooks into Cloudflare, and I just wanted to explain what it is and give a heads up for transparency while I wait for the onboarding Zoom call with them in a few days.
So… what IS that?
Basically, every image uploaded online can be given a unique fingerprint called a “hash”. This makes it possible to take the known hashes of CSAM (child sexual abuse material) and store them safely in carefully controlled digital databases. A few such databases exist around the world, but the one we will be using is provided by NCMEC (National Center for Missing & Exploited Children) in the US, and it hooks automatically into Cloudflare.
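To illustrate the fingerprint idea, here is a minimal Python sketch using a cryptographic hash. (This is just a toy illustration: the real scanning system uses perceptual “fuzzy” hashes that can match images even after minor edits, but the fingerprint concept is the same.)

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    # A hash condenses arbitrary data into a short, fixed-length fingerprint.
    # The fingerprint can be stored and compared without keeping the image itself.
    return hashlib.sha256(image_bytes).hexdigest()

a = fingerprint(b"example image data")
b = fingerprint(b"example image data")
c = fingerprint(b"different image data")

print(a == b)  # same input, same fingerprint -> True
print(a == c)  # different input, different fingerprint -> False
```

This is why a hash database is safe to maintain: it holds only fingerprints, never the material itself.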
Basically, images that are uploaded here, as well as those federated into the instance, will now be automatically scanned against the hash database we have access to. If an image matches closely enough, it will be automatically blocked, I will be emailed the source URL, an automated report will be filed with NCMEC, and I will be able to follow up based on where the URL was found and reach out to the instance it came from (if it was not ours).
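The flow described above can be sketched roughly like this. All names here are hypothetical, and in reality the matching happens inside Cloudflare against NCMEC’s database rather than in instance code; this is just to show the match → block → report shape of the process:

```python
def scan_image(image_hash: str, known_hashes: set[str]) -> str:
    # Hypothetical sketch of the moderation flow.
    if image_hash in known_hashes:
        # In the real system: the image is blocked, the admin is emailed
        # the source URL, and an automated report is filed with NCMEC.
        return "blocked"
    return "allowed"

known = {"abc123"}  # placeholder fingerprints standing in for the real database
print(scan_image("abc123", known))  # matched -> "blocked"
print(scan_image("def456", known))  # no match -> "allowed"
```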
This system is fully automated. Your images are never viewed by a person during this process (only their hashes are compared), and it only affects images.
There are some privacy concerns with using this, yes. I am personally not too keen on using software made in part by big tech giant Microsoft. But given recent events over the past few months, having this implemented is non-negotiable for me to feel comfortable continuing to host a lemmy instance. I hope you understand, and recognize the importance of this tool.
I am hopeful for the future of lemmy though, especially with upcoming changes and features to address many of the concerns that have arisen recently. This is very much experimental software, and we have to make do with what we have. I will also add info to our /legal page informing users of our use of this tool.
It should also be noted that these databases are heavily controlled to make sure bad actors who might try to stress test them do not gain access. When I said I have to have an onboarding call with NCMEC, that’s not a joke. In order to use this, I have to sit through a 30-minute digital meeting with a representative of a government-affiliated non-profit so they can be sure I am using it for the right reasons. I am dreading it, but thankfully the people at NCMEC are extremely kind and approachable.
https://developers.cloudflare.com/cache/reference/csam-scanning/
Isn’t most everything (other than DMs) on lemmy publicly accessible anyway? I doubt it’s all that difficult for Cloudflare to scrape public images, and hashes don’t contain much useful information regardless.
Yeah, it’s mostly just peace of mind. Every bit helps, you know?
Good on you. Hope more follow suit.
That is fascinating and I will be following this keenly.
It is now enabled! Prayers it doesn’t activate, but if it does it can be taken care of quickly :)
They also have a pretty cool API. It seems like it could be hooked into lemmy itself, either directly or via an alternative frontend, if someone were willing to do the work on it.