
Parents take a lot of photos of their kid. Like, lots.


How many of them include erect adult penises and active participation in sex acts?

Apple's on-device list of hashes only includes images which have been classified as "A1" under the CSAM categorisation scale. If any other photographs happen to be accidental hash collisions with these images, it's going to be pretty damn obvious to the human reviewer.
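Roughly, the flow being described is: hash each photo on-device, compare against the known list, and only surface anything for human review once a threshold of matches is crossed. Here's a minimal sketch of that idea in Python. To be clear, this is not Apple's actual NeuralHash/PSI design; the hash function, the threshold value, and the names are all made up for illustration.

    # Illustrative sketch only, not Apple's implementation.
    # KNOWN_HASHES, MATCH_THRESHOLD and perceptual_hash are hypothetical.

    import hashlib

    # Stand-in for the provided database of known-CSAM perceptual hashes
    # (opaque to the device in the real design; plain strings here).
    KNOWN_HASHES = {
        "placeholder-hash-1",
        "placeholder-hash-2",
    }

    MATCH_THRESHOLD = 30  # hypothetical: nothing is flagged below this count


    def perceptual_hash(image_bytes: bytes) -> str:
        """Toy stand-in: a real perceptual hash is robust to resizing,
        re-encoding, cropping, etc. A cryptographic hash is not."""
        return hashlib.sha256(image_bytes).hexdigest()


    def library_crosses_threshold(images: list) -> bool:
        """True only if enough photos match the known list; in the real
        system, crossing the threshold is what triggers human review."""
        matches = sum(
            1 for img in images if perceptual_hash(img) in KNOWN_HASHES
        )
        return matches >= MATCH_THRESHOLD

The point of the threshold in the sketch is that a single accidental collision does nothing; only an implausible pile-up of matches ever reaches a reviewer.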


The problem with all of this is that it comes down to trust. We're supposed to trust Apple's algorithm to avoid false positives, we're supposed to trust Apple that even with false positives there's some threshold to cross, we're supposed to trust Apple that their employees will do a good job verifying the pictures, and then (most importantly) trust that a giant American corporation won't honor secretive state requests to start scanning our phones for other data.

I just don't have that trust. Obviously I think that protecting children is an incredibly important thing to be doing, but I don't trust Apple to be running such a system (I do trust them maybe a tiny bit more than I'd trust Google in this case, but ultimately I'd rather this system didn't exist at all).


I trust that Apple knows that an actual, real-world false accusation will make this week's media challenges look like a fleabite.



