> It's true that they're not cryptographic hashes, but the false positive rate is vanishingly small.
What are you talking about? Ask anyone who has worked in this space: false positives abound[1][2], especially when you're looking for fuzzy matches. And you have to look for fuzzy matches; otherwise, slight modifications to illegal images would bypass the detection system.
> Photos of your own kids won't trigger it, nude photos of adults won't trigger it;
This is also incorrect. In general, if two images look roughly alike when you squint, they will have similar perceptual hashes. A lot of unrelated images look similar when you squint, so a lot of unrelated images end up with similar perceptual hashes. And since you'll be doing fuzzy matches on these hashes, you'll pick up those unrelated images even more often than exact hash collisions alone would.
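To make the point concrete, here's a minimal sketch of one common perceptual-hash scheme (an "average hash") plus Hamming-distance fuzzy matching. This is an illustrative toy, not any vendor's actual algorithm; it assumes images have already been reduced to 8x8 grayscale grids, and the `threshold` value is an arbitrary choice for the example:

```python
# Toy average-hash (aHash) perceptual hash with fuzzy matching.
# Assumes images are pre-reduced to 8x8 grayscale grids (lists of lists
# of 0-255 ints); a real pipeline would resize/grayscale with an image
# library first.

def average_hash(pixels):
    """Return a 64-bit hash: bit is 1 where the pixel is above the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def fuzzy_match(a, b, threshold=10):
    """Fuzzy match: hashes within `threshold` bits count as the same image."""
    return hamming(a, b) <= threshold
```

Note how lossy this is: the hash only records which regions are brighter than average, so any two images with a similar coarse light/dark layout land near one another, and widening the match threshold to catch modified copies also widens the net for unrelated images.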
[1] https://hackernews.hn/item?id=28091750
[2] https://hackernews.hn/item?id=28110159