> and is built so transformed images hash to the same number.
That is the goal of all perceptual hashing algorithms.
> Apple's system only concerns exact hash matches
Then it is almost useless. All someone would need to do to evade the system is make minor adjustments to illegal images so that the resulting hash differs from the original at all — against an exact-match check, even a one-bit difference is enough to slip through.
In reality, perceptual hashing systems use fuzzy matching on hashes by using a metric like hamming distance to calculate differences between two or more hashes.
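To make the fuzzy-matching point concrete, here's a minimal sketch. The hash values and the threshold are invented for illustration; real systems derive their hashes from image content and tune the threshold empirically.

```python
# Sketch: fuzzy matching of two 64-bit perceptual hashes via Hamming
# distance. The hash values and threshold below are made up.

def hamming_distance(h1: int, h2: int) -> int:
    """Count the bit positions where the two hashes differ."""
    return bin(h1 ^ h2).count("1")

# A hypothetical original hash and the hash of a slightly altered copy
# (e.g. re-encoded or resized): most bits agree, a few flip.
original = 0xD1C3A5E0F0B49617
altered  = 0xD1C3A5E0F0B49635  # only the last byte differs

dist = hamming_distance(original, altered)  # 2 differing bits

# An exact-match check misses the altered copy; a fuzzy check with a
# small threshold (say, at most 10 differing bits) still flags it.
print(original == altered)  # False: exact matching misses the copy
print(dist <= 10)           # True: fuzzy matching catches it
```

The whole point of the threshold is that small perturbations (recompression, cropping, color shifts) flip only a few hash bits, so nearby hashes are treated as the same image.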
Or here’s an idea: don’t import CSAM into your camera roll in the first place? This is what I don’t get. I’ve never felt compelled to import regular porn into my camera roll, and that stuff is legal. Who the hell is going to do that with material they know will land them in prison?
I can believe that some people might be stupid enough to believe that a private Facebook group was secure. But who the hell co-mingles their deepest darkest dirtiest secrets with pictures of their family and last night’s dinner?
I'm sure you're in a prime position to judge that.
> In reality, perceptual hashing systems use fuzzy matching on hashes by using a metric like hamming distance to calculate differences between two or more hashes.
Again, Apple's NeuralHash purposefully doesn't work like that [0].