
In this TechCrunch interview, Apple argues the system is less invasive because no individual user can be targeted.

The hashes are hard-coded into each iOS release, which is the same for all iOS devices, so the database is not vulnerable to server-side changes.

Additionally, FWIW, they do not want to start analyzing entire iCloud photo libraries so this system only analyzes new uploads.

https://techcrunch.com/2021/08/10/interview-apples-head-of-p...



>The hashes are hard coded into each iOS release

Do you have a source on that? It is illegal to share those hashes in any way, shape, or form. Even people working in photo forensics and at big photo-sharing sites cannot get access to them. I very much doubt Apple could incorporate them into an iOS release without breaking multiple laws. The hashes themselves can easily be reversed into (bad-quality) pictures, so having the hashes equals having child pornography.

Edit:

https://www.hackerfactor.com/blog/index.php?/archives/929-On...


From the interview I linked, Apple Privacy head Erik Neuenschwander said, “The hash list is built into the operating system, we have one global operating system and don’t have the ability to target updates to individual users and so hash lists will be shared by all users when the system is enabled.”

Where did you hear that sharing hashes is illegal? How would anybody detect CSAM at scale without those hashes?

Your hackerfactor source states, “In 2014 and 2015, NCMEC stated that they would give MD5 hashes of known CP to service providers for detecting known-bad files.”


The important part about the hackerfactor link is that the author claims he was able to reverse PhotoDNA hashes into images. Of course Apple has their own, different perceptual hashing algorithm NeuralHash which they use, but if the hashes of a similar system can be reversed into images, maybe NeuralHash hashes can be reversed as well. Therein lies the problem.
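For intuition on why reversal is plausible at all, here is a toy perceptual hash (an "average hash", far simpler than NeuralHash or PhotoDNA, and purely illustrative): each bit records whether a pixel is brighter than the image mean, so the hash itself is a coarse one-bit-per-pixel thumbnail of the image.

```python
# Toy "average hash" (aHash) -- illustrative only; NOT NeuralHash or PhotoDNA.
# Each bit marks whether a pixel is above the mean brightness, so the
# bitstring is literally a 1-bit-per-pixel thumbnail: "reversing" the hash
# just means rendering those bits as pixels.

def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255)."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A 4x4 "image" with a bright top half and a dark bottom half.
img = [200, 210, 190, 205,
       220, 215, 225, 210,
        30,  20,  25,  35,
        10,  15,  20,  25]

# A slightly re-encoded/brightened copy still matches exactly.
img2 = [p + 5 for p in img]

h1, h2 = average_hash(img), average_hash(img2)
print(hamming(h1, h2))   # 0: small distance => perceptual match
print(h1)                # top half all 1s -- the hash IS a crude thumbnail
```

A real perceptual hash packs far more structure into each bit, but the same property (the hash encodes image content rather than random bits) is what makes approximate reversal conceivable.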

Edit: i.e. OP isn't talking about MD5 hashes


NCMEC will share MD5 but not the hashes used for perceptual matching.


I think you're forgetting that Apple isn't using PhotoDNA. Just because PhotoDNA hashes can be reversed into images, doesn't mean Apple's perceptual hashes can be reversed into images. I think there's a good chance they can be though.


No, these hashes can’t be reversed to an image. They’re not CSAM and therefore not illegal. That blog is not very good, either from a tech standpoint or a legal one.


They are perceptual hashes of CSAM images, no? Sure, being perceptual hashes doesn't by itself mean they can be reversed to an image, but it does seem to be true that PhotoDNA, a similar perceptual-hashing algorithm, can be reversed to an image. For this reason, I believe it is possible Apple's perceptual hashes can be reversed as well, though maybe Apple designed NeuralHash differently and it's not possible.


PhotoDNA is Microsoft’s algorithm. Apple’s new one is NeuralHash. Plus the hashes are encrypted and blinded before being stored on the phone. They can’t be reversed.


In order for this system to work, Apple has to be able to compare the CSAM NeuralHash hashes against the NeuralHash hashes of the images to be synced with iCloud. How do they compare the hashes without decrypting?


They compare the encrypted forms of the hashes, the first step on the device and the second on the server.


How does one compare encrypted forms of hashes?


You hash images in the same way, then encrypt the hashes in the same way, and see if the results match.
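That description glosses over Apple's actual private-set-intersection protocol, but the basic idea of matching under a deterministic keyed transform can be sketched like this (HMAC stands in for the blinding step here; that substitution, the key, and the SHA-256 placeholder hash are all assumptions for illustration, not Apple's scheme):

```python
import hashlib
import hmac

# Hypothetical blinding key; stands in for the server-side secret.
SERVER_KEY = b"server-side blinding key"

def perceptual_hash(image_bytes: bytes) -> bytes:
    # Placeholder: a real system uses a perceptual hash (NeuralHash) so that
    # near-duplicate images match; SHA-256 is used here only for illustration.
    return hashlib.sha256(image_bytes).digest()

def blind(h: bytes) -> bytes:
    # Deterministic keyed transform: equal hashes -> equal blinded values,
    # but without the key you cannot recover the hash or test guesses.
    return hmac.new(SERVER_KEY, h, hashlib.sha256).digest()

# Database shipped to the device contains only blinded values.
known_db = {blind(perceptual_hash(b"known-bad-image"))}

candidate = b"known-bad-image"
print(blind(perceptual_hash(candidate)) in known_db)  # True: match found
print(blind(perceptual_hash(b"vacation-photo")) in known_db)  # False
```

The point of the keyed step is that someone who extracts the on-device database gets only blinded values: without the key they cannot brute-force candidate images against it or recover the underlying perceptual hashes.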


Then you are just comparing hashes. I don't see what you gained by encrypting the hash.


That was a very insightful article from a legal standpoint. I strongly recommend others read it to get a more nuanced view.


> Since it is illegal to share those hashes in any way or form

Source? (The link you provide does not claim that, as far as I could see.)


The article claims that PhotoDNA hashes are reversible to 26x26 images, and that the hashes are therefore CP.


Ah, right. But Apple is using NeuralHash, not PhotoDNA, right? Does that suffer from the same problem?


We don't know, but I think there's a good chance it does. It's important to determine that before release. Maybe it's possible to encrypt them in such a way that no one can access them despite their being on the device, while still being able to use them for comparison.


Yes, that's what the paper A Concrete-Security Analysis of the Apple PSI Protocol from UC San Diego claims:

> Reciprocally, the database of CSAM photos should not be made public or become known to the user. Apple has found a way to detect and report CSAM offenders while respecting these privacy constraints.

https://www.apple.com/child-safety/pdf/Alternative_Security_...
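The flavor of such a protocol can be shown with a toy Diffie-Hellman-style double-blinding sketch (this is NOT Apple's actual PSI construction; the prime, exponents, and hash-to-group step are illustrative assumptions): each side blinds values with its own secret exponent, and because exponentiation commutes, doubly-blinded values match exactly when the underlying items match, while neither side ever sees the other's raw hashes.

```python
import hashlib

# Toy DH-style PSI sketch -- illustrative only, not a vetted construction.
p = (1 << 127) - 1  # Mersenne prime M127; toy modulus, not production-grade

def h2int(x: bytes) -> int:
    # Map an item to a group element (naive hash-to-group, for illustration).
    return int.from_bytes(hashlib.sha256(x).digest(), "big") % p

a = 1234567  # client's secret exponent (hypothetical)
b = 7654321  # server's secret exponent (hypothetical)

server_set = {b"bad1", b"bad2"}   # server's private database
client_item = b"bad1"             # client's private item

# Client blinds its item; server adds its own blinding on top.
client_once = pow(h2int(client_item), a, p)
both_client = pow(client_once, b, p)

# Server blinds its set; client adds its blinding. g^(a*b) == g^(b*a),
# so doubly-blinded values are equal iff the items are equal.
server_both = {pow(pow(h2int(s), b, p), a, p) for s in server_set}

print(both_client in server_both)  # True: intersection detected
```

Apple's real protocol layers threshold secret sharing and safety vouchers on top, but the commutative-blinding trick is the reason a match can be detected without either side publishing its set.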



