Apple is very clear that they learn nothing when photos are uploaded. The system doesn't even begin to tell them that an account may hold CSAM until roughly 30 matches have accumulated. The jump from this type of system (variations of which are used by every other major provider) to some kind of child porn charges is such a reach that it's mind-boggling, especially since the very administrative bodies involved support it.
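
For context on the 30-match point: here is a toy sketch of the threshold idea, using Shamir secret sharing over a prime field. This is not Apple's actual protocol (their published design combines NeuralHash, private set intersection, and threshold secret sharing in a far more involved construction); the field, the threshold value, and the function names below are illustrative.

    # Toy illustration only: with a degree-(t-1) polynomial, any t shares
    # recover the key, while t-1 or fewer reveal nothing about it.
    import random

    PRIME = 2**61 - 1   # Mersenne prime modulus for the field (toy choice)
    THRESHOLD = 30      # server learns nothing until ~30 matches accumulate

    def make_shares(secret, n, t=THRESHOLD):
        # Random degree-(t-1) polynomial with the secret as constant term.
        coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
        def poly(x):
            acc = 0
            for c in reversed(coeffs):
                acc = (acc * x + c) % PRIME
            return acc
        return [(x, poly(x)) for x in range(1, n + 1)]

    def reconstruct(shares):
        # Lagrange interpolation at x = 0 recovers the constant term.
        secret = 0
        for xi, yi in shares:
            num, den = 1, 1
            for xj, _ in shares:
                if xj != xi:
                    num = num * (-xj) % PRIME
                    den = den * (xi - xj) % PRIME
            secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
        return secret

    # Each matching upload deposits one share with the server.
    key = 123456789
    shares = make_shares(key, n=100)
    assert reconstruct(shares[:THRESHOLD]) == key       # 30 shares: key recovered
    assert reconstruct(shares[:THRESHOLD - 1]) != key   # 29 shares: key hidden
                                                        # (with overwhelming probability)

Below the threshold, the server's view is information-theoretically independent of the key, which is why "Apple knows nothing until ~30 matches" is a property of the scheme rather than just a policy promise.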

A strong claim (that Apple is committing CSAM felonies) should be backed by correspondingly strong evidence.

Here we have a blog post whose author talked to someone else who (anonymously) reached some sort of legal conclusion. If you follow the QAnon claims in this area (there are lots), they take a somewhat similar approach: someone heard from someone that something someone did is crime X. That's a weak basis for legal conclusions like these.


