
There are multiple features. This particular one is about client-side scanning before upload to Apple's iCloud Photo service, which is optional to use.

Presumably it is client-side so that they can anonymize/encrypt photos on the server, and treat any data access outside the account (and the accounts the photo has been shared with) as an audited and cross-organizational event.

But if you want to use another hosted service, you can... and you will likely get their implementation of a similar system. Presumably this is for US regulatory compliance.



Who would back up their phone to the cloud if they were involved in illegal activities?

This is going to bite more innocent people through false positives than it will catch criminals, who already know how to avoid these things.


Presumably the cloud synchronization checks are not a feature Apple wanted to add, but one they had to add under US regulations. Other providers have done this for years server-side, but Apple needed a different approach since the photos are E2E encrypted.[1]

It is not an ML model but a list of known image hashes, and it is only enabled for US-based accounts, furthering my suspicion that this was minimum-effort regulatory compliance.
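The hash-list approach described above can be sketched roughly like this. Everything here is a toy illustration: `perceptual_hash` and `KNOWN_HASHES` are hypothetical stand-ins (the real system reportedly uses a perceptual hash like NeuralHash plus a private set intersection protocol, so the client never sees the list in the clear), and a plain SHA-256 prefix substitutes for the actual hash function:

```python
import hashlib

# Hypothetical blocklist of known image hashes (toy values for illustration;
# in the real system this list is blinded and never visible to the client).
KNOWN_HASHES = {
    "a3f1c9",
    "7b2e00",
}

def perceptual_hash(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash: truncated SHA-256 digest.

    A real perceptual hash is robust to resizing/re-encoding;
    a cryptographic hash like this is not.
    """
    return hashlib.sha256(image_bytes).hexdigest()[:6]

def should_flag(image_bytes: bytes) -> bool:
    """Client-side check run before upload: flag only list matches."""
    return perceptual_hash(image_bytes) in KNOWN_HASHES
```

The key property this sketch shows is that nothing is "recognized" by a model; a photo is flagged only if its hash matches an entry already on the list, which is why novel images cannot trigger it (and why robustness of the perceptual hash, not classifier accuracy, determines the false-positive rate).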

Note they _do_ have a feature (also announced today) that uses ML models, but it is meant for local filtering and parental controls/notifications. That feature is also US-only, and its parental-notification policy is fixed and age-based. I believe this is both to fit regulations (e.g. US recognition of rights based on age) and cultural norms.

I suspect they will have different rules in different jurisdictions when this rolls out further in the future.

[1]: With separate key escrow HSMs for account recovery and legal compliance with e.g. court-ordered access.



