
Correct, the "1 in 1 trillion" does factor in the requirement for multiple images to match. From Apple's technical summary:

"Using another technology called threshold secret sharing, the system ensures that the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images."

"The threshold is selected to provide an extremely low (1 in 1 trillion) probability of incorrectly flagging a given account. This is further mitigated by a manual review process wherein Apple reviews each report to confirm there is a match..."
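The "threshold secret sharing" idea in the quote can be illustrated with Shamir's classic scheme: a secret (here, the key needed to read the vouchers) is split into shares such that any `t` shares reconstruct it, while fewer than `t` reveal nothing. This is a minimal sketch of that general technique, not Apple's actual construction, which layers it into a private set intersection protocol:

```python
# Minimal Shamir threshold secret sharing sketch (illustrative only;
# Apple's real system embeds this inside a PSI protocol).
import random

P = 2**127 - 1  # prime modulus; all share arithmetic is done in this field

def split(secret, threshold, n_shares):
    """Split `secret` into points on a random degree-(threshold-1) polynomial."""
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange-interpolate the polynomial at x=0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret
```

In the analogy, each matching image contributes one share; only once the account holds at least `threshold` matches can the vouchers' contents be decrypted. Any `threshold` shares suffice, while fewer leave the secret information-theoretically hidden.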

And if the manual review finds that the flagged images don't meet NCMEC classification A1 (A = prepubescent, 1 = sex acts), the flag is cleared.


