Apple Officially Cancels Its Plans to Scan iCloud Photos for Child Abuse Material

Apple has officially killed one of its most controversial proposals ever: a plan to scan iCloud images for signs of child sexual abuse material (or, CSAM).
| Source | RSS or Atom Feed |
| --- | --- |
| Feed Location | http://gizmodo.com/rss |
| Feed Title | Gizmodo |
| Feed Link | https://gizmodo.com/ |