
Apple drops controversial plans for child sexual abuse imagery scanning

by Richard Lawler, from The Verge - All Posts

Illustration by Alex Castro / The Verge

Apple has ended the development of technology intended to detect possible child sexual abuse material (CSAM) while it's stored on user devices, according to The Wall Street Journal.

That plan was unveiled last fall with an intended rollout for iOS 15, but backlash quickly followed, with encryption and consumer privacy experts warning about the danger of creating surveillance systems that work directly from your phone, laptop, or tablet.

And here's his answer on if Apple took into account the impact this would have on law enforcement and investigations https://t.co/X64rwlkMEN pic.twitter.com/lTQvC27da1

- Joanna Stern (@JoannaStern) December 7, 2022

As recently as last December, Apple said its plans on that front hadn't changed, but...

Continue reading...
