Apple drops controversial plans for child sexual abuse imagery scanning

Illustration by Alex Castro / The Verge

Apple has ended the development of technology intended to detect possible child sexual abuse material (CSAM) while it’s stored on user devices, according to The Wall Street Journal.

The plan was unveiled in 2021 with an intended rollout in iOS 15, but backlash quickly followed as encryption and consumer privacy experts raised concerns.
