Back in 2021, Apple announced a number of new child safety features, including Child Sexual Abuse Material (CSAM) detection for iCloud Photos. However, the move was widely criticized over privacy concerns. After putting the feature on hold indefinitely, Apple has now confirmed that it has abandoned its plans to roll out the CSAM detection system.