Apple continues to clarify the CSAM (child sexual abuse material) detection feature it announced last week. In addition to publishing a detailed frequently asked questions document earlier today, Apple has now confirmed that CSAM detection applies only to photos stored in iCloud Photos, not to videos.
The company also continues to defend its implementation of CSAM detection as more privacy-preserving than the approaches taken by other companies.
The post Apple confirms CSAM detection only applies to photos, defends its method against other solutions appeared first on 9to5Mac.