Apple confirms CSAM detection only applies to photos, defends its method against other solutions

Apple continues to offer clarity around the CSAM (child sexual abuse material) detection feature it announced last week. In addition to publishing a detailed frequently asked questions document earlier today, Apple has now confirmed that CSAM detection applies only to photos stored in iCloud Photos, not to videos.

The company also continues to defend its implementation of CSAM detection as more privacy-friendly and privacy-preserving than the approaches taken by other companies.



\"IT電腦補習
立刻註冊及報名電腦補習課程吧!

Find A Teacher Form:
https://docs.google.com/forms/d/1vREBnX5n262umf4wU5U2pyTwvk9O-JrAgblA-wH9GFQ/viewform?edit_requested=true#responses

Email:
public1989two@gmail.com






www.itsec.hk
www.itsec.vip
www.itseceu.uk
