Apple’s PLAN To SCAN Your Photos [Their REAL Motive]

David & David discuss Apple's plan to scan everyone's iCloud Photos for CSAM, or child sexual abuse material. It's a complicated issue. Of course we believe in protecting children from predators — but at what cost?

And is Apple really doing this for the children, or for themselves? We dive into Apple's plan to scan everyone's photos for matches to known child pornography, and talk about the potential ramifications of rolling out this image-scanning algorithm to the masses.
