Apple, the company that proudly touted its user privacy bona fides in its recent iOS 15 preview, has introduced a feature that seems to run counter to its privacy-first ethos: the ability to scan iPhone photos and alert the authorities if any of them contain child sexual abuse material (CSAM). Some say the feature builds a back door into Apple devices, something the company swore it would never do. The new scanning capability has also confused many of Apple's customers and, reportedly, upset many of its employees. While fighting child sexual abuse is unambiguously a good thing, privacy experts aren't thrilled about how Apple is choosing to do it.