One day after Apple confirmed plans for new software that will allow it to detect images of child abuse in users’ iCloud Photos libraries, Facebook’s head of WhatsApp, Will Cathcart, said he is “concerned” by the plans.
In a thread on Twitter, Cathcart called it an “Apple built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control.” He also raised questions about how such a system could be exploited in China or other countries, or abused by spyware companies.
A spokesperson for Apple disputed Cathcart’s characterization of the software, noting that users can choose to disable iCloud Photos. Apple has also said that the system is trained only on a database of “known” images provided by the National Center for Missing and Exploited Children (NCMEC) and other organizations, and that it wouldn’t be possible to make it work in a regionally specific way since it’s baked into iOS.
It’s not surprising that Facebook would take issue with Apple’s plans. Apple has spent years bashing Facebook over its record on privacy, even as the social network has embraced end-to-end encryption. More recently, the companies have clashed over privacy updates that have hindered Facebook’s ability to track its users, an update the company has said will hurt its advertising revenue.