Security researchers are confident that scanning people’s digital repositories for child sexual abuse material (CSAM) and other illegal content cannot be implemented in a way that accomplishes two conflicting tasks at once: ensuring user privacy and helping law enforcement agencies catch criminals.
The researchers believe that Apple’s CSAM-detection plans, like similar proposals from the EU, merely expand the “powers of the state to spy.”
“In other words, this is a very dangerous technology. Even if it is initially used only to scan for CSAM, sooner or later there will be pressure to expand its scope. The technology must be suppressed at the stage of inception; otherwise it will be too late: it will be difficult to resist its spread, and we will lose control over the system,” the report says.
The authors of the report argue that the scanning method “in itself poses serious security and privacy risks for the entire society,” while the assistance it can provide to law enforcement agencies “is not so great.” They note that a client-side scanning algorithm can be bypassed and even turned against users.
Earlier, a Reddit user discovered in iOS 14.3 part of the code of the algorithm (known as NeuralHash) that Apple planned to use to scan photos for images of child abuse (CSAM). By reverse-engineering the code, the user found a way to trick the algorithm and even abuse it.
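To illustrate why such perceptual-matching schemes can be tricked, here is a minimal sketch using a toy “average hash” (aHash), which is a deliberately simplified stand-in and not Apple’s NeuralHash. The image data, function name, and both constructed examples are hypothetical; the point is only that hashing coarse visual features makes both evasion (small edits changing the hash) and collisions (different images sharing a hash) possible in principle.

```python
# Toy perceptual hash: NOT NeuralHash, just an illustrative aHash sketch.
# Bit i of the hash is 1 if pixel i is brighter than the image's mean.

def average_hash(pixels):
    """Hash an 8x8 grayscale image as a 64-bit pattern of above/below-mean bits."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

# A toy 8x8 image: dark left half (10), bright right half (200).
original = [[10] * 4 + [200] * 4 for _ in range(8)]

# Evasion: changing a single pixel shifts it past the mean and flips a bit,
# so the hash no longer matches, although the image looks almost identical.
evaded = [row[:] for row in original]
evaded[0][0] = 120

# Collision: a visually different image (other intensities, 50 and 240)
# whose pixels fall on the same side of its own mean, position by position,
# produces the identical bit pattern.
collision = [[50] * 4 + [240] * 4 for _ in range(8)]

print(average_hash(original) == average_hash(evaded))     # False: evasion
print(average_hash(original) == average_hash(collision))  # True: collision
```

Real perceptual hashes are far more robust than this toy, but the reported attacks on the extracted iOS 14.3 code exploited the same structural weakness: the hash is a lossy summary, so adversarial inputs can be crafted on either side of it.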
Apple has yet to announce when it intends to launch its child protection features, having delayed them in response to concerns from security experts. The researchers’ report runs to 46 pages, with 14 cybersecurity experts contributing to it.