Communications companies are the main actors capable of detecting explicit videos or videos that constitute child sexual abuse material. Currently they are encouraged to do this voluntarily -- a practice that has existed for over a decade and has proved essential for child protection. Many major investigations in recent years were made possible by this approach.
The Commission proposal for a regulation to prevent and combat child sexual abuse[1] seeks to make this approach mandatory. The proposal is clearly defined: its terms are closely monitored by data protection authorities and come with guarantees. A similar approach is already widely used to detect malware and spam.
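As an illustration of that analogy, known-material detection can be sketched as matching a file's digest against a list of hashes of previously identified content, much as anti-malware scanners flag known bad files. The blocklist and helper below are hypothetical; deployed systems typically use perceptual hashes (e.g. PhotoDNA) that also match re-encoded or slightly altered copies, not plain cryptographic hashes.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known, previously
# identified files (illustration only; real systems use perceptual
# hashes maintained by vetted organisations).
KNOWN_HASHES = {
    # SHA-256 digest of the bytes b"foo", standing in for a known file
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def is_known_material(data: bytes) -> bool:
    """Return True if the file's SHA-256 digest appears on the blocklist."""
    return hashlib.sha256(data).hexdigest() in KNOWN_HASHES

print(is_known_material(b"foo"))  # → True
print(is_known_material(b"bar"))  # → False
```

Because matching is done against hashes rather than by inspecting content semantically, this style of detection only flags exact (or, with perceptual hashing, near-exact) copies of material already known to authorities.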
The proposal includes provisions on detection orders, which are measures of last resort when prevention measures are insufficient. They target only high-risk services, or, where possible, specific users or groups.
Such orders are issued by a judicial or independent administrative authority balancing all the rights at stake and after a thorough assessment of their necessity and proportionality.
The proposal is technology neutral, while acknowledging encryption as an important tool to guarantee the security and confidentiality of users' communications.
When executing a detection order, providers are required to put in place the requisite safeguards to ensure the security and confidentiality of the communications of users.
The impact assessment[2] analysed the risk of false positives, which for some technologies is estimated at no more than 1 in 50 billion[3].
Nonetheless, the proposal includes safeguards against such risks: providers must deploy technologies that are the least privacy-intrusive in accordance with the state of the art in the industry and that limit the error rate to the maximum extent possible, and judicial redress is available if needed.