Children make up a significant portion of internet users. It is therefore “imperative to create a secure online environment,” state Martin Sas of the Centre for IT and Intellectual Property Law at KU Leuven and Jan Tobias Mühlberg, Professor of Cybersecurity and Data Protection at the Université Libre de Bruxelles, in a study published for the Greens group in the EU Parliament. At the same time, however, they point out that there is currently “no age verification method that adequately protects the fundamental rights of the individual.”
Most providers of online content intended only for adults “rely on self-declaration of age without further verification,” according to the study. This approach has proven ineffective and easy to circumvent. As a result, governments around the world have been pushing for the introduction of robust online age verification systems (AVS). In Europe, too, some laws have already come into force or are being drafted.
The researchers, however, do not consider the technology fit for use in democratic societies. “The risks associated with implementing age verification include invasion of privacy, data leaks, behavioral surveillance, identity theft and reduced autonomy,” they conclude. Moreover, none of the procedures examined was able to confirm “the age of the user with certainty.” At the same time, implementing such measures could “exacerbate existing discrimination against already disadvantaged groups in society, widen the digital divide and lead to further exclusion.”
Significant security and exclusion risks
According to the analysis, promising privacy-protecting techniques, such as those based on digital identities, are in development. These could enable anonymous age checks. However, they too pose significant security and exclusion risks, and a number of implementation challenges remain to be overcome. There is currently no Europe-wide technical and legal framework that supports their widespread introduction.
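The core idea behind such anonymous age checks is selective disclosure: a trusted issuer verifies the user's age once and attests only a yes/no claim, so the service never learns the user's identity or birthdate. The following is a minimal, hypothetical sketch of that principle; the issuer, key handling, and claim format are illustrative assumptions (real schemes use asymmetric signatures or zero-knowledge proofs rather than a shared HMAC key).

```python
# Illustrative sketch of selective disclosure for age checks (not a real
# protocol): an issuer signs an attestation containing ONLY an over-18
# claim -- no name, no birthdate -- and a service checks the signature
# without learning who the user is.
import hashlib
import hmac
import json
import secrets

# Stand-in for the issuer's key. In practice the issuer would use an
# asymmetric key pair or a zero-knowledge scheme, not a shared secret.
ISSUER_KEY = secrets.token_bytes(32)

def issue_attestation(birth_year: int, current_year: int = 2024) -> dict:
    """Issuer checks the birth date once, then emits a minimal claim."""
    claim = {
        "over_18": current_year - birth_year >= 18,
        "nonce": secrets.token_hex(8),  # makes repeated uses unlinkable
    }
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}

def verify_attestation(att: dict) -> bool:
    """Service validates the issuer's tag; it never sees the birthdate."""
    payload = json.dumps(att["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, att["tag"]) and att["claim"]["over_18"]

attestation = issue_attestation(2000)
print(verify_attestation(attestation))  # True: age confirmed, identity undisclosed
```

The security and exclusion risks the study names show up even in this toy version: whoever controls the issuer key can forge or revoke attestations, and anyone without access to the issuing infrastructure is locked out entirely.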
According to the authors, in order to guarantee the fundamental rights of all internet users, there is “an urgent need for binding risk assessments.” These would have to include impact assessments, for example on freedom of expression, freedom from discrimination, freedom of assembly, data protection and children's rights, and should aim to strike a balance between all of these demands. A comprehensive framework of standards, certification systems and independent controls is also needed “to ensure the security and trustworthiness of age assurance measures and the accountability of technology providers.”
Mandatory age controls online could “restrict the ability of individuals to express themselves freely and to come into contact with others,” the authors emphasize. This particularly affects marginalized population groups who do not have the option of electronic identification or for whom automated facial recognition has proven to be technically or personally impractical.
Children need to be “vaccinated” against problematic content
Last but not least, according to the study, mandatory AVS could hinder children's own development “by denying them access to certain content or services.” Relevant resources often help young people “improve their skills and media literacy.” This applies especially to recognizing and dealing with certain risks, for example on social media or in difficult personal life situations. With this in mind, alternative protection measures such as safer algorithmic recommendations, warnings about harmful content or “panic buttons” may be better suited to supporting children as they explore the online world.
Overall, the researchers identify “a mismatch between the urgency” with which governments want to introduce age controls and the time required to develop robust, secure and trustworthy AVS. The greatest danger lies in introducing such systems without adequate safeguards for fundamental rights, which could lead to excessive invasions of privacy and an increased risk of data leaks and misuse throughout the online world.
With its push for chat control, the EU Commission is currently pressing for all providers of hosting or interpersonal communication services to carry out a risk assessment of the possible misuse of their services for the distribution of known or new child sexual abuse material or for grooming. They would also have to describe any remedial measures already taken, such as the use of AVS. App store providers would have to carry out age verification. The EU Parliament wants to soften these requirements.
Age checks en vogue in politics
The revised Audiovisual Media Services Directive (AVMSD) also requires strict measures such as the use of age verification systems in the fight against violent pornography. Streaming platforms in particular are thereby being held more strictly to account. On this basis, the Irish media regulator Coimisiún na Meán is currently pushing forward rules that would require platform operators to carry out age checks. YouTube, TikTok and Meta (Facebook and Instagram) have their European headquarters in Ireland.
For very large platforms, the Digital Services Act (DSA) also recommends introducing AVS in order to reduce systemic risks. Furthermore, the planned reform of Germany's Interstate Treaty on the Protection of Minors in the Media (JMStV) is intended to oblige providers of operating systems such as Google, Microsoft and Apple to install filters if their systems are “usually” used by children and young people. Website operators would have to indicate an age classification “through a clearly visible label at the beginning of the offer.”
In Great Britain, child protection is being tightened again with the Online Safety Act. Under the law passed in the fall, not only erotic portals such as Pornhub and xHamster but also operators of social networks and other services that potentially distribute pornographic content are expressly obliged to use “highly effective” measures for age verification or estimation.
Felix Reda from the Society for Civil Rights (GFF) strongly warned last year against an obligation to verify age. Such a mandate would not only undermine online anonymity; children often have no identification documents at all, so only highly invasive and error-prone “biometric recording” would remain as an option. Alternative app stores such as F-Droid and many open source projects could not implement age verification given their decentralized structures.
(my)