While France is struggling to resolve the issue of verifying the age of minors online, in particular regarding access to pornography, California has just passed very strict legislation to protect the youngest users. It will enter into force on July 1, 2024, and is meant to compensate for the shortcomings of online age verification, which cannot reliably and conclusively determine whether an Internet user is an adult or a minor when accessing potentially sensitive content.
The new law requires that “any company that offers an online service intended for those under 18, or likely to be used by minors, must take into account the best interests of children when designing, developing and providing its services. When there is a conflict between commercial interests and the best interests of children, companies must prioritize children’s privacy, safety and well-being over commercial interests.” Far more intrusive than the GDPR, this new law raises many questions, chief among them whether such legislation would be applicable, and above all desirable, in France. Anne Cordier, a specialist in digital uses in education, gives her answers.
Could we see such legislation in France? Would it even be desirable?
Given how rigid it is, I find it hard to imagine it being transposed as is; it is frightening. But the DSA, voted by the European Parliament, and its French transposition, the law on digital services, include some aspects of this legislation, particularly on the regulatory side. It is less rigid, and fortunately so, if only in terms of feasibility.
I do not think this legislation is desirable in France, because in this case the law is given an educational value and power. That is not its role. It can, however, help, as the GDPR did before it, to bring about educational processes. But such a rigid law cannot replace educational support, prevention and emancipation. And then, under the pretext of protecting children, we would be authorizing permanent surveillance of them, since access to their profiles and personal data becomes possible, even desirable. It remains difficult to imagine a society in which individuals prefer to delegate their own regulation to a device.
Given Arcom’s difficulty in implementing the European legislation, isn’t it tempting to move towards this type of much more restrictive law?
Unfortunately, yes! But the temptation is a very bad one. The difficulty of implementation in France stems mainly from frameworks that are far from clear and from a lack of practical solutions. It seems difficult to impose the same controls on platforms that do not have the same technical and technological tools at their disposal. In my opinion, we are more interested in the bandage than in the real wound beneath it. The point is not to monitor or control better, but to treat the excesses at the root through education and prevention. We bear a social and societal responsibility for minors’ access to inappropriate content online, and we are trying to discharge it with heavy-handed use of the legal arsenal rather than by building a better way of living together.
Are there nonetheless elements worth retaining in this Californian legislation?
There is the risk analysis required upstream, before a web service is deployed, which could be interesting. But since the results are not intended to be made public, there is a transparency concern that worries me. How will it be carried out? How will the experts be selected?
As for monitoring algorithms, I have the impression it is impossible to put into practice. How can we imagine platforms granting access to algorithms they jealously keep secret? This kind of measure strikes me as window dressing, there to calm people down, to reassure them. Except that for me, it is a sedative. How will we really be able to control these things?