The observatory of public algorithms: towards greater transparency in public action?
Public algorithms
The applications of algorithms in the public sector are many and varied, in terms of sector of activity (education, human resources, taxation, health, employment, public safety, etc.), purpose of processing (calculating entitlements, predicting risk, matching, targeting, etc.) and method of intervention: these algorithms are used to document and guide public decisions, and increasingly to "automate" them.
Algorithms in the public sector have specific features compared with those developed and used by the private sector: they are supposed to serve the general interest, they are sometimes used to enforce the law, and citizens often cannot avoid them.
Their use thus raises important questions in terms of accountability, intelligibility and equality of public action.
In 2016, the Law for a Digital Republic introduced transparency requirements into the Code of Relations between the Public and the Administration for administrations that base their decisions on algorithmic processing (Article 312-1-3). More specifically, administrative authorities may make individual administrative decisions exclusively on the basis of an algorithm only under certain conditions. When such decisions are taken, citizens benefit from enhanced information and can appeal against them.
In an April 2021 circular, the Prime Minister reminded administrations that they "must constantly seek the best circulation of data, algorithms and codes, in open formats that can be exploited by third parties".
In 2021, France strengthened this commitment by planning, as part of the Open Government Partnership, to draw up monographs of public algorithms and to support two ministries and two local authorities in their efforts to inventory the algorithms they use.
" The use of algorithms by public administrations remains opaque, " observe Camille Girard-Chanudet, sociologist and postdoctoral researcher at the Centre d'Études de l'Emploi et du Travail (CNAM), Estelle Hary, designer and independent researcher, and Soizic Pénicaud, consultant and independent researcher on the effects of artificial intelligence on human rights.
" In the rare cases where these tools are communicated, the information disseminated is often incomplete or difficult to access. Only a few administrations have published inventories of the algorithms they use, and most of these are incomplete... This lack of transparency undermines democratic control over the activities of public administrations. It deprives citizens of a global vision of the use of algorithmic tools in public services: in which administrations are algorithms used? With what objectives? According to what methods? In what organizational contexts? Are they developed by private service providers? How much do they cost? What impact do they have on the target audiences (individuals, companies)? Without clear information, it's impossible to understand, analyze and, if need be, criticize the use administrations make of these algorithmic tools.
This observation echoes that made by the Conseil d'État in its 2022 report on artificial intelligence. It notes that the obligation for administrations employing at least 50 agents or employees (in full-time equivalents) to spontaneously publish online the rules defining the "main algorithmic processing operations" they use, when these form the basis of individual decisions, "is not accompanied by any sanction, and its effectiveness is doubtful in practice".
The Inventory of public algorithms: a tool for transparency
It was in response to this "information vacuum" that Camille Girard-Chanudet, Estelle Hary and Soizic Pénicaud launched a citizen inventory of public algorithms, with two aims:
Identify the algorithms used by government agencies, centralizing the information so that it is useful to civic, professional, activist and research associations and collectives.
Show that algorithms are neither neutral nor autonomous, highlighting the importance of the choices and contexts that influence their operation.
In its first version (November 2024), the inventory covers:
All types of algorithmic systems, whether they rely on machine learning, deep learning, rule-based systems (e.g. calculation files) or other technologies, and whether or not they are involved in decision-making.
Algorithms developed or used by central administrations and state agencies, excluding, for the time being, local authorities and the hospital civil service.
Algorithms that are publicly documented, either in government documents or by third parties.
In her November 2024 report on algorithms and AI systems in public services, the Défenseure des droits welcomes the establishment of local and national citizen observatories, including the Observatoire des algorithmes publics (ODAP) project, "which must not, however, overshadow the responsibility of the administrations concerned in this respect".