Draft:CEN/CENELEC JTC 21

From Wikipedia, the free encyclopedia

CEN/CENELEC Joint Technical Committee 21 (JTC 21) is a standardization committee operated by CEN and CENELEC in the field of Artificial Intelligence. Its goal is to produce standardization deliverables that address European market and societal needs and primarily underpin EU legislation, policies, principles, and values.[1] That work includes the adoption of international standards, for example from ISO/IEC JTC 1/SC 42.

The committee comprises the national standards bodies of the 27 EU member states, the United Kingdom, North Macedonia, Serbia, Turkey, Iceland, Norway, and Switzerland.[2] Australia, Canada, and Japan are observers to the committee. Civil society participates primarily through European organisations representing consumers and workers.[3] The secretariat of JTC 21 is held by Dansk Standard, the Danish national standards body. The Chair of the committee is Sebastian Hallensleben.

Structure

JTC 21 comprises five working groups, each of which carries out specific tasks in standards development within the field of Artificial Intelligence. The working groups of JTC 21[4] are as follows:

WG 1 – Strategic Advisory Group. Convenor: Patrick Bezombes (France).

WG 2 – Operational aspects. Convenor: Emilia Tantar (Luxembourg). Key projects:[5][6][7]

- AI Risk Management, led by Renaud di Francesco (Italy)
- AI Conformity assessment framework, led by Emilia Tantar (Luxembourg) and Ansgar Koene (United Kingdom)
- Quality management system for EU AI Act regulatory purposes, led by Adam Leon Smith (United Kingdom)

WG 3 – Engineering aspects. Convenor: James H. Davenport (United Kingdom). Key projects:

- Evaluation methods for accurate computer vision systems, led by Nicolas Cubaude (France)
- Treatment of unwanted bias in classification and regression machine learning tasks, led by Adam Leon Smith (United Kingdom), in collaboration with ISO/IEC JTC 1/SC 42 as ISO/IEC 12791
- AI system logging, led by Adam Leon Smith (United Kingdom), in collaboration with ISO/IEC JTC 1/SC 42 as ISO/IEC 24970
- Assessment of the robustness of neural networks, led by Arnault Ioualalen (France), in collaboration with ISO/IEC JTC 1/SC 42 as the ISO/IEC 24029 series
- Evaluation methods for accurate natural language processing systems, led by Rania Wazir (Austria) and Lauriane Aufrant (France), in collaboration with ISO/IEC JTC 1/SC 42 as ISO/IEC 23282
- Quality and governance of datasets in AI, led by Michael Van Hartskamp (Netherlands)
- Data terms, measures and bias requirements, led by Michael Van Hartskamp (Netherlands)

WG 4 – Foundational and societal aspects. Convenor: Enrico Panai (France). Key projects:

Under approval:

- Environmentally sustainable Artificial Intelligence, led by Valerie Livina (United Kingdom); project reference prCEN/CLC/TR 18145 (WI=JT021010)
- Transparency taxonomy of AI systems, led by Rania Wazir (Austria), in collaboration with ISO/IEC JTC 1/SC 42 as ISO/IEC 12792; project reference prEN ISO/IEC 12792 (WI=JT021022)

Under drafting:

- AI trustworthiness framework, led by Piercosma Bisconti (Italy); project reference prEN XXX (WI=JT021008)
- Competence requirements for AI ethicists professionals, led by Alessio Tartaro (Italy); project reference prEN XXX (WI=JT021019)
- AI-enhanced nudging, led by Enrico Panai (France), in collaboration with ISO/IEC JTC 1/SC 42 as ISO/IEC AWI 25029; project reference prEN XXXXX (WI=JT021006)

Preliminary work:

- Guidance for upskilling organisations on AI ethics and social concerns, led by Juan Pablo Peñarrubia (Spain); project reference prCEN/TS (WI=JT021033)
- Guidelines on tools for handling ethical issues in AI system life cycle, led by Juan Pablo Peñarrubia (Spain); project reference prCEN/TS (WI=JT021034)
- Sustainable Artificial Intelligence – Guidelines and metrics for the environmental impact of artificial intelligence systems and services, led by Juliette Fropier (France); project reference prCEN/TS (WI=JT021035)
- Impact assessment in the context of the EU Fundamental Rights, led by Violeta Mezeklieva (France); project reference prCEN/CLC/TR XXX (WI=JT021026)

WG 5 – Joint standardization on Cybersecurity for AI systems. Convenor: Annegrit Seyerlein-Klug (Germany). Key project: technical solutions to address AI-specific vulnerabilities.

EU AI Act

In December 2022 the European Commission issued a draft standardisation request to CEN and CENELEC.[8] The request was issued in draft form because the AI Act had not yet entered into force. An updated standardisation request is reportedly under preparation.[9]

The standardisation request asked for standards covering risk management, quality management, data quality and governance, oversight, transparency, accuracy, robustness, cybersecurity, and conformity assessment. In response, the committee developed an architecture of standards that it is now working to deliver.[10][11]

The standardisation request required the committee to produce standards by April 2025. In August 2024, however, the Chair of the committee said that the exact final date was unknown and that the committee was working towards producing the relevant standards by the end of 2025.[12]

In September 2024, the Chair also posted a list of the committee's work items on LinkedIn, although it is unclear whether all of them will form part of the final set of standards supporting the AI Act.[7]

A technical standard recognized under EU law is called a harmonised standard. Harmonised standards provide a "presumption of conformity": companies that adhere to them are assumed to comply with the corresponding legal requirements when they undergo conformity assessment. Although voluntary, harmonised standards (see Harmonization (standards)) are used heavily by industry because they lower compliance costs. This critical role has led some to argue that the standardisation process is where the real rule-making of the AI Act will occur.[13]

Controversy

Inclusiveness and industry lobbying

CEN/CENELEC JTC 21 is overwhelmingly dominated by the private sector, including representatives of large US technology companies (sometimes through their European offices, for example in Ireland). They have been accused of setting weak, industry-friendly standards on issues such as corporate social responsibility,[14] and of preferring to interpret new legal requirements as endorsing existing norms, despite the intent of the legislator.[15] The primary response by the European Commission and CEN/CENELEC has been to increase the inclusiveness of the process.

Fundamental rights

While standards have been used for many years for product safety, the AI Act also addresses fundamental rights. Some academics believe that the value-laden nature of the AI Act could plant a "constitutional bomb" under the New Legislative Framework.[13]

Others point to the technical complexity of integrating fundamental rights into the standards ecosystem in the context of risk management.[16]

Adoption of international standards

International standards can be adopted in Europe, or referenced in a way that takes account of European specificities such as fundamental rights-related obligations. The principle of "international first" is a key part of the standardisation ecosystem.

Some stakeholders are concerned about reliance on international groups. Consumer and civil society groups, which have an official position in the European process, argue that they are more easily sidelined at the international level.[17]

In May 2024 the European Commission held a public webinar explaining why key international standards such as ISO/IEC 42001 and ISO/IEC 23894 were not directly suitable.[18]

In September 2024, media reporting described two "extremist factions" within JTC 21.[19] Some tech companies argued that international standards should automatically be usable to demonstrate compliance with the AI Act, while some civil society organizations rejected using international standards to inform the EU process at all.

Quality

JTC 21 faces the challenge of getting the standards ready before the law becomes enforceable while also ensuring they are of sufficient quality. The European Commission has final sign-off and could decline to harmonise the standards, withholding the presumption of conformity.[17] Some believe it is unlikely that the committee will be able to produce standards of sufficient quality to satisfy regulators.[20]

References

  1. ^ "CEN/CLC/JTC 21 - Artificial Intelligence".
  2. ^ "About CEN".
  3. ^ "Civil society: improving, strengthening and legitimising the European standardization system" (PDF).
  4. ^ "CEN/CLC/JTC 21 Subcommittees and Working Groups".
  5. ^ "Artificial Intelligence standardisation Inclusiveness Newsletter, Edition 1" (PDF).
  6. ^ "CEN/CLC/JTC 21 Work programme".
  7. ^ a b "Sebastian Hallensleben, LinkedIn post".
  8. ^ "Draft standardisation request to the European Standardisation Organisations in support of safe and trustworthy artificial intelligence".
  9. ^ "ETUC Artificial Intelligence standardisation Inclusiveness Newsletter, Edition 4" (PDF).
  10. ^ "Status of JTC 21 Activities in Response of the Standardization Request on AI" (PDF).
  11. ^ European Commission, Joint Research Centre (2023). Analysis of the preliminary AI standardisation work plan in support of the AI Act. Publications Office. doi:10.2760/5847. ISBN 978-92-68-03924-3.
  12. ^ "Deadline for AI standards to be postponed, EU standards chief says". (subscription required)
  13. ^ a b Veale, Michael; Zuiderveen Borgesius, Frederik (2021). "Demystifying the Draft EU Artificial Intelligence Act — Analysing the good, the bad, and the unclear elements of the proposed approach". Computer Law Review International. 22 (4): 97–112. arXiv:2107.03721. doi:10.9785/cri-2021-220402.
  14. ^ "The lobbying ghost in the machine" (PDF).
  15. ^ "Inclusive AI governance".
  16. ^ "AI and Product Safety Standards Under the EU AI Act".
  17. ^ a b "Comment: Success of EU's AI Act hinges on standards process that's already causing headaches". (subscription required)
  18. ^ "1st European AI Office webinar on Risk management logic of the AI Act and related standards". 13 May 2024.
  19. ^ "Comment: EU's AI standards adoption falters over international alignment, delayed timeline". (subscription required)
  20. ^ "The EU's AI Act Is Barreling Toward AI Standards That Do Not Exist".