Ongoing and past projects

ANEC AI and consumers Task Force 2025

Standards play a crucial role in the implementation of the European AI Act, providing a presumption of conformity with the legal requirements and ensuring that high-risk AI systems respect fundamental human and consumer rights.

Strong consumer representation in the standardisation process is essential to ensure that AI systems reflect the diversity of our societies. Standards grounded in consumer protection, safety and accessibility are crucial to making AI trustworthy for consumers.

ANEC is working to strengthen the capacity of consumer and civil society advocates, helping to make their voices heard in AI standardisation processes. ANEC supports the involvement of consumer and civil society experts in CEN-CENELEC Joint Technical Committee 21 (JTC 21), which focuses on AI standards.

Thanks to funding from the European AI & Society Fund, ANEC set up an AI and consumers Task Force composed of nine consumer experts from eight countries (Austria, Germany, Cyprus, Bulgaria, the Netherlands, Greece, Spain and Italy). Its members held their kick-off meeting in Brussels on 28-29 April 2025, received training on AI standardisation and exchanged views with representatives of the EC’s AI Office and the CEN-CENELEC Management Centre.

The AI Task Force will remain active throughout the year at national and European level, following up on and contributing to the AI standards work in JTC 21.


Mercator Project

Previously, ANEC and BEUC led a project to get consumers’ interests taken on board in standards for Artificial Intelligence (AI). The project was financed by the Mercator Foundation (2022-2024).

Why is AI standardisation so important and why should you engage in it? 

The future AI Act will be the first-of-its-kind legislation regulating AI systems. One of its objectives is that AI systems placed on the EU market respect fundamental rights, including the right to non-discrimination, data protection, consumer protection and protection of children. 

The proposed regulation gives a prominent role to technical standards, in particular harmonised standards[1]. If an AI system complies with harmonised standards, it will benefit from a presumption of conformity with the legal requirements. Thus, there will be a strong incentive for AI companies to rely on these harmonised standards to prove compliance with the AI Act.

It is essential to bring the voice of civil society to the table to ensure that the technical standards will comprehensively protect consumers and society at large.

This is all the more important as the proposed AI Act remains vague on many concepts relating to fundamental rights. For instance, it does not define what an unwanted bias is. As they translate the AI Act’s requirements into technical requirements, standardisation bodies will have to fill in the gaps and answer these questions. 

As part of our project, we want to give civil society actors the tools to get involved in the standardisation of AI systems.

ANEC already takes part in the drafting of AI standards at the European level. ANEC is represented there by experts who defend the consumer viewpoint in standardisation working groups. BEUC supports ANEC in its efforts to make the European standardisation system more inclusive, especially for civil society stakeholders.

We cannot act alone, and we invite civil society organisations to join us in promoting the voice of civil society in AI standardisation. 

How do I engage in AI standardisation? Choose your path

In this section, we give you the essential information you need to engage in AI standardisation. We also recommend following this quick, free and interactive course on the basics of the European standardisation system.

The European Commission sent the standardisation request on AI to the European Committee for Standardisation (CEN) and the European Committee for Electrotechnical Standardisation (CENELEC). These bodies are European standardisation organisations, i.e., they are official providers of European standards. CEN-CENELEC JTC 21 “Artificial Intelligence” is where the standards implementing the AI Act are being discussed. CEN-CENELEC JTC 21’s work is divided into working groups (WGs) covering specific aspects of AI systems.

 

3 reasons for civil society to get involved in AI standardisation

  • Currently, industry sends the vast majority of experts drafting AI standards, while civil society experts are few. A better balance must be found. 
  • Engaging in standardisation is a concrete way of having an impact on the implementation of the AI Act. Important decisions on definitions (e.g., “unwanted bias”), privacy, accuracy and transparency take place in the standardisation working groups. 
  • Getting involved in AI standardisation opens the door to a new world that is highly relevant for civil society and is gaining more and more importance in the political discussion at the EU level due to the geopolitical relevance of standardisation.


Stay in touch

ANEC will be happy to answer any questions you might have; please feel free to contact us.

Please stay in touch with us:

ANEC newsletter

ANEC Twitter / ANEC LinkedIn

BEUC Twitter / BEUC LinkedIn

[1] Harmonised standards are European standards adopted on the basis of a request made by the European Commission for the application of EU legislation.

[2] National Standardisation Bodies (or National Committees) set up national mirror committees to discuss the drafting of standards at the European level and form a national position on them. National mirror committees exist on all kinds of topics, but the AI mirror committee is the relevant one here for the purpose of influencing the implementation of the AI Act.

 
