AI Act Series: Deployers and Affected Persons (English version)

Words matter: Terminology in the Dutch translation of the AI Act

In the most recent Dutch version of the AI Act (the Rectificatie text of April 15, 2024), the term “exploitant” is no longer used as the translation of the English term “deployer”. The new translation is “gebruiksverantwoordelijke”.

What is a “gebruiksverantwoordelijke” according to the AI Act? Article 3 defines it in Dutch as “een natuurlijke of rechtspersoon, overheidsinstantie, agentschap of ander orgaan die/dat een AI-systeem onder eigen verantwoordelijkheid gebruikt, tenzij het AI-systeem wordt gebruikt in het kader van een persoonlijke niet-beroepsactiviteit”. In comparison, in the English version “deployer” is defined as “a natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity”.

I can well understand that not many people were satisfied with “exploitant” as a translation of the English “deployer”, but apparently not everyone is satisfied with the new term “gebruiksverantwoordelijke” either.

What other options did we have? Translating “deployer” as “uitvoerder” would not cover the AI Act’s definition either, so from that perspective I can understand why “gebruiksverantwoordelijke” was chosen.

But I was curious and wanted to explore this further, so in addition to my legal background I put on my old linguistics hat to try to understand why this term was chosen.

Gebruiksverantwoordelijke in other regulations – My analysis

I have not found the term “gebruiksverantwoordelijke” in other European machinery and safety regulations, but examining the various terms and definitions in these regulations has helped me better understand why the choice of the term “gebruiksverantwoordelijke” was made.

I compared the terms and definitions of the European Machinery Regulation EU 2023/1230 and those of the European Medical Device Regulation (EU) 2017/745 in the three languages I am proficient in (Dutch, English, and Spanish). Here, I will mostly refer to the English and Dutch translations.

In the Dutch version of the European Machinery Regulation EU 2023/1230, the term “professionele gebruiker” is used and defined as “natuurlijke persoon die in het kader van zijn of haar beroepsactiviteit of werk een machine of verwant product gebruikt of bedient”. The English text defines “professional user” as “a natural person who uses or operates machinery or a related product in the course of his or her professional activity or work”. Thus, instead of “deployer” we find “professional user”, and instead of “gebruiksverantwoordelijke” we find “professionele gebruiker”.

In the Dutch version of the MDR (EU) 2017/745, the term “gebruiker” is defined as “een zorgverlener of leek die een hulpmiddel gebruikt”; this is translated from the English version where “user” is defined as “any healthcare professional or lay person who uses a device”.

Here lies the difference: in the Machinery Regulation and the MDR, users are defined as end users, i.e. those who actually use the devices. In the AI Act, “deployer” has a broader meaning: it covers not only the one who uses the AI system, but also the one responsible for its use.

And we see this reflected in the number of obligations the “deployer” or “gebruiksverantwoordelijke” must fulfill according to the AI Act. Some of these obligations are:

  • Ensuring an adequate level of AI literacy among their staff and other parties operating and using AI systems on their behalf (art. 4).
  • Recording logs of high-risk AI systems (art. 12).
  • Taking responsibility in the AI value chain (art. 25).
  • Registering the AI system in the EU database (art. 49).
  • Ensuring human oversight, cooperating with relevant authorities, and other requirements as described in Article 26.
  • Assessing the impacts on fundamental rights that the use of high-risk AI systems may entail (art. 27).
  • Complying with transparency obligations (art. 50).
  • Providing clear, substantive explanations to individuals affected by a decision made based on the output of a high-risk AI system (art. 86).

It would not be fair or proportionate to expect an end user to meet these requirements 😅 Hence, they are intended for someone who is responsible not only for using the AI system but also for the safety of its end users.

For some, this may sound somewhat similar to the role of data controller under the GDPR. I am curious to hear your thoughts on this.

Where, then, can the terms “gebruiker” and “eindgebruiker” (“user” and “end user”) be found in the AI Act?

These terms appear several times in the recitals and in Annex XIII. We can also find terms such as “user-friendly” and “user interface” a few times. Unfortunately, none of them is defined in Article 3.

It is noteworthy that in the Spanish version, “deployer” (“gebruiksverantwoordelijke”) is usually rendered as “responsable del despliegue”, but it is also sometimes translated as “usuario” (“user”) 😕

Why, then, is the concept of “deployer” or “gebruiksverantwoordelijke” new in the AI Act compared to other European machinery and safety regulations?

This may be due to the nature of the technology. The distinction between “user or end user” (“gebruiker of eindgebruiker”) and “deployer” (“gebruiksverantwoordelijke”) reflects the unique characteristics and risks associated with AI systems compared to traditional devices. AI systems are often not off-the-shelf devices; they are integrated into specific contexts and often have a dynamic nature. Unlike static devices, AI systems can evolve over time through learning and adaptation. This can bring new risks and challenges that go beyond what is typical for traditional devices.

Bringing attention to the ones most affected by AI systems

Another thing I noticed is that the term “affected persons” appears 14 times in the English Corrigendum version, while the Dutch translation does not use a single term for it. In the Dutch Rectificatie version, the term “betrokken persoon” appears 12 times and “getroffen persoon” 10 times, and each of these Dutch terms translates several different English terms, as you can see below:

  • 7 times “affected person” is translated as “betrokken persoon”
  • 4 times “concerned person or persons” is translated as “betrokken persoon”
  • 1 time “population” is translated as “betrokken persoon”
  • 7 times “affected person” is translated as “getroffen persoon”
  • 3 times “person” is translated as “getroffen persoon”

It’s a pity that, just as with the term “gebruiker”, no definition of “getroffen persoon” or “betrokken persoon” can be found in the AI Act. This is, by the way, also the case in the European Machinery Regulation EU 2023/1230 and the MDR (EU) 2017/745, where terms such as “persoon”, “proefpersoon” and “patiënten” (“person”, “subject” and “patients” in English) can be found, but where only “proefpersoon” (“subject”) has been defined.

I understand that certain terms are so unmistakably clear as to be practically self-explanatory, sparing us the need for a definition. However, using many different terms to convey the same meaning in a legal text can lead to confusion. Therefore, given that the AI Act aims to prevent harm to “affected or concerned persons” by mitigating risks to health, safety, and fundamental rights, I think it would have been beneficial to include a consistent definition of this specific and important term in the definition list of Article 3.
