EU Strikes Political Deal on Landmark Artificial Intelligence Act

On December 9, 2023, the European Parliament and the Council of the European Union ("EU") reached a political agreement on the Artificial Intelligence Act ("AI Act") proposal, the first-ever comprehensive legal framework for AI worldwide.

The AI Act aims to guarantee that AI systems placed on the European market and used in the EU are safe and respect fundamental rights and EU values. By taking a risk-based approach, it seeks to strike a balance that fosters consumer trust as well as investment and innovation in the field of AI within Europe.

The AI Act has an extraterritorial reach, applying to providers placing AI systems on the EU market regardless of their location, to users located within the EU, and to providers and users located outside the EU where the output produced by the system is used in the EU.

In summary, the AI Act will, if enacted:

  • Prohibit certain AI systems, such as those used for cognitive behavioral manipulation, emotion recognition in the workplace, and social scoring by governments or companies;
  • Impose strict requirements on high-risk AI systems, including risk-mitigation systems, high-quality data sets, activity logs, detailed documentation, clear user information, human oversight, and a high level of robustness, accuracy, and cybersecurity;
  • Set out transparency obligations for other AI systems. For instance, when deploying AI systems such as chatbots or "deep fakes," companies must inform users that they are interacting with a machine or that content has been artificially generated; and
  • Introduce specific rules for general-purpose AI models and foundation models to ensure transparency. General-purpose AI models will have to comply with requirements on technical documentation, EU copyright law, and the content used for training. Powerful models that could pose systemic risks will have to comply with additional obligations related to risk management, serious-incident monitoring, model evaluation, and adversarial testing.

The AI Act would establish a governance structure, with an AI Office within the European Commission tasked with overseeing general-purpose AI models, contributing to the development of standards and testing practices, and enforcing the common rules across EU Member States. National competent authorities would oversee AI systems and come together in the AI Board, which would act as a coordination platform and advisory body to the European Commission.

Fines for violations of the AI Act could reach up to €35 million or 7% of the company's global annual turnover, whichever is higher.

The political agreement will now have to be formally approved by EU legislators. As a regulation, the AI Act will not require national implementing measures and will apply directly after a two-year transition period. However, the prohibition of certain AI systems will apply after only six months, and the rules on general-purpose AI after 12 months. A number of "harmonized standards" translating the legal requirements of the AI Act into specific technical requirements are also likely to be developed ahead of the date of application.

Following the formal adoption of the AI Act, the European Commission will launch an AI Pact aimed at encouraging AI developers from Europe and around the globe to commit voluntarily to key obligations of the AI Act before the legal deadlines.

Insights by Jones Day should not be construed as legal advice on any specific facts or circumstances. The contents are intended for general information purposes only and may not be quoted or referred to in any other publication or proceeding without the prior written consent of the Firm, to be given or withheld at our discretion. To request permission to reprint or reuse any of our Insights, please use our “Contact Us” form, which can be found on our website at www.jonesday.com. This Insight is not intended to create, and neither publication nor receipt of it constitutes, an attorney-client relationship. The views set forth herein are the personal views of the authors and do not necessarily reflect those of the Firm.