
Driverless, Networked Vehicles on the Rise, French Liability Regulations Lag Behind

In Short

The Situation: Autonomous cars incorporating artificial intelligence ("AI") are now a reality, but French regulations have yet to adjust.

The Issue: The emergence of autonomous cars using AI raises questions about how product liability principles will apply and adapt to them.

Looking Ahead: Carmakers should already be considering what liability risks could be created by incorporating AI in autonomous cars and how to mitigate such risks.


During the 2018 Paris Motor Show, consumers and professionals had the chance to see the most recent models of autonomous and connected vehicles. Automobiles increasingly use AI and, in particular, machine learning to make decisions. Vehicles using Advanced Driver Assistance Systems ("ADAS") are already on the market in France, and carmakers, engineers and programmers are working on the development of ever more autonomous vehicles. However, it remains unclear how liability rules would apply to damage or injury suffered in accidents involving cars controlled, in whole or in part, by AI.

Several sets of regulations must be taken into account when such accidents cause damage or injury.

Product Safety

In France, car producers are subject to the product safety regulations set out in articles L. 421-1 et seq. of the French Consumer Code, which implement EU Directive 2001/95/EC of December 3, 2001 on general product safety ("GPSD").

Under these regulations, only safe products may be placed on the market. A "safe product" is defined as any product which, under normal or reasonably foreseeable conditions of use, does not present any risk, or presents only the minimum risks compatible with the product's use. A producer who knows or ought to know that its product presents risks to consumers that are incompatible with this general safety requirement must (i) immediately inform the competent authorities, which may then notify the other Member States and the European Commission through the Community Rapid Information System (RAPEX), and (ii) take appropriate action, including, if necessary, withdrawing the product from the market, warning consumers, or issuing a recall.

Although neither the French Consumer Code nor the GPSD contains any specific provision on AI, producers of autonomous and connected vehicles should be prepared to demonstrate, as the case may be, the safety of their cars, taking into account the specific risks associated with the AI incorporated in their products, in order to limit their liability. In this respect, the risk assessment methodology provided by the European Commission (Decisions 2004/905/EC and 2010/15/EU) may be used, provided that it is adapted to the specificities of AI.

Product Liability

French product liability law is governed - like the product liability regulations of all EU Member States - by the Product Liability Directive 85/374/EEC (the "Directive"). The French Civil Code (articles 1245 et seq.) provides that a "producer" of goods is liable for damage caused by its product, regardless of the existence of a contract.

There is no real doubt that autonomous vehicles will be considered "products" within the meaning of the Directive. The AI itself and the software incorporated into the vehicle - which may be at the origin of the damage - are also likely to be considered "products", even though there is no clear case law to date on this issue.

A claim can be brought against the "producer" of the vehicle (a broadly defined notion covering the car manufacturer) and/or the "producer" of a specific component part. In that respect, software designers might be classified as "producers".

Product liability regulations could therefore be applied against software designers in the event of a safety defect caused by the AI. However, uncertainties remain, as product liability regulations are not perfectly adapted to AI issues or to technological change in general.

First, product liability regulations require a "defect" that already existed when the product was placed on the market; a producer may escape liability by proving that the defect did not exist at that time. Since AI can evolve after sale (e.g., through software updates and machine learning processes), it is still unknown whether courts will find that car manufacturers are entitled to an exemption from liability in such a case.

Second, it is uncertain whether the car manufacturer or software developer will be able to rely on the "development risk" defense. Under this defense, a manufacturer may be exempted from liability if the state of scientific and technical knowledge at the time the product was put on the market made it impossible to discover the "defect".

While product liability regulations are a strong tool for the protection of victims, many uncertainties remain as to how they will apply to products using AI. In particular, other liability regimes may have to apply where the accident was caused not by an inherent defect but by a choice the autonomous car had to make in a critical situation.

In the United States, traditional tort and product liability laws will likely apply to AI. However, there is currently no federal regulation on the subject, leaving liability questions to the product liability laws of the individual states. U.S. product liability law generally provides remedies for personal injury or property damage, and potential defendants can include manufacturers of finished products, AI manufacturers, or even suppliers of components of finished AI. Theories of recovery can include strict liability, negligence, misrepresentation, and breach of warranty.

Risks of Cyber-Attacks

As autonomous vehicles become more connected and their software components more sophisticated, they will need to process significant amounts of data and communicate with a large number of service providers (for navigation, infotainment, or maintenance services, for instance). As a consequence, there is a growing risk of cyber-attacks and hacking of a vehicle's software components, possibly leading to accidents in the worst cases.

As cybersecurity gradually becomes a key aspect of these vehicles, manufacturers should define a strategy to monitor and control the level of cybersecurity of their connected and/or autonomous vehicles, even before regulatory cybersecurity obligations are imposed. That being said, there is currently no clear answer as to how product liability law can be adapted to these new risks and, in particular, whether implementing cybersecurity measures would allow producers to avoid liability in the event of an attack.

Commission Report on the Directive and Expert Group

On May 7, 2018, the European Commission published a report on the Directive, proposing that certain legal terms be clarified (including "product", "producer", and "defect"). The Commission has also launched an expert group on liability and new technologies, which held its first meeting in June 2018 (see our previous Jones Day Commentary, "Muddy Road Ahead: European Liability Legislation Remains Unclear for Autonomous Vehicles"). A further meeting was held on September 24, 2018, at which the issue of autonomous cars was discussed. No reports or minutes from the Expert Group have been published yet, but the discussions addressed whether the current national legislation could be maintained while taking these technological changes into account.

'Badinter Law' in Case of Road Traffic Accidents

In France, a law dated July 5, 1985 (the 'Badinter' Law no. 85-677) governs the liability of drivers and "gardiens" (keepers of the vehicle) for road traffic accidents. The Badinter Law was enacted to facilitate the compensation of victims. It is likely to apply to autonomous vehicles, as it requires only three conditions: a victim, a road traffic accident, and a motor vehicle. However, this law is not perfectly suited to autonomous vehicles, and in particular to driverless cars (level 5 autonomy). Indeed, the Badinter Law relies on the notions of driver and "gardien" (which can be construed as the person controlling and directing the vehicle) to determine who is liable. In this regard, it is still unclear who acts as the "driver" or the "gardien" of a driverless car.

Many lawyers are calling for a reform of the Badinter Law; one widely contemplated solution is the creation of a public compensation fund for such accidents.


Two Key Takeaways:

  1. The evolving notions of "producer", "product", "defect" and "gardien" under French law will be essential for carmakers and software providers in their efforts to mitigate liability risks.
  2. Carmakers and software providers should stay alert for further developments and make their voices heard where necessary. A watchful eye should be kept on the work of the Commission and the Expert Group, as European developments might affect French legislation.

Lawyer Contacts

For further information, please contact your principal Firm representative or the lawyers listed below. General email messages may be sent using our "Contact Us" form, which can be found at www.jonesday.com/contactus/.

Ozan Akyurek
Paris
+33.1.56.59.39.39
oakyurek@jonesday.com

Sophie Hagège
Paris
+33.1.56.59.39.39
shagege@jonesday.com

Francoise Labrousse
Paris
+33.1.56.59.39.48
flabrousse@jonesday.com

Olivier Haas
Paris
+33.1.56.59.38.84
ohaas@jonesday.com  

Elie Kleiman
Paris
+33.1.56.59.36.39
ekleiman@jonesday.com

Jean-Gabriel Griboul
Paris
+33.1.56.59.38.92
jggriboul@jonesday.com  

Philippe Goutay
Paris
+33.1.56.59.39.39
pgoutay@jonesday.com  

Robert Kantner
Dallas
+1.214.969.3737
rwkantner@jonesday.com  

Craig Waldman
San Francisco / Silicon Valley
+1.415.875.5765 / +1.650.739.3939
cwaldman@jonesday.com  

Elodie Simon
Paris
+33.1.56.59.39.50
elodiesimon@jonesday.com  

Clémence de Perthuis
Paris
+33.1.56.59.39.92
cdeperthuis@jonesday.com

Jones Day publications should not be construed as legal advice on any specific facts or circumstances. The contents are intended for general information purposes only and may not be quoted or referred to in any other publication or proceeding without the prior written consent of the Firm, to be given or withheld at our discretion. To request reprint permission for any of our publications, please use our "Contact Us" form, which can be found on our web site at www.jonesday.com. The mailing of this publication is not intended to create, and receipt of it does not constitute, an attorney-client relationship. The views set forth herein are the personal views of the authors and do not necessarily reflect those of the Firm.