Clarity for the Automated Vehicle Ethics Debate

Split-second decisions made behind the wheel of a car can, in rare cases, result in the injury or death of passengers or bystanders. Until now, humans have faced those decisions. But as automated vehicles (AVs) take to the roads, they will inevitably face similar choices.

When a human driver is involved in an unavoidable crash, they usually get the benefit of the doubt that they acted in society’s best interest. AVs, however, will run code that could effectively predetermine the vehicle’s actions, and the outcome, in a split-second crash situation.

Experts continue to debate how AVs should react in such situations, and who should be responsible for overseeing the “decision-making” code. A new report from Germany takes the issue of AV ethics head-on, providing clear guidelines for tough ethical situations along with recommendations for how governments and private industry should approach them.

Over the past nine months, the German Federal Transport Ministry convened an Ethics Commission to examine some of the most complex ethical questions surrounding connected and automated vehicles. In June, the Commission released a report summarizing its findings. It takes a strong stance and answers many of the questions we have about AVs here in the U.S.

The report lists 20 primary points related to ethics and AVs, some of which are obvious but essential to the ethics debate. For example:

  • The safety of human lives should always be the primary goal of AV systems
  • Designers and programmers should emphasize avoiding crashes altogether
  • AV developers have the responsibility to design the system to avoid security breaches
  • Technology and auto firms need to be transparent and proactively educate consumers and the general public about what their technology can and cannot do

Of these 20 points, three address some of the more nuanced ethical issues and are relevant to the current AV policy debate in the United States.

The public sector is responsible for guaranteeing safety.

The German Ethics Commission states that the public sector’s role is to ensure that automated and connected vehicle technologies are safe: the driving systems “need official licensing and monitoring,” similar to what Eno recommended in its recent report, Beyond Speculation. In the report, “licensing” refers to what we call technology “certification” here in the U.S., and it is how the public sector oversees and ensures the safety of these vehicles. The Commission also emphasized that the company developing a car’s software system is liable for anything that goes wrong with it, just as is already the case under product liability law.

AV developers must clearly distinguish the responsible party in a “hot potato” situation.

Automated vehicles are classified into six levels (0 through 5) based on how much of the driving task the computer, rather than the human, performs. Level 2 and Level 3 systems handle most of the driving task but rely on human drivers to take control in certain situations. In these cases, the human driver must either constantly monitor the road (Level 2) or remain in the driver’s seat, ready to take over when the system requests it (Level 3).

However, how much time the system allows before shifting control of the car back to the human (or vice versa) is not well defined. The Commission’s report states that the car must be designed so that it is always clear which party is responsible for driving and what those responsibilities include.

For higher levels of automation (Levels 4 and 5), there should be no need for a sudden handover from computer to human. And in emergency situations, the vehicle must be able to bring itself to a “safe condition” on its own.
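To make that responsibility distinction concrete, here is a minimal sketch in Python of how the level-dependent handover rules described above could be encoded. The names (AutomationLevel, responsible_party, takeover_requested) are hypothetical and chosen for illustration; neither the Commission’s report nor any actual AV system defines such an interface.

```python
from enum import IntEnum

class AutomationLevel(IntEnum):
    # SAE J3016 driving automation levels (0 = no automation, 5 = full automation)
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2      # Level 2: human must constantly monitor the road
    CONDITIONAL_AUTOMATION = 3  # Level 3: human must be ready to take over on request
    HIGH_AUTOMATION = 4         # Level 4: system handles its own fallback in its domain
    FULL_AUTOMATION = 5         # Level 5: system handles the entire driving task

def responsible_party(level: AutomationLevel, takeover_requested: bool) -> str:
    """Illustrative only: which party is expected to be driving at a given moment."""
    if level <= AutomationLevel.PARTIAL_AUTOMATION:
        return "human"  # Levels 0-2: the human driver supervises at all times
    if level == AutomationLevel.CONDITIONAL_AUTOMATION:
        # Level 3: the system drives until it requests a handover
        return "human" if takeover_requested else "system"
    # Levels 4-5: no sudden handover; the system must bring itself to a safe condition
    return "system"

if __name__ == "__main__":
    print(responsible_party(AutomationLevel.CONDITIONAL_AUTOMATION, takeover_requested=True))  # human
    print(responsible_party(AutomationLevel.HIGH_AUTOMATION, takeover_requested=True))         # system
```

The Commission’s point is that this mapping must be unambiguous to the person in the vehicle at every moment, not that responsibility can literally be reduced to a lookup table.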

The trolley dilemma does not apply to AVs.

The trolley dilemma arises when the AV system’s programmers must make an explicit decision to save one (or more) human life over another. How the vehicle should respond, whether by protecting its passengers or by minimizing the overall risk to human life, has been a central question over the past few years.

The Commission’s report puts this problem in context. It ultimately concludes that the scenarios in which an AV would have to make such a choice are so complex and unpredictable that there is no way for ethical decisions to be standardized in programming code or regulation.

Even if ethicists could come up with a definitive answer to the dilemma, it would be futile to program it. AV developers should instead focus all attention on avoiding crashes. In the rare instance where there is a crash with an unavoidable loss of life, an independent or governmental body should evaluate the situation and determine the lessons learned.

As we come to rely more and more on technology in our vehicles, it is important that we carefully consider the implications of automating the driving task. Germany’s Ethics Commission concisely and thoughtfully addresses some of the most challenging ethical debates that have arisen so far. Here in the United States, we do not need to reinvent the ethical wheel; instead, we should take our cues from countries such as Germany that are already sorting through these challenging issues.

 
