If a fully autonomous vehicle kills someone, who is responsible? If there is one thing you learn early in your business career, it's the law: your liability and your exposure. It doesn't matter what business you're in; if you have clients or employees, there is a solid chance you are liable for something when things don't work out.
When "you" drive a car and hit something or even kill someone, there is no doubt as to who is legally liable for the damage or death. Sure, there are different circumstances that will impact the degree to which you are held liable for the accident, but there is no doubt that the liability is attached to you and only you. But what happens when there isn't a driver in the car? What happens when there is only a machine learning black box operating the vehicle? Well, it gets a bit sticky.
Two crashes have been in the headlines lately: a Tesla that crashed while on Autopilot, and an Uber self-driving car that killed a pedestrian. In both cases, liability for the crash falls on the drivers who were in the car. That's how the law works. There will be extenuating circumstances that determine the degree of each driver's responsibility, since the system had primary control, but there is no ambiguity about who is attached to these accidents: John Doe or Jane Doe. Put another way, the humans in the car.
What we think is a more critical issue is determining who is responsible when no one is in the driver's seat and only passengers or cargo are on board. What happens when the driverless system kills its passengers, or kills people outside the car?
These are some of the topics we will be sharing our thoughts on, and we encourage you to reach out with your own thoughts on this series on Liability and Automation. Agreement, disagreement, and suggestions are all welcome.