
Tesla Autopilot Fatality

The first reported death involving Tesla's Autopilot feature occurred in Florida. News reports indicate that the Autopilot system failed to recognize a semi-trailer that was making a left turn across the roadway, resulting in a high-speed impact with the side of the trailer. The car went underneath the trailer, killing the occupant. News reports also stated that a Harry Potter DVD was found playing shortly after the crash. Tesla's so-called "driverless car," equipped with cameras and radar, never engaged the brakes.

Tesla released the following image of its Autopilot interface on its website.

[Image: Tesla's Autopilot interface]

There are major problems with "driverless car" features, and they may never be good enough to trust your life to. As a lawyer, I have a hard time trusting a corporate automobile manufacturer to build a product that will keep me safe under every possible scenario encountered on the roadway. Drivers are not perfect, but the people who manufacture these "driverless cars" are also human and cannot be perfect either. Before going further, it is worth taking a minute to talk about the concept of a driverless car in the first place.

A driverless car is essentially an autonomous car: you give it a set of instructions and it executes them, keeping itself on the road and avoiding other traffic. As this technology improves, the "driver" does less and less until there is virtually nothing left to do. And the less the driver has to do, the more likely he or she is to become distracted and do other things. Accidents will happen because drivers do not retake control of the vehicle from the computer when they should.

This is likely to create a battle between large automobile manufacturers (who are probably inadequately equipped to insure all the vehicles on the road) and drivers who are privately insured (at least for the time being). A malfunction of the vehicle that is not due to the negligence of the owner is a product liability claim against the manufacturer, as opposed to a liability action against the driver. If the driver operates the vehicle correctly according to the manufacturer's instructions, there should be no liability for the driver; to hold a driver civilly liable, it must be proved that the driver was negligent or did something wrong. On the other hand, the instructions provided to a "driver," or the computer controlling the car, could contain design or manufacturing defects. An "autonomous" vehicle would essentially eliminate personal liability for accidents and move it into the realm of corporate responsibility.

We all know that a corporation's main purpose is to make money rather than to ensure safety. Large manufacturers have notoriously made decisions based on the business's exposure to liability rather than on making a better product. That is precisely what happened many years ago with the Ford Pinto: rather than recall and fix the fuel tank of every Pinto, Ford decided it would cost less to pay civil judgments as they occurred than to stop putting people's lives at risk. In short, corporate America (or whatever country the vehicle is made in) cannot be trusted with the safety of the public in a way that gives people a reason to turn a blind eye toward responsibility.

With regard to the Tesla accident described above, if the driver had been paying attention to the roadway instead of doing something else, he likely would have seen the semi-truck across the roadway and would have been able to react in time.
