The National Highway Traffic Safety Administration (NHTSA) released the results of its preliminary investigation into a fatal Tesla accident on Thursday, June 30. On May 7, an Ohio motorist was driving his Tesla Model S on a highway in Williston, Florida, when a semi tractor-trailer pulled out in front of him. According to the Wall Street Journal, the vehicle’s Autopilot system allegedly did not detect the truck, and the 40-year-old man died in the resulting collision.
Tesla Comments on Fatal Crash
Tesla subsequently released a statement in which it admitted that its Autopilot system did not automatically apply the brakes because it did not detect the white side of the trailer against a bright sky. Tesla also pointed to the height of the trailer and a set of “extremely rare circumstances” that resulted in both the Autopilot system and the motorist himself failing to detect the trailer blocking the road ahead. As a result, the Tesla Model S passed directly under the trailer.
Autopilot System Details
Tesla’s Autopilot feature is what NHTSA classifies as “Level 2” autonomy. Some believe the Model S Autopilot system actually deserves a “Level 3” classification, meaning that the vehicle can in fact drive itself at times without human intervention. The deceased motorist was allegedly using the system in this manner when the crash occurred. In fact, the trucker says he observed the Tesla motorist watching the movie “Harry Potter” just before impact.
Allegedly, the driver repeatedly tested the limits of Tesla’s system in the months before he died. For example, he posted a YouTube video that purportedly demonstrated the Autopilot system maneuvering his vehicle out of harm’s way in a possible crash scenario.
The Wall Street Journal reports that regulators may release further guidelines for autonomous cars later this summer. In general, autonomous technologies are welcomed by federal regulators, industry organizations and others motivated to reduce the 90+ percent of traffic deaths caused by human error.
Liability Questions Arise
The WSJ says that driverless and autonomous technologies offer an opportunity for safer driving, but “they also raise questions about liability in the event of a crash.” If the WSJ is correct, this raises the specter of complex liability litigation whenever a crash causing injury or death involves a car engaged in some level of autonomous operation.
This is precisely the concern critics raise: situations will continue to arise in which vehicles with autonomous or automated safety features encounter circumstances that the software developers and engineers did not plan for or anticipate. Complicating matters is the claim by some that both regulators and insurers have been slow to respond to the increasing use of autonomous and automated safety systems. As a result, liability questions appear to be multiplying as various federal and state agencies attempt to sort out jurisdictional and other issues.
General Motors is also developing a Level 2 autonomous system it calls “Supercruise.” However, GM has reportedly delayed the introduction of the system as it continues to identify and correct remaining glitches and other issues.
Although the circumstances of every traffic fatality are unique, in some cases a civil court may hold a manufacturer responsible for vehicle technology or components that do not perform as intended. In some situations, it is possible to seek compensation for a variety of losses and expenses, including, but not limited to, medical expenses, pain, suffering and possible lost wages. When a person dies, survivors may seek compensation for loss of companionship through wrongful death litigation.
If you or a family member is the victim of a traffic accident, we make it possible for you to discuss the matter with an attorney free of charge. We fight hard to get our clients the monetary awards they deserve under the law. For more information about our services, please contact us.