Matsko v. Tesla: Too Much Trust in Tesla?

ALLISON SIMON—Who is responsible for accidents caused by self-driving cars? More importantly, are self-driving cars even “self-driving” at all? The recent proposed class action, Matsko v. Tesla, sheds light on these issues. The named plaintiff, Tesla owner Briggs Matsko, claims that Tesla has misled the public since 2016 by advertising its Autopilot, Enhanced Autopilot, and Full Self-Driving (“FSD”) features, operated by advanced driver-assistance system (“ADAS”) technology, as equipping Teslas with full self-driving capabilities, despite Tesla’s failure to produce “anything even remotely approaching a fully self-driving car.” The Autopilot feature is marketed as automating some steering, braking, and acceleration functions, while the later-developed Enhanced Autopilot and FSD features are marketed as more advanced versions of Autopilot, capable of responding to traffic lights and stop signs and automatically changing lanes.

Matsko claims that since 2016, Tesla has pushed out periodic ADAS software updates to its vehicles, but the updates have resulted in vehicles failing to turn, running red lights, and steering directly into objects. Special investigations by the National Highway Traffic Safety Administration (“NHTSA”) into Tesla Autopilot-related crashes support Matsko’s claims of flaws in Tesla’s ADAS technology: nineteen fatalities have been reported across thirty-eight special investigations into crashes involving Tesla’s ADAS technology.

The common factor in most of these crashes is the drivers’ belief that the car can actually drive itself. Drivers of Tesla models equipped with ADAS technology have reportedly shifted their attention away from the road, engaging in activities that take their hands off the wheel, such as eating, drinking, texting, using a smartphone, or reading a book. Forty-two percent of Tesla Autopilot users reported being comfortable letting their car drive for them without watching the road themselves.

According to the NHTSA, “no commercially available vehicles today are capable of driving themselves,” which raises the question of why so many Tesla drivers falsely believe their car can drive itself. Matsko attempts to answer this question in his complaint, alleging that Tesla and its CEO, Elon Musk, made deceptive and misleading statements for years that led customers to believe that Tesla vehicles are capable of full autonomy. Matsko claims that such statements induced him to purchase the Enhanced Autopilot feature, a $5,000 add-on to his Tesla vehicle, which already comes with basic Autopilot.

Matsko supports this claim with an official Tesla blog post from October 2016 stating that “as of today, all Tesla vehicles produced in our factory – including Model 3 – will have the hardware needed for full self-driving capability at a safety level substantially greater than that of the human driver.” The blog post features a video advertising the ADAS technology, which shows a driver letting his vehicle drive for him, accompanied by the language, “The person in the driver’s seat is only there for legal reasons. He is not driving anything. The car is driving itself.”

In truth, the 2016 statements released by Tesla and Musk came as a shock to the Tesla employees working on the family of Autopilot features. It was later revealed that the video advertising the vehicle as fully autonomous was not an accurate representation: the vehicle’s route had been digitally charted ahead of time, and the vehicle had crashed into a roadside barrier during filming. Sterling Anderson, then head of Tesla’s Autopilot project, had even cautioned against marketing the vehicles as “autonomous” or “self-driving” to avoid misrepresentation.

The question of where liability falls in accidents involving these features grows increasingly complicated with the rise of ADAS technology, which functions as a form of automotive artificial intelligence. Equipping vehicles with real-time perception, prediction, and decision-making responsibilities raises the question of whether the vehicle is autonomous to the extent that Tesla, as its manufacturer, should bear the burden of liability for Autopilot accidents.

Despite any claim Elon Musk has made suggesting Teslas are fully autonomous, as of today, Tesla’s Autopilot ranks at Level Two on the Society of Automotive Engineers’ scale for vehicle autonomy, which runs from Level Zero to Level Five. This means that the vehicle can automatically steer and brake, but the driver must constantly supervise these features and is therefore still considered to be “driving.” Thus, the blame falls on the driver in accidents that occur while Tesla’s Autopilot features are activated. Vehicles that achieve Level Five status do not require driver supervision and are considered fully self-driving, shifting the burden of liability away from the driver. However, according to the NHTSA, no vehicles with this degree of autonomy are available for consumer purchase.

Regardless of Musk’s repeated statements that Tesla’s ADAS technology is on the verge of attaining Level Five status, even as it has remained stuck at Level Two for years, Tesla may be in luck due to the warnings it provides to drivers. For one, Tesla’s website warns that “Autopilot, Enhanced Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver . . . the currently enabled features do not make the vehicle autonomous.” Additionally, the vehicle “nags” owners to put their hands on the wheel when Autopilot is activated.

The claims in Matsko v. Tesla are bolstered by a recent suit brought by the California Department of Motor Vehicles, which also alleges untrue or misleading marketing of Tesla’s Autopilot features. Nonetheless, the fate of these claims rests on whether Tesla’s website and in-vehicle disclaimers effectively cure its contradictory statements suggesting full self-driving capabilities.