Data Could Assist Tesla in Defense Against Autopilot Liability

By Jessica Dye | July 22, 2016

The large volume of data Tesla Motors Inc collects from its cars on the road has armed it with information to publicly counter, and possibly defend against in court, claims about the safety of its Autopilot driving-assist software, according to lawyers familiar with such cases.

Autopilot, available in Tesla’s Model S and Model X vehicles, helps users steer and stay in lanes. It has come under increased scrutiny since a driver died in a May 7 Florida crash while using the semi-autonomous technology.

Should Tesla be called on to defend the technology in court, lawyers said information collected by the electric car maker could be central to judging liability.

The stakes are high for an industry that is investing heavily in self-driving technologies and car connectivity, which are frequently touted as safety improvements. As more connected cars roll out, automakers will have access to more detailed data as to what a vehicle, and its driver, were doing before a crash.

That kind of information has already begun appearing in courtrooms from event data recorders, or EDRs, which have become standard in new cars over the past decade. Often called automotive “black boxes,” EDRs generally record data like speed, seat-belt usage and pedal position in the seconds before and after a crash.

“I’ve had EDRs that have helped me in a case, and EDRs that have basically told me, don’t take that case,” said Don Slavik, a plaintiffs’ attorney who handles automotive product liability cases.

Tesla’s wireless data collection appears to be more extensive than that of many onboard EDRs. Following a July 1 crash in Pennsylvania, Tesla was able to see whether Autopilot was engaged, whether the driver’s hands were detected on the steering wheel and the amount of force applied to the accelerator, company spokesperson Khobi Brooklyn said in an interview last week.

A Reuters review of court dockets nationwide did not reveal any claims filed against Tesla over crashes while Autopilot was engaged. In the event of a lawsuit, though, the company’s information could “be very helpful if it can be validated and verified and has sufficient clarity,” said Slavik.

Brooklyn declined to comment on the potential use of such data in litigation. The company states in user manuals that it reserves the right to use collected data for its defense in a lawsuit.

Tesla’s privacy policy allows drivers to opt out of sharing their cars’ information with the company, but warns that doing so could result in “reduced functionality, serious damage or inoperability,” and the disabling of certain vehicle features.

Tesla has frequently used its data to defend itself in the court of public opinion. In a July 14 Twitter post, Tesla Chief Executive Officer Elon Musk seized on that data, saying on-board vehicle logs showed Autopilot was turned off at the time of the Pennsylvania crash and would have prevented it had it been engaged.

The company has said its data suggested that in a recent Model X accident in Montana, the driver’s hands were off the wheel while the car’s Autosteer function was engaged. Tesla’s terms of use for that system require drivers to keep their hands on the wheel.

The courtroom, however, is different from the court of public opinion, said Bryant Walker Smith, an assistant professor of law and engineering at the University of South Carolina, who studies self-driving vehicles.

“Summaries and spin will be much less credible than analysis supported by raw data that others can evaluate,” he said.

In the fatal Florida crash, Tesla has said the car’s Autopilot system was engaged. Investigators are still trying to determine the cause of the accident.

(Reporting by Jessica Dye; Editing by Anthony Lin and Jeffrey Hodgson)
