The Tesla technology is designed to assist with steering, braking, speed and lane changes, but its features “do not make the vehicle autonomous,” the company says on its website.
To create the video, Tesla used 3D mapping along a predetermined route from a house in Menlo Park, California, to the company’s then-headquarters in Palo Alto, he said.
Drivers intervened to take control during test runs, he said. And when the team tried to show that the Model X could park itself with no driver, a test car crashed into a fence in Tesla’s parking lot, he said.
“The intent of the video was not to accurately portray what was available for customers in 2016. It was to portray what was possible to build into the system,” Elluswamy said, according to a transcript of his testimony seen by Reuters.
When Tesla released the video, Musk tweeted, “Tesla drives itself (no human input at all) thru urban streets to highway to streets, then finds a parking spot.”
Tesla faces lawsuits and regulatory scrutiny over its driver assistance systems.
In 2021, the U.S. Department of Justice began a criminal investigation into Tesla’s claims that its EVs can drive themselves, following a number of crashes, some of them fatal, involving Autopilot, Reuters has reported.
The New York Times reported in 2021 that Tesla engineers had created the 2016 video to promote Autopilot without disclosing that the route had been mapped in advance or that a car had crashed in trying to complete the shoot, citing anonymous sources.
When asked if the 2016 video showed the performance of the Tesla Autopilot system available in a production car at the time, Elluswamy said, “It does not.”
Elluswamy was deposed in a lawsuit against Tesla over a 2018 crash in Mountain View, California, that killed Apple engineer Walter Huang.
Andrew McDevitt, the lawyer who represents Huang’s wife and who questioned Elluswamy in July, told Reuters it was “obviously misleading to feature that video without any disclaimer or asterisk.”
The National Transportation Safety Board concluded in 2020 that Huang’s fatal crash was likely caused by his distraction and the limitations of Autopilot. It said Tesla’s “ineffective monitoring of driver engagement” had contributed to the crash.
Elluswamy said drivers could “fool the system” by making it register, through feedback from the steering wheel, that they were paying attention when they were not. But he said he saw no safety issue with Autopilot if drivers were paying attention.