Tesla engineer testified that a promotional self-driving video was staged Pipa News

A 2016 video that Tesla used to promote its self-driving technology was staged to show capabilities, such as stopping at a red light and accelerating at a green light, that the system lacked, according to testimony from a senior engineer.

The video, which remains archived on Tesla’s website, was released in October 2016 and promoted on Twitter by CEO Elon Musk as proof that Tesla drives itself.

But the Model X in the video did not drive itself using technology Tesla had deployed, Ashok Elluswamy, director of autopilot software at Tesla, said in the transcript of a July deposition used as evidence in a lawsuit against Tesla over a fatal 2018 crash involving a former Apple engineer.

Elluswamy’s previously unreported testimony represents the first time a Tesla employee has confirmed and detailed how the video was produced.

The video features a tagline that reads, “The person in the driver’s seat is only there for legal reasons. He’s not doing anything. The car drives itself.”

Elluswamy said Tesla’s autopilot team wanted to engineer and record a “demonstration of the system’s capabilities” at Musk’s request.

Elluswamy, Musk and Tesla did not respond to requests for comment. The company has, however, warned drivers to keep their hands on the wheel and maintain control of their vehicles while using autopilot.

The Tesla technology is designed to assist with steering, braking, speed and lane changes, but the features “do not make the vehicle autonomous,” the company says on its website.

To create the video, Tesla used 3D mapping on a predetermined route from a house in Menlo Park, California, to Tesla's then-headquarters in Palo Alto, he said.

Drivers stepped in to take control during test drives, he said. While trying to show that the Model X could park itself without a driver, a test car crashed into a fence in Tesla’s parking lot, he said.

“The intent of the video was not to accurately represent what was available to customers in 2016. It was to show what was possible to build into the system,” said Elluswamy, according to a transcript of his testimony seen by Reuters.

U.S. Justice Department investigates after series of crashes

When Tesla released the video, Musk tweeted, "Tesla drives itself (no human input at all) through urban streets to highway to streets and then finds a parking spot."

Tesla faces lawsuits and regulatory oversight over its driver assistance systems.

The U.S. Justice Department opened a criminal investigation in 2021 into Tesla's claims that its electric vehicles can drive themselves, following a number of crashes, some fatal, involving autopilot, Reuters has reported.

Authorities investigate deadly Tesla crash with no one behind the wheel

Two men died after their Tesla Model S crashed into a tree and caught fire on April 17, 2021, in The Woodlands, Texas. One man was discovered in the front passenger seat and the other in the back, prompting authorities to investigate whether the car was in the fully self-driving mode that Tesla was promoting ahead of a wider release of its upgrade from semi-automatic driving.

The New York Times reported in 2021 that Tesla engineers made the 2016 video to promote Autopilot without revealing that the route had been mapped out in advance or that a car had crashed while completing the shoot, citing anonymous sources.

When asked if the 2016 video showed the performance of the Tesla autopilot system available in a production car at the time, Elluswamy said, “It doesn’t.”

Elluswamy was deposed in a lawsuit against Tesla over a 2018 crash in Mountain View, California, that killed Apple engineer Walter Huang, 38.

Andrew McDevitt, the attorney representing Huang's wife, who questioned Elluswamy in July, told Reuters it was "clearly misleading to show that video without any disclaimer or asterisk."

The National Transportation Safety Board concluded in 2020 that Huang’s fatal crash was likely caused by his distraction and autopilot’s limitations. It said Tesla’s “ineffective monitoring of driver involvement” contributed to the crash.

Elluswamy said drivers can "fool the system," leading a Tesla system to conclude from steering-wheel feedback that they are paying attention when they are not. But he said he saw no safety issue with autopilot as long as drivers pay attention.
