Tesla staged Autopilot demo video, says director of Autopilot software

Tesla’s much-hyped 2016 video of its Autopilot driver-assist system “driving by itself” did not actually show the car driving itself, according to Ashok Elluswamy, Tesla’s director of Autopilot software.

In a recent deposition, Elluswamy said that the video, titled “Full Self-Driving Hardware on All Teslas,” was intended to “portray what was possible to build into the system” rather than what customers could actually expect the system to do.

The video, which Tesla CEO Elon Musk shared in a tweet declaring that “Tesla drives itself,” shows a Tesla driving and parking itself, avoiding obstacles, and obeying red and green lights. It opens with a title card saying that “the person in the driver’s seat is only there for legal reasons” and that “he is not doing anything. The car is driving by itself.”

But according to Elluswamy, the demo was “specific to some predetermined route,” unlike the production version of the tech, which relied only on input from cameras and sensors. “It was using additional premapped information to drive,” he said, after telling lawyers that the route the car followed had been 3D mapped in advance. At the time, Elluswamy was an engineer on the team that made the video.

In other words, Tesla’s Autopilot was not capable of dynamic route planning; the company’s engineers had to map out the route the car would take before filming the promotional video.

The New York Times had previously reported the premapping, pointing out that consumers using the system wouldn’t have that luxury, but now, we have it on the record from a Tesla official. The deposition, which you can read in full below, was taken as part of a lawsuit filed by the family of Wei “Walter” Huang, who died in 2018 when his Model X with Autopilot engaged crashed into a highway barrier.

Elluswamy also said that the version of Autopilot available when the video was produced had “no traffic-light-handling capability,” despite it being shown in the video. What isn’t clear is exactly how the video was made; Elluswamy said he doesn’t recall whether the person in the driver’s seat controlled the acceleration and braking or whether the car did. It’s also not clear if the car was running software capable of recognizing traffic signals.

The admission isn’t the only part of Elluswamy’s deposition that’s raising eyebrows. Mahmood Hikmet, head of research and development at Ohmio Automation, highlighted parts of the transcript where Elluswamy said he doesn’t know about fundamental safety considerations, such as Operational Design Domain, also known as ODD.

The phrase refers to the conditions, such as geography or weather, under which an autonomous vehicle is designed to operate. For example, if an autonomous vehicle is only capable of driving in a specific city in ideal weather conditions, then a rainy day in a different city would be outside of its ODD.
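To make the concept concrete, here is a minimal, hypothetical sketch of an ODD check in Python. The class, field names, and values are invented for illustration; they are not drawn from Tesla, SAE, or any real autonomous-driving codebase.

```python
from dataclasses import dataclass

# Hypothetical illustration of an Operational Design Domain (ODD) check.
# Everything here is invented for the example; real ODDs cover many more
# dimensions (road type, speed range, lighting, traffic conditions, etc.).

@dataclass(frozen=True)
class OperationalDesignDomain:
    allowed_cities: frozenset[str]
    allowed_weather: frozenset[str]

    def permits(self, city: str, weather: str) -> bool:
        """Return True only if the current conditions fall inside the ODD."""
        return city in self.allowed_cities and weather in self.allowed_weather

# An ODD limited to one city in clear weather, as in the example above.
odd = OperationalDesignDomain(
    allowed_cities=frozenset({"Palo Alto"}),
    allowed_weather=frozenset({"clear"}),
)

print(odd.permits("Palo Alto", "clear"))  # True: inside the ODD
print(odd.permits("Seattle", "rain"))     # False: rainy day in a different city
```

In this framing, a system that engages regardless of city or weather is simply one with a very broad ODD, which is why the concept matters for safety analysis.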

While you wouldn’t expect the phrase to come up in everyday conversation or appear in marketing materials, it is definitely something you’d expect the person directing the Autopilot program to know about. The Society of Automotive Engineers (SAE), the organization behind the levels of autonomy that Tesla itself has referenced, calls ODD “the key to autonomous vehicle safety,” and Waymo put out an entire study evaluating its software’s performance in a specific domain.

Musk seemed dismissive of ODD during a podcast appearance with Lex Fridman. He said that the acronym “sounds like ADD,” then answered a question about the philosophy behind Tesla’s wide-ranging ODD (compared to systems like GM’s Super Cruise, which will only work in certain conditions) by saying that it’s “pretty crazy” to let humans drive cars instead of machines.


