A multi-year investigation into the safety of Tesla’s driver assistance systems by the National Highway Traffic Safety Administration, or NHTSA, is drawing to a close.
Reuters’ David Shepardson first reported on the latest developments Thursday, citing NHTSA acting administrator Ann Carlson. CNBC confirmed the report with the federal vehicle safety regulator.
A spokesperson for NHTSA declined to disclose further details, but told CNBC in an e-mail, “We confirm the comments to Reuters,” and “NHTSA’s Tesla investigations remain open, and the agency generally does not comment on open investigations.”
The agency initiated a safety probe of Tesla’s driver assistance systems — now marketed in the U.S. as Autopilot, Full Self-Driving and FSD Beta options — in 2021 after it identified a string of crashes in which Tesla drivers, believed to be using the company’s driver assistance systems, collided with first responders’ stationary vehicles.
Despite their names, none of Tesla’s driver assistance features make their cars autonomous. Tesla cars cannot function as robotaxis like those operated by GM-owned Cruise or Alphabet’s Waymo. Instead, Tesla vehicles require a human driver at the wheel, ready to steer or brake at any time. Tesla’s standard Autopilot and premium Full Self-Driving systems only control braking, steering and acceleration in limited circumstances.
Tesla CEO Elon Musk — who also owns and runs the social network X (formerly Twitter) — often implies Tesla cars are autonomous. For example, on July 23, an ex-Tesla employee who led the company’s AI software engineering posted on the social network about ChatGPT, and how much that generative AI tool impressed his parents when he showed it to them for the first time. Musk responded: “Same happens with Tesla FSD. I forget that most people on Earth have no idea cars can drive themselves.”
In its owners’ manuals, Tesla tells drivers who use Autopilot or FSD: “Keep your hands on the steering wheel at all times and be mindful of road conditions, surrounding traffic, and other road users (such as pedestrians and cyclists). Always be prepared to take immediate action. Failure to follow these instructions could cause damage, serious injury or death.”
The company’s cars feature a driver monitoring system that employs in-cabin cameras and sensors in the steering wheel to detect whether a driver is paying adequate attention to the road and the driving task. The system will “nag” drivers with a chime and a message on the car’s touchscreen to pay attention and put their hands on the wheel. But it’s not clear that this system is strong enough to ensure safe use of Tesla’s driver assistance features.
Tesla has previously conducted voluntary recalls of its cars due to other problems with Autopilot and FSD Beta and promised to deliver over-the-air software updates that would remedy the issues. But in July, the agency required Elon Musk’s automaker to send more extensive data on the performance of its driver assistance systems to evaluate as part of its Autopilot safety investigations.
NHTSA regularly publishes data on car crashes in the U.S. involving advanced driver assistance systems like Tesla Autopilot, Full Self-Driving or FSD Beta, which are classified as “level 2” under industry standards from SAE International.
The latest data from that Standing General Order crash report says there have been at least 26 fatal incidents involving Tesla cars equipped with level 2 systems from August 1, 2019, through mid-July this year. In 23 of these incidents, the agency report says, Tesla’s driver assistance features were in use within 30 seconds of the collision. In three incidents, it’s not known whether these features were used.
Ford is the only other automaker to report a fatal collision involving one of its vehicles equipped with level 2 driver assistance. It was not known whether the system was engaged before that crash, according to the NHTSA SGO report.
Tesla did not respond to a request for comment.