The government has questions about Elon Musk’s Full Self-Driving tweet

The National Highway Traffic Safety Administration has reached out to Tesla with questions about a tweet by Elon Musk suggesting he would remove an important safeguard from the company’s Full Self-Driving (FSD) system. The news was first reported by the Associated Press.

An NHTSA spokesperson confirmed that the agency has reached out to Tesla to gather information about the Musk tweet, in which the controversial billionaire suggested he would eliminate a driver monitoring function that warns users to keep their hands on the steering wheel while using FSD.

The information gathering by NHTSA is part of a broader investigation into Tesla’s Autopilot, which has been linked to over a dozen crashes involving stationary emergency vehicles.

On December 31st, Musk tweeted that Tesla would push an over-the-air software update in January removing the driver monitoring warning in response to a request by Omar Qazi, a Tesla shareholder who tweets under the handle @WholeMarsBlog. Qazi suggested that FSD Beta users with more than 10,000 miles should have the option to turn off the “steering wheel nag.”

Companies like General Motors and Ford currently sell cars with camera-based eye-tracking systems that are meant to make sure drivers pay attention while using hands-free driving features.

For years, regulators and safety experts have implored Tesla to add better driver monitoring to its cars

Tesla, meanwhile, uses torque sensors embedded in the steering wheel to ensure that drivers keep their hands at the ready. But some drivers have used weights and other methods to trick the system into thinking their hands are on the wheel. Consumer Reports discovered that a heavy chain could be used to simulate hands on the steering wheel, allowing researchers to “drive” around in a Tesla Model Y for several miles while sitting in the backseat.

All Tesla vehicles today come standard with a driver-assist feature called Autopilot. For an additional $15,000, owners can buy the Full Self-Driving option, which Musk has repeatedly promised will one day deliver fully autonomous capabilities to Tesla vehicle owners. To date, FSD remains a “Level 2” advanced driver-assistance system, meaning the driver must stay fully engaged in the vehicle’s operation while in motion.

FSD, which is currently available to everyone in North America who has purchased the option, allows users to access Autopilot’s partially automated driver-assist system on city streets and local roads. The system purports to speed up and slow down, make turns (including unprotected left turns, which are extremely difficult for automated systems), and recognize traffic signals and other road signs.

Tesla has gotten into hot water with the federal government over reports of FSD malfunctions and other safety issues. The National Highway Traffic Safety Administration is investigating 16 crashes in which Tesla vehicle owners using Autopilot crashed into stationary emergency vehicles, resulting in 15 injuries and one fatality. Tesla is facing a possible recall of Autopilot, FSD, or both after the government upgraded its investigation earlier this year.

The company has been accused of false advertising by regulators and sued by customers for allegedly misleading them about the capabilities of their vehicles. But FSD is also crucial to Musk’s vision of a fully driverless future. And Musk himself has largely avoided any serious consequences — so far — in his efforts to obscure the limitations of Tesla’s autonomous driving technology.
