
Feds Investigate Musk’s Tweet About Disabling Tesla’s Driver Alerts


DETROIT – A tweet from Elon Musk indicating that Tesla could allow some owners who are testing its “Full Self-Driving” system to turn off a warning that prompts them to keep their hands on the wheel has drawn the attention of US safety regulators.

The National Highway Traffic Safety Administration said it has asked Tesla for more information about the tweet. Last week, the agency said the matter is now part of a broader investigation into at least 14 Teslas that crashed into emergency vehicles while using the Autopilot driver assistance system.

Since 2021, Tesla has been beta testing “Full Self-Driving” with owners who have not been trained on the system but are actively monitored by the company. Earlier this year, Tesla said about 160,000 vehicles, roughly 15% of the Teslas currently on US roads, were participating. A wider distribution of the software was rolled out late in 2022.

Despite the name “Full Self-Driving,” Tesla still says on its website that the cars cannot drive themselves. Teslas using “FSD” can navigate roads on their own in many cases, but experts say the system can make mistakes. “We’re not saying it’s quite ready to have no one behind the wheel,” CEO Elon Musk said in October.

On New Year’s Eve, one of Musk’s biggest fans posted on Twitter that drivers with more than 10,000 miles of testing “Full Self-Driving” should have the option to turn off the “steering wheel nag,” the alert that prompts drivers to keep their hands on the wheel.

Musk replied: “Agreed, the update will be available in January.”

It’s unclear from the tweets exactly what Tesla will do. But disabling the driver monitoring system on any vehicle that automates speed and steering would put the driver and everyone else on the road at risk, said Jake Fisher, senior director of automotive testing at Consumer Reports.

“Using the FSD beta, you are part of the experiment,” says Fisher. “The problem is that other road users next to you have not signed up for that test.”

Tesla did not respond to messages seeking comment about the tweet or its driver monitoring.

Auto safety advocates and government investigators have long criticized Tesla’s monitoring system as inadequate. Three years ago, the National Transportation Safety Board listed poor monitoring as a contributing factor in a fatal 2018 Tesla crash in California. The board recommended a better system, but said Tesla has not responded.

Tesla’s system measures torque on the steering wheel to try to make sure the driver is paying attention. Many Teslas have cameras that track the driver’s gaze. But Fisher says those cameras are not infrared like those in some competitors’ driver assistance systems, so they cannot see at night or if the driver is wearing sunglasses.

Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, argues that Tesla is contradicting itself in a way that can confuse drivers. “They’re trying to keep customers happy by letting them take their hands off the wheel, even though the (owner’s) manual says don’t do that.”

Indeed, Tesla’s website says Autopilot and the more sophisticated “Full Self-Driving” system are designed to be used by a “fully attentive driver who has his or her hands on the wheel and is ready to take over at any moment.” It says the systems are not fully autonomous.

NHTSA has noted in documents that numerous Tesla crashes have occurred in which drivers had their hands on the wheel but still were not paying attention. The agency has said Autopilot is being used in areas where its capabilities are limited and that many drivers fail to act to avoid crashes despite warnings from the vehicle.

Tesla’s partially automated systems have been under investigation by the NHTSA since June 2016 when a driver using Autopilot was killed after his Tesla went under a tractor-trailer crossing the road in Florida. The separate investigation into Teslas using Autopilot when they crashed into emergency vehicles began in August 2021.

Including the Florida crash, NHTSA has sent investigators to 35 Tesla crashes in which automated systems are suspected of having been in use. Nineteen people died in those crashes.

Consumer Reports has tested Tesla’s monitoring system, which changes frequently with online software updates. Initially, the system did not warn a driver whose hands were off the steering wheel for three minutes. More recently, the warnings appeared after just 15 seconds. Still, Fisher said he is not sure how long a driver’s hands can stay off the wheel before the system slows the car or shuts off completely.

With the “steering wheel nag” turned off, Fisher said, Tesla may be switching to the camera to monitor drivers, but that isn’t clear.

While the names Autopilot and “Full Self-Driving” imply the cars can drive themselves, Fisher said, it’s clear that Tesla expects owners to remain drivers. But the NTSB says drivers can end up letting their guard down, relying too heavily on the system while looking elsewhere or performing other tasks.

Fisher said those using “Full Self-Driving” may actually be more vigilant behind the wheel because the system makes mistakes.

“I don’t want to take my hands off the wheel when using that system, just because it can do unexpected things,” he said.

Koopman said he doesn’t see a major safety risk in disabling the steering wheel nag because Tesla’s monitoring system is so flawed that disabling it doesn’t necessarily make Teslas any more dangerous.

He said NHTSA has enough evidence to force Tesla to install a better monitoring system.

The agency said it does not comment on open investigations.


