
Tesla’s ‘Full Self-Driving’ Safety Complaint Investigated by NHTSA



DETROIT – U.S. auto safety regulators are reviewing a complaint from a Tesla driver who says the company’s “Full Self-Driving” software caused a crash.

The driver was beta testing the “Full Self-Driving” software when the Tesla SUV veered into the wrong lane and was hit by another vehicle, according to the complaint filed with the National Highway Traffic Safety Administration.

“The car was in the wrong lane and I was hit by another driver in the lane next to my lane,” the driver wrote.

The vehicle, a 2021 Tesla Model Y, alerted the driver halfway through the turn, and the driver tried to steer to avoid other traffic, according to the complaint. But the vehicle took control and “forced itself into the incorrect lane, creating an unsafe maneuver that put everyone involved at risk,” the driver wrote.

No one was injured in the collision, but the Model Y was seriously damaged on the driver’s side, according to the complaint, which was filed with the agency online Monday and posted in its public complaint database.

The crash happened on November 3 and the driver is from Brea, California, but the complaint did not say exactly where the crash took place. NHTSA does not disclose the names of those who file complaints.

It is likely the first complaint filed with the agency alleging that the “Full Self-Driving” software caused a crash. A message was left Friday seeking comment from Tesla, which has disbanded its media relations department.

An NHTSA spokesperson said Friday night that the agency is aware of the complaint and is in contact with Tesla to gather more information. The spokesperson added that people should report safety concerns to the agency.

The inquiry is another sign that NHTSA is becoming more active in monitoring autonomous and partially automated driving systems under President Joe Biden. In the past, the agency had been reluctant to regulate such systems, saying it did not want to delay potentially life-saving technology.

Tesla says that “Autopilot” and “Full Self-Driving” are driver-assistance systems and cannot drive themselves, despite their names. The automaker says drivers must be ready to intervene at any time.

Selected Tesla drivers have been beta testing the software on public roads, a practice that critics say endangers others because the software is flawed and the drivers are untrained. Other companies that test on public roads keep trained safety drivers on board, ready to intervene.

Beta testing is the field testing of software by users before its full commercial release.

Critics have called on NHTSA to act after several videos posted online appeared to show Tesla’s software making mistakes and drivers having to take over.

“Hopefully this gives @NHTSAgov the ammunition it needs to act on the FSD now instead of waiting for Tesla to lose time through the partial release of data,” Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, wrote on Twitter.

In June, NHTSA ordered automakers to report any incidents involving self-driving vehicles or partially automated driver-assistance systems. It is not clear whether Tesla reported the crash involving the California driver. Two months later, the agency opened a formal investigation into Tesla’s Autopilot partially automated driver-assistance system after a series of collisions with parked emergency vehicles.

NHTSA has also asked Tesla for information about the beta testing, including its requirement that testers not disclose information. The agency said such non-disclosure agreements could impede its ability to investigate.


