A Tesla involved in a fatal crash on a Southern California freeway in the United States last week was operating on Autopilot at the time, authorities said.
The May 5 crash in Fontana, a city 80 km (50 miles) east of Los Angeles, is being investigated by the National Highway Traffic Safety Administration (NHTSA). The investigation is the 29th case involving a Tesla that the agency has responded to.
A 35-year-old man died when his Tesla Model 3 struck a truck on a highway around 2:30 a.m. local time (9:30 GMT). The driver’s name has not yet been made public. Another man was seriously injured when the electric vehicle hit him while he was helping the truck’s driver out of the wreck.
The California Highway Patrol, or CHP, announced Thursday that the car was operating on Tesla’s partially automated driving system, called Autopilot, which has been involved in multiple crashes. The Fontana crash marks at least the fourth Autopilot-related death in the United States.
“While the CHP does not normally comment on ongoing investigations, the Department recognizes the high level of interest in collisions involving Tesla vehicles,” the agency said in a statement. “We felt this information provides an opportunity to remind the public that driving is a complex task that requires a driver’s full attention.”
The federal safety investigation comes just after the CHP arrested another man who authorities said was riding in the back seat of a Tesla traveling this week on Interstate 80, near Oakland, with no one behind the wheel.
The CHP has not said whether officials determined if the Tesla in the I-80 incident was operating on Autopilot, which can keep a car centered in its lane and a safe distance behind vehicles in front of it.
But it is likely that either Autopilot or “Full Self-Driving” was in operation for the driver to be in the back seat. Tesla allows a limited number of owners to test its automated driving system.
Tesla, which has dissolved its public relations department, did not respond to an email Friday seeking comment. The company says in owners’ manuals and on its website that both Autopilot and “Full Self-Driving” are not fully autonomous, and that drivers must pay attention and be ready to intervene at any time.
At times, Autopilot has had trouble dealing with stationary objects and with traffic crossing in front of Teslas.
In two Florida crashes, in 2016 and 2019, Teslas operating on Autopilot drove beneath crossing tractor-trailers, killing the men driving the cars. In a 2018 crash in Mountain View, California, an Apple engineer driving on Autopilot was killed when his Tesla struck a highway barrier.
Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several fire trucks and police vehicles that were stopped on highways with their flashing emergency lights on.
For example, the NHTSA in March sent a team to investigate after a Tesla on Autopilot collided with a Michigan State Police vehicle on Interstate 96 near Lansing. Neither the trooper nor the 22-year-old Tesla driver was injured, police said.
Following the fatal crashes in Florida and California, the National Transportation Safety Board (NTSB) recommended that Tesla develop a stronger system to ensure drivers are paying attention, and that it limit Autopilot’s use to highways where it can operate effectively. Neither Tesla nor the safety agency acted on the recommendations.
In a Feb. 1 letter to the U.S. Department of Transportation, NTSB Chairman Robert Sumwalt urged the department to enact regulations governing driver-assistance systems such as Autopilot, as well as the testing of autonomous vehicles. NHTSA has relied mainly on voluntary guidelines for such vehicles, taking a hands-off approach so as not to hinder the development of new safety technology.
Sumwalt said Tesla is using people who bought its cars to test “Full Self-Driving” software on public roads with limited oversight or reporting requirements.
“Because NHTSA has not set any requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV [autonomous vehicle] control system’s limitations,” Sumwalt wrote.
He added: “While Tesla includes a disclaimer that ‘currently enabled features require active driver supervision and do not make the vehicle autonomous,’ NHTSA’s hands-off approach to overseeing AV testing poses a potential risk to motorists and other road users.”
NHTSA, which has the authority to regulate automated driving systems and to seek recalls if necessary, appears to have taken a renewed interest in the systems since U.S. President Joe Biden took office.