Two US senators on Thursday called for a vigorous probe into the most recent fatal crash involving a vehicle made by American electric car company Tesla, which left two people dead, after a report surfaced showing that the company's driver-assistance system can be fooled into operating even when no one is in the driver's seat.
The car, a Tesla Model S, caught fire near Houston after hitting a tree on Saturday. According to local police, investigators found no one in the driver's seat.
Democratic Senators Richard Blumenthal of Connecticut and Ed Markey of Massachusetts on Thursday urged the US auto safety regulators to forcefully respond to the Texas crash, noting that Tesla has been “criticized for misrepresenting the capabilities of their vehicles’ automated driving and driver assistance systems, giving drivers a false sense of security.”
In a letter to National Highway Traffic Safety Administration acting chief Steven Cliff, the senators called for “a thorough investigation of the accident and request that your reports include recommendations on corrective actions that can be implemented to prevent future such accidents from occurring.”
US transportation regulators investigating the crash are “still gathering facts,” Transportation Secretary Pete Buttigieg said Thursday, adding that investigators have been “in touch” with law enforcement and the automaker.
“This is an important time to stress that a lot of automated driver assistance systems continue to depend on the expectation of an attentive driver behind the wheel,” Buttigieg said.
The developments add to questions about the high-flying electric car maker led by the mercurial Elon Musk, who said earlier this week that data logs show Autopilot was not engaged during the Texas crash.
On its website, Tesla describes Autopilot as a driver assistance system that, despite its name, requires a human operator.
“Autopilot enables your car to steer, accelerate and brake automatically within its lane,” the website says. “Current Autopilot features require active driver supervision and do not make the vehicle autonomous.”
But engineers from Consumer Reports “easily tricked” Tesla’s Autopilot to drive without anyone in the driver’s seat, “a scenario that would present extreme danger if it were repeated on public roads,” the magazine said on its website Thursday.
The senators applauded NHTSA’s prior announcement that it, along with the National Transportation Safety Board, was investigating the crash.
Jason Levine, executive director of the Center for Auto Safety, said in an email to AFP that Tesla should “turn over any data it has regarding this tragedy, even before a court orders them to do so, to help federal investigators get to the bottom of how it was possible for a vehicle to travel into a tree at a high enough speed to kill two passengers when no one was behind the wheel.”
“Then NHTSA needs to take a hard look at whether the combination of the technology behind Tesla’s ‘Autopilot’ feature and continued evidence of consumers believing this technology is driverless has created an unreasonable risk to motor vehicle safety — a conclusion which could trigger a recall.”
Consumer Reports expressed shock at how easily Autopilot could be duped, saying the result shows “driver monitoring systems need to work harder to keep drivers from using systems in foreseeably dangerous ways.”
“There were no warnings that no one was sitting in the seat, no one was holding the steering wheel and no one was looking at the road,” Consumer Reports’ Jake Fisher said of the test on a Tesla Model Y SUV, which was videotaped.
A Consumer Reports researcher, in its test, placed a weight on the steering wheel and manoeuvred over to the passenger seat without undoing the seat belt.
“It continued to drive with no warnings to the driver to stay engaged. We were surprised how easy it was to defeat the insufficient safeguards.”
The magazine contrasted Tesla with cars made by General Motors, Subaru and other automakers, which use camera-based systems to track a driver’s eye movements and ensure someone attentive remains behind the wheel.