Tesla came under renewed scrutiny Thursday following a report that its cars could be fooled into driving with no one behind the wheel, as two senators demanded a vigorous federal probe of a fatal crash in Texas.
The accident, which killed two people, involved a Tesla Model S that caught fire after hitting a tree late Saturday near Houston. Investigators found no one in the driver’s seat of the vehicle, local police said.
US transportation regulators investigating the crash are “still gathering facts,” Transportation Secretary Pete Buttigieg said Thursday, adding that investigators have been “in touch” with law enforcement and the automaker.
“This is an important time to stress that a lot of automated driver assistance systems continue to depend on the expectation of an attentive driver behind the wheel,” Buttigieg said.
The developments add to questions about the high-flying electric car maker led by the mercurial Elon Musk, who said earlier this week that data logs show Autopilot was not engaged during the Texas crash.
On its website, Tesla describes Autopilot as a driver assistance system that, despite its name, requires a human operator.
“Autopilot enables your car to steer, accelerate and brake automatically within its lane,” the website says. “Current Autopilot features require active driver supervision and do not make the vehicle autonomous.”
But engineers from Consumer Reports “easily tricked” Tesla’s Autopilot into driving without anyone in the driver’s seat, “a scenario that would present extreme danger if it were repeated on public roads,” the magazine said on its website Thursday.
Chance of recall?
Also Thursday, Democratic Senators Richard Blumenthal of Connecticut and Ed Markey of Massachusetts urged US auto safety regulators to forcefully respond to the Texas crash, noting that Tesla has been “criticized for misrepresenting the capabilities of their vehicles’ automated driving and driver assistance systems, giving drivers a false sense of security.”
In a letter to National Highway Traffic Safety Administration acting chief Steven Cliff, the senators called for “a thorough investigation of the accident” and asked that the resulting reports “include recommendations on corrective actions that can be implemented to prevent future such accidents from occurring.”
The senators applauded NHTSA’s prior announcement that it, along with the National Transportation Safety Board, was investigating the crash.
Jason Levine, executive director of the Center for Auto Safety, said in an email to AFP that Tesla should “turn over any data it has regarding this tragedy, even before a court orders them to do so, to help federal investigators get to the bottom of how it was possible for a vehicle to travel into a tree at a high enough speed to kill two passengers when no one was behind the wheel.”
“Then NHTSA needs to take a hard look at whether the combination of the technology behind Tesla’s ‘Autopilot’ feature and continued evidence of consumers believing this technology is driverless has created an unreasonable risk to motor vehicle safety — a conclusion which could trigger a recall.”
Consumer Reports expressed shock at how easily Autopilot can be duped, saying the result shows “driver monitoring systems need to work harder to keep drivers from using systems in foreseeably dangerous ways.”
In its test, a Consumer Reports researcher placed a weight on the steering wheel and maneuvered over to the passenger seat without undoing the seat belt.
“There were no warnings that no one was sitting in the seat, no one was holding the steering wheel and no one was looking at the road,” Consumer Reports’ Jake Fisher said of the test on a Tesla Model Y SUV, which was videotaped.
“It continued to drive with no warnings to the driver to stay engaged. We were surprised how easy it was to defeat the insufficient safeguards.”
The magazine contrasted Tesla with cars made by General Motors, Subaru and other automakers that use camera-based systems to track a driver’s eye movements and ensure someone is behind the wheel.