A Matter of Life and Death for Regulators: Is Tesla’s Autopilot Safe?
September 21 (Reuters) – Robin Geoulla had doubts about the automated driving technology fitted to his Tesla Model S when he bought the electric car in 2017.
“It was a little scary to, you know, rely on it and just, you know, sit down and let it drive,” he told a U.S. investigator about Tesla’s Autopilot system, describing his initial feelings about the technology.
Geoulla made the comments to the investigator in January 2018, days after his Tesla, with Autopilot engaged, slammed into the back of an unoccupied fire truck parked on a California highway. Reuters could not reach him for further comment.
Over time, Geoulla’s initial doubts about Autopilot subsided, and he found it reliable when tracking a vehicle in front of him. But he noticed the system sometimes seemed confused when faced with direct sunlight or a vehicle ahead changing lanes, according to a transcript of his interview with an investigator from the National Transportation Safety Board (NTSB).
He was driving into the sun before hitting the fire truck, he told the investigator.
The autopilot design allowed Geoulla to disengage from driving during his trip, and his hands were off the wheel for most of the roughly 30-minute period when the technology was activated, the NTSB found.
The U.S. agency, which makes recommendations but lacks enforcement powers, has previously urged regulators at the National Highway Traffic Safety Administration (NHTSA) to investigate Autopilot’s limitations, the potential for driver misuse, and possible safety risks, following a series of accidents involving the technology, some of them fatal.
“The past has shown the focus has been on innovation over safety and I’m hoping we’re at a point where that trend is reversing,” the NTSB’s new chair, Jennifer Homendy, told Reuters in an interview. She said there was no comparison between Tesla’s Autopilot and the more rigorous autopilot systems used in aviation, which involve trained pilots, rules addressing fatigue, and testing for drugs and alcohol.
Tesla did not respond to written questions for this story.
Autopilot is an advanced driver assistance feature, the current version of which does not make vehicles autonomous, the company says on its website. Tesla says drivers must agree to keep their hands on the wheel and maintain control of their vehicles before activating the system.
Geoulla’s 2018 crash is one of 12 accidents involving Autopilot that NHTSA officials are scrutinizing in the agency’s most far-reaching investigation since Tesla Inc (TSLA.O) introduced the semi-autonomous driving system in 2015.
According to an NHTSA statement, NTSB documents, and police reports reviewed by Reuters, most of the accidents under investigation occurred after dark or in conditions of limited visibility, such as glaring sunlight. That raises questions about Autopilot’s capabilities in challenging driving conditions, according to autonomous-driving experts.
“NHTSA’s enforcement and defect authority is broad, and we will act when we detect an unreasonable risk to public safety,” an NHTSA spokesperson said in a statement to Reuters.
Since 2016, U.S. auto safety regulators have separately dispatched 33 special crash-investigation teams to examine Tesla crashes involving 11 deaths in which advanced driver assistance systems were suspected of being in use. NHTSA has ruled out Autopilot use in three of those nonfatal crashes.
The current NHTSA Autopilot investigation effectively reopens the question of whether the technology is safe. It represents the latest significant challenge for Elon Musk, the Tesla chief executive whose advocacy of driverless cars has helped his company become the world’s most valuable automaker.
Tesla charges customers up to $10,000 for advanced driver assistance features such as lane changing, with a promise to eventually deliver autonomous driving capability to their cars using only cameras and advanced software. Other automakers and self-driving companies use not only cameras but more expensive hardware, including radar and lidar, in their current and upcoming vehicles.
Musk has said a Tesla with eight cameras would be far safer than human drivers. But camera technology is affected by darkness and sun glare as well as adverse weather conditions such as heavy rain, snow, and fog, according to experts and industry executives.
“Computer vision today is far from perfect and will be for the foreseeable future,” said Raj Rajkumar, professor of electrical and computer engineering at Carnegie Mellon University.
In the first known fatal U.S. accident involving Tesla’s semi-autonomous driving technology, which occurred in 2016 west of Williston, Florida, the company said neither the driver nor Autopilot saw the white side of a tractor-trailer against a brightly lit sky. Instead of braking, the Tesla collided with the 18-wheel truck.
DRIVER MISUSE, FAILED BRAKING
NHTSA closed an Autopilot investigation stemming from that fatal crash in January 2017, finding no defect in Autopilot’s performance after contentious exchanges with Tesla officials, according to documents reviewed by Reuters.
In December 2016, as part of that investigation, the agency asked Tesla to provide details of the company’s response to any internal safety concerns raised about Autopilot, including the potential for driver misuse or abuse, according to a special order sent by regulators to the automaker.
After an NHTSA lawyer found Tesla’s initial response lacking, Tesla’s general counsel at the time, Todd Maron, tried again. He told regulators the request was “far too broad” and that it would be impossible to catalog every concern raised during Autopilot’s development, according to correspondence reviewed by Reuters.
Nonetheless, Tesla wanted to cooperate, Maron told regulators. During Autopilot’s development, company employees or contractors had raised concerns about the potential for unintended or failed braking and acceleration; undesired or failed steering; and certain kinds of misuse and abuse by drivers, Maron said, without providing further details.
Maron did not respond to messages seeking comment.
Regulators’ reaction to those concerns is unclear. A former U.S. official said Tesla generally cooperated with the probe and promptly produced requested materials. Regulators closed the investigation just before the inauguration of former U.S. President Donald Trump, finding that Autopilot performed as designed and that Tesla had taken steps to prevent it from being misused.
NHTSA LEADERSHIP VACUUM
NHTSA has been without a Senate-confirmed chief for nearly five years. President Joe Biden has yet to nominate anyone to head the agency.
NHTSA documents show regulators want to know how Tesla vehicles attempt to see flashing lights on emergency vehicles, or detect the presence of fire trucks, ambulances, and police cars in their path. The agency has sought similar information from 12 rival automakers.
“Tesla has been asked to produce and validate data as well as their interpretation of that data. NHTSA will perform our own independent validation and analysis of all information,” NHTSA told Reuters.
Musk, the electric-car pioneer, has fought hard to defend Autopilot from critics and regulators. Tesla has used Autopilot’s ability to update vehicle software over the air to sidestep the traditional vehicle-recall process.
Musk has repeatedly promoted Autopilot’s capabilities, sometimes in ways that critics say mislead customers into believing Teslas can drive themselves – despite warnings to the contrary in owner’s manuals that urge drivers to remain engaged and spell out the technology’s limitations.
Musk has also continued to launch what Tesla calls beta – or unfinished – versions of a “Full Self-Driving” system via over-the-air software upgrades.
“Some manufacturers are going to do what they want to do to sell a car, and it’s up to the government to rein that in,” the NTSB’s Homendy said.
Reporting by Hyunjoo Jin in San Francisco, Mike Spector in New York and David Shepardson in Washington; Editing by Joseph White and Matthew Lewis