Tesla and Industry Criticized at Hearing on Autopilot System

WASHINGTON — A top federal safety official said on Tuesday that Tesla, the auto industry and regulators were not doing enough to prevent accidents involving advanced driver-assistance systems like Tesla’s Autopilot.

The criticism from the official, Robert L. Sumwalt, who is chairman of the National Transportation Safety Board, came at the start of a hearing at which the board is expected to conclude its investigation into a fatal 2018 crash involving a Tesla in California.

“Industry keeps implementing technology in such a way that people can get injured or killed, ignoring this board’s recommendations intended to help them prevent such tragedies,” Mr. Sumwalt said.

At Tuesday’s meeting, the safety board’s staff will present findings, analysis and recommendations from its investigation into the 2018 crash, which killed the car’s driver, Wei Huang. Mr. Sumwalt also said other federal regulators had provided “scant oversight” to the auto industry.

The board’s staff and members issued sharp criticism of the National Highway Traffic Safety Administration, which they accused of taking a “misguided” and “hands-off” approach to regulating such technology.

In a statement, N.H.T.S.A. said that all crashes caused by distracted driving, including those in which driver-assistance systems were in use, were a “major concern” and that it planned to review the board’s report.

The board meeting is the latest development in a string of federal investigations into crashes involving Tesla’s Autopilot system, which can, among other things, keep a moving car in its lane and match the speed of surrounding vehicles. Tesla has said that the system should be used only under certain conditions, but some safety experts say the company doesn’t do enough to educate drivers about those limitations or take steps to make sure drivers do not become overly reliant on the system and, thus, distracted.

Mr. Huang had been playing a game on his phone during the drive, but it was not clear whether he was engaged with the game in the moments before the crash, according to the investigation.

The concerns about Autopilot have done little to slow Tesla’s rise. The company’s share price has more than tripled since October as Tesla’s financial performance has surpassed even the rosiest of analyst expectations. In September, Tesla earned its first safety award from the nonprofit Insurance Institute for Highway Safety and, last week, Consumer Reports named Tesla’s first mass-market electric car, the Model 3, one of its top picks for 2020.

Tesla did not respond to a request for comment, but the company has previously said that Autopilot makes its vehicles safer. In the fourth quarter of 2019, the company reported one accident for every three million miles driven in a Tesla with Autopilot engaged. Over all, the national rate was one accident for every 498,000 miles driven in 2017, according to N.H.T.S.A.

Still, the electric carmaker faces scrutiny on multiple fronts. The N.T.S.B. and the traffic safety administration are currently investigating more than a dozen crashes in which Autopilot might have played a role.

In the 2018 accident, Autopilot had been engaged for nearly 19 minutes, according to the safety board’s investigation. Mr. Huang’s hands were on and off the wheel several times during that period, and in the final minute before the crash, the vehicle detected his hands on the wheel three times for a total of 34 seconds. It did not detect his hands on the wheel in the six seconds before impact.

Tesla’s event data recorders routinely collect a wide variety of information, such as location, speed, seatbelt status, the position of the driver’s seat, the rotation angle of the steering wheel and pressure on the accelerator pedal.

Mr. Huang had been traveling in his 2017 Tesla Model X sport utility vehicle on U.S. 101 in Mountain View when the car struck a median barrier at about 71 miles per hour. The speed limit was 65 m.p.h. The collision spun the car, which later hit two other vehicles and caught fire.

Mr. Huang had previously complained to his family about problems with Autopilot along that stretch of highway, his brother told investigators. Data from the vehicle confirmed at least one similar episode near the area dividing the two highways, according to documents from the investigation.

The first known fatal crash with Autopilot in use occurred in May 2016 in Florida, when a Tesla failed to stop for a truck that was turning in front of it on a highway. The vehicle hit the trailer, continued traveling underneath it and veered off the road. The driver of that car, Joshua Brown, was killed in the accident.

Both the N.T.S.B. and the traffic safety agency investigated that crash, but came to somewhat different conclusions. In January 2017, N.H.T.S.A. cleared Autopilot, finding that it had no defects and did not need to be recalled, though the agency called on automakers to clearly explain how such systems work to drivers. Nine months later, the safety board determined that while Autopilot worked as intended, it had nonetheless “played a major role” in the crash.

“The combined effects of human error and the lack of sufficient system controls resulted in a fatal collision that should not have happened,” Mr. Sumwalt said at the time.

That finding reflects a common critique of Autopilot — that it does not go far enough in forcing drivers to maintain their focus on the road. Unlike Autopilot, Super Cruise, a driver-assistance system offered by General Motors, works only on certain highways and tracks drivers’ heads to make sure they are paying attention to the road.

Critics also say that Tesla and its chief executive, Elon Musk, have exaggerated Autopilot’s capabilities.

In 2018, for example, Mr. Musk was widely criticized for taking his hands off a Tesla Model 3 steering wheel while demonstrating Autopilot for the CBS News program “60 Minutes,” something the vehicle owner’s manual instructs drivers using Autopilot never to do.

In January, Mr. Musk told investors that Tesla’s “full self-driving capability” might be just a few months from having “some chance of going from your home to work, let’s say, with no interventions.”

Jason Levine, executive director of the Center for Auto Safety, an advocacy group, said that “by calling it Autopilot, by using terms like ‘full self-driving,’ Tesla is intentionally misleading consumers as to the capabilities of the technology.”

To avoid false expectations, German regulators reportedly asked Tesla in 2016 to stop using the term Autopilot, arguing that it suggests that the technology is more advanced than it really is.

Autonomous technology is commonly categorized into six levels, from zero to five, as defined by SAE International, an association of automotive engineers. Level 5 represents full autonomy, in which a vehicle can perform all driving functions on its own, including navigating to a chosen destination. Autopilot and Super Cruise are considered Level 2 “partial automation” technologies, which enable a vehicle to control steering, braking and acceleration but require the full attention of a human driver.

Evidence of drivers misusing Autopilot abounds on the internet. And in a survey last year, the Insurance Institute for Highway Safety found that 48 percent of drivers believed it was safe to remove their hands from the steering wheel while using Autopilot. By comparison, 33 percent or fewer drivers said the same thing about similar systems in cars made by other automakers.
