Investigation Closed: Feds Take No Action in Fatal Tesla Autopilot Crash



Federal regulators spent the past six months investigating the role of Tesla Motors’ Autopilot feature in a fatal car crash. Their findings reinforce what millions of drivers already know: Despite much hype about “self-driving cars,” human beings remain responsible for understanding the capabilities and limitations of the vehicles they drive, as well as accountable for their safe operation.

Officials at the National Highway Traffic Safety Administration (NHTSA) closed their probe of Tesla on January 19 without ordering a recall or taking any enforcement action. The examination found no faults in the Autopilot system and determined that it worked as intended during the May 7, 2016, crash that claimed the life of Joshua Brown, the first person killed in a crash attributable to a semi-autonomous feature.

“Not all systems can do all things,” NHTSA spokesman Bryan Thomas said Thursday.

Brown had engaged the Autopilot feature in his 2015 Tesla Model S as he traveled eastward along U.S. Highway 27 in Williston, Florida. Neither he nor the autonomous technology noticed when a tractor-trailer made a left turn across the car’s path. The truck should have been visible to the driver for at least seven seconds before the fatal collision occurred, according to NHTSA’s summary of the investigation, enough time to notice that the car was not reacting to the hazard and to take evasive action.

In September, Tesla made adjustments to its Autopilot feature that emphasize the role of radar and cameras in object detection, improvements that company CEO Elon Musk said he believed could have prevented Brown’s death by making the car capable of detecting the obstacle ahead of it in time to avoid or mitigate the collision. Tesla’s adjustments also shortened the period of time during which drivers can remove their hands from the steering wheel. Now, drivers who fail to respond to cues to keep their hands on the wheel three times can lose Autopilot functionality for the remainder of a journey.

But Thomas made clear that, even if Tesla had not upgraded its Autopilot system, no recall would have been ordered for the 43,781 Model S and Model X vehicles that contain Autopilot, because no defects had been found during the investigation.

Employees of NHTSA’s Office of Defects Investigation reviewed “dozens” of crashes involving Tesla Model S or Model X vehicles in which Autopilot was either engaged or had been engaged within the 15 seconds preceding a collision. Only two of these crashes resulted in a death or serious injury, the latter being a rollover on the Pennsylvania Turnpike on July 1, 2016, in which two people were seriously injured.

“The very name Autopilot creates the impression that a Tesla can drive itself. It can’t.”
– John Simpson, Consumer Watchdog

Even as NHTSA scrutinized the role of the technology in these crashes, it was also quick to note the promise of enhanced safety offered by semi-autonomous systems. The agency says that in reviewing data furnished by Tesla as part of the investigation, it found that vehicles equipped with Autosteer, a component of the Autopilot system, reduced crash rates by 40 percent from their pre-installation levels. Before Autosteer, the vehicles had a rate of 1.3 crashes per million miles of travel; afterward, the rate fell to 0.8 per million miles.
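Those two figures are consistent with the headline number: as a minimal back-of-the-envelope sketch using only the per-million-mile rates quoted above, the drop from 1.3 to 0.8 works out to a reduction of roughly 38 percent, which the agency rounds to 40.

```python
# Back-of-the-envelope check of the crash-rate figures cited by NHTSA.
# The 1.3 and 0.8 crashes-per-million-miles values come from the agency's
# summary; nothing else is assumed here.
before_autosteer = 1.3  # crashes per million miles, pre-installation
after_autosteer = 0.8   # crashes per million miles, post-installation

reduction = (before_autosteer - after_autosteer) / before_autosteer
print(f"Crash-rate reduction: {reduction:.1%}")  # -> 38.5%, i.e. roughly 40 percent
```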

“We appreciate the thoroughness of NHTSA’s report and its conclusion,” a Tesla spokesperson said in a written statement.

The Florida crash and its circumstances encapsulate many thorny issues that industry engineers and government regulators are grappling with, both in the near term, as driver-assistance features spread across the nation’s fleet, and further down the road, during a transition toward more highly automated driving. Among those challenges: figuring out how motorists and machines exchange control while avoiding “mode confusion,” and ensuring that motorists understand how these systems work while safeguarding against misuse.


John Simpson, privacy director at Consumer Watchdog, a California-based nonprofit that advocates for consumer rights, says there’s too much blaming of the driver in NHTSA’s findings.

“NHTSA has wrongly accepted Tesla’s line and blamed the human, rather than the ‘Autopilot’ technology and Tesla’s aggressive marketing,” he said. “The very name ‘Autopilot’ creates the impression that a Tesla can drive itself. It can’t. Some people who apparently believed Tesla’s hype got killed.”

While Thursday’s findings lay the brunt of responsibility on the person behind the wheel, that does not mean automakers are off the hook.

NHTSA, which issued a Federal Automated Vehicles Policy in September, criticized Tesla and other companies for marketing slogans and brand names of features, such as Autopilot, that may misrepresent their systems’ capabilities. Under the agency’s definitions of autonomy, Autopilot is classified as a Level 2 technology, one in which an automated system can conduct some parts of the driving task while humans monitor the broader driving environment and perform the remaining driving tasks.

“The department has been leaning forward on automated technologies because we believe they have great potential
to save lives.” – Bryan Thomas, NHTSA

It’s not enough, Thomas said, for automakers to describe the system operations in an owner’s manual, and NHTSA’s investigation found Tesla’s manual was “not as specific as it could be.” But it’s also not enough for automakers to assume drivers will use features as they’re intended. NHTSA says they must account for how customers could potentially misuse semi-autonomous technology.

Broadly, NHTSA has taken a keen interest in the exchange of control between motorists and semi-autonomous systems.

In October, the agency sent a letter to Comma.ai stating that its aftermarket product would put “the safety of your customers and other road users at risk.” The company asserted that its Comma One, a device intended to make autonomous-driving features available on cars not originally equipped with such technology, did not remove any human responsibilities from the driving task, but NHTSA said the warning was “insufficient.” The company opted to stop offering the device. In November, the agency cautioned General Motors that its plans to allow its forthcoming Super Cruise feature to stop a vehicle in the middle of roadways might present a danger to motorists and therefore be considered a safety defect.


For now, Tesla, the third company whose semi-autonomous system has attracted regulatory scrutiny, has passed muster. But NHTSA will continue to monitor the technology in general, paying particular attention to issues with handoffs of control at Level 2 and Level 3 autonomy.

“The department has been leaning forward on automated technologies because we believe they have great potential to save lives,” Thomas said. “At the same time, the department will aggressively oversee new technologies being put on the road.”

Later, Thomas said, “These are assistance systems that require the driver in the loop at all times . . . These are complicated issues, and we have ongoing research. We are interested in working with the industry in a collaborative way.”
