Tesla Autopilot limitations played big role in fatal crash

NTSB recommends safeguards to keep drivers engaged, prevent autopilot use beyond intended situations

This image provided by the National Transportation Safety Board shows the damage to the left front of the Tesla involved in a May 7, 2016, crash in Williston, Fla. that killed Joshua Brown, 40, of Canton, Ohio. He was using the semiautonomous driving systems of his Tesla Model S sedan. (NTSB via Associated Press)

Design limitations of the Tesla Model S's Autopilot played a major role in the first known fatal crash of a highway vehicle operating under automated control systems, the U.S. National Transportation Safety Board said Tuesday.

The board said the direct cause of the crash was an inattentive Tesla driver's over-reliance on technology and a truck driver who made a left turn in front of the car. But the board also recommended that automakers incorporate safeguards that keep drivers' attention engaged and that limit the use of automated systems to the conditions for which they were designed.

"In this crash, Tesla's system worked as designed, but it was designed to perform limited tasks in a limited range of environments." - National Transportation Safety Board

Joshua Brown, 40, of Canton, Ohio, was traveling on a divided highway near Gainesville, Florida, using the Tesla's automated driving systems when he was killed.

Tesla had told Model S owners the automated systems should only be used on limited-access highways, which are primarily interstates. But the company didn't incorporate protections against their use on other types of roads, the board found. Despite upgrades since the May 2016 crash, Tesla has still not incorporated such protections, NTSB Chairman Robert Sumwalt said.

"In this crash, Tesla's system worked as designed, but it was designed to perform limited tasks in a limited range of environments," he said. "Tesla allowed the driver to use the system outside of the environment for which it was designed."

The result, Sumwalt said, was a collision "that should never have happened."

A woman demonstrates new Autopilot features in a Tesla Model S in 2015. The NTSB recommends manufacturers develop systems for ensuring operators remain attentive to the vehicle's performance when using semi-autonomous driving systems. (Beck Diefenbach/Reuters)

In a statement, Tesla said "we appreciate the NTSB's analysis of last year's tragic accident and we will evaluate their recommendations as we continue to evolve our technology." The company added that overall its automated driving systems, called Autopilot, improve safety.

NTSB directed its recommendations to automakers generally, rather than just Tesla, saying the oversight is an industrywide problem. Manufacturers should be able to use GPS mapping systems to create such safeguards, Sumwalt said.
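Sumwalt's suggestion amounts to geofencing: comparing the car's GPS position against map data and refusing to engage automation on roads outside the system's design envelope. The Python sketch below illustrates the idea; the lookup_road function, its road classifications and the hard-coded coordinates are placeholders assumed for the example, not any automaker's actual system.

```python
# Hypothetical sketch: restricting a driver-assist system to limited-access
# highways using map data. The road-classification lookup is an assumption,
# not any real automaker's API.

from dataclasses import dataclass

@dataclass
class RoadSegment:
    name: str
    road_class: str  # e.g. "motorway" (limited-access) vs. "secondary"

def lookup_road(lat: float, lon: float) -> RoadSegment:
    """Placeholder for a map-matching query against onboard map data."""
    # A real system would match the GPS position to a digital map here.
    return RoadSegment(name="US-27A", road_class="secondary")

def autopilot_permitted(lat: float, lon: float) -> bool:
    """Allow engagement only on limited-access highways."""
    segment = lookup_road(lat, lon)
    return segment.road_class == "motorway"

if __name__ == "__main__":
    if not autopilot_permitted(29.38, -82.45):  # illustrative coordinates
        print("Autopilot unavailable: road is not a limited-access highway")
```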

No hands on wheel

Manufacturers should also develop better ways of ensuring that drivers stay attentive while semi-autonomous driving systems are in use than simply detecting the pressure of hands on the steering wheel, the NTSB recommended. Brown had his hands on the sedan's steering wheel for only 25 seconds out of the 37.5 minutes the vehicle's cruise control and lane-keeping systems were in use before the crash, investigators found.
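As a rough illustration of what escalating attention monitoring could look like, here is a minimal Python sketch. The time thresholds and responses are assumptions made for the example, not Tesla's or any manufacturer's actual logic, and real systems may use camera-based gaze tracking rather than a simple timer.

```python
# Hypothetical sketch of escalating driver-attention alerts. Thresholds
# and responses are illustrative assumptions only.

def attention_action(seconds_since_driver_input: float) -> str:
    """Map time since the last sign of driver engagement to a response."""
    if seconds_since_driver_input < 15:
        return "none"
    elif seconds_since_driver_input < 30:
        return "visual warning"
    elif seconds_since_driver_input < 45:
        return "audible warning"
    else:
        # Last resort: slow the car and disengage automation safely.
        return "slow vehicle and disengage"

for t in (10, 20, 40, 60):
    print(t, "->", attention_action(t))
```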

With autopilot mode activated, the Tesla Model S keeps an ideal braking distance between itself and the vehicle ahead. Investigators found that the sedan's cameras and radar weren't capable of detecting a vehicle turning into its path. Rather, the systems are designed to detect vehicles they are following to prevent rear-end collisions. (CBC)

As a consequence, Brown's attention wandered and he didn't detect the semitrailer in his path, they said.

The Model S is rated at level 2 on a self-driving scale of 0 to 5. Level 5 vehicles can operate autonomously in nearly all circumstances. Level 2 automation systems are generally limited to use on interstate highways, which don't have intersections. Drivers are supposed to continuously monitor vehicle performance and be ready to take control if necessary.
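The scale in question is SAE International's J3016 standard, which U.S. regulators reference. The annotated Python enum below restates the six levels compactly.

```python
# The SAE J3016 automation levels referenced in the article.
from enum import IntEnum

class AutomationLevel(IntEnum):
    NO_AUTOMATION = 0       # driver does everything
    DRIVER_ASSISTANCE = 1   # steering OR speed assistance, not both
    PARTIAL_AUTOMATION = 2  # steering AND speed; driver must monitor (Model S)
    CONDITIONAL = 3         # system monitors; driver takes over on request
    HIGH_AUTOMATION = 4     # no driver needed within a defined domain
    FULL_AUTOMATION = 5     # no driver needed in nearly all circumstances
```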

Investigators found that the sedan's cameras and radar weren't capable of detecting a vehicle turning into its path. Rather, the systems are designed to detect vehicles they are following to prevent rear-end collisions. The board re-issued previous recommendations that the government require all new cars and trucks to be equipped with technology that wirelessly transmits the vehicles' location, speed, heading and other information to other vehicles in order to prevent collisions.
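The real-world counterpart of that recommendation is the Basic Safety Message defined in SAE standard J2735, which equipped vehicles broadcast roughly ten times per second over a short-range radio link. The Python sketch below is a simplified, hypothetical rendering of such a broadcast loop; the field names and the print stand-in for the radio are assumptions for the example, not the actual message format.

```python
# Illustrative sketch of a vehicle-to-vehicle safety broadcast. The real
# counterpart is the SAE J2735 Basic Safety Message; the field layout and
# 10 Hz rate below are simplified assumptions.

import json
import time
from dataclasses import dataclass, asdict

@dataclass
class SafetyMessage:
    vehicle_id: str
    latitude: float
    longitude: float
    speed_mps: float
    heading_deg: float
    timestamp: float

def broadcast(msg: SafetyMessage) -> None:
    """Stand-in for a DSRC/C-V2X radio transmission."""
    print(json.dumps(asdict(msg)))

if __name__ == "__main__":
    for _ in range(3):  # real systems broadcast about 10 times per second
        broadcast(SafetyMessage("veh-001", 29.38, -82.45, 29.1, 270.0,
                                time.time()))
        time.sleep(0.1)
```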

Last December, the Obama administration proposed that new vehicles be able to wirelessly communicate with each other, with traffic lights and with other roadway infrastructure. Automakers were generally supportive of the proposal, but it hasn't been acted on by the Trump administration.

Family defends Tesla

Brown's family defended his actions and Tesla in a statement released Monday. Brown was a technology geek and enthusiastic fan of the Model S who posted videos about the car and spoke to gatherings at Tesla stores. "Nobody wants tragedy to touch their family, but expecting to identify all limitations of an emerging technology and expecting perfection is not feasible either," the statement said.

The National Highway Traffic Safety Administration, which regulates auto safety, declined this year to issue a recall or fine Tesla as a result of the crash, but it warned automakers not to treat semi-autonomous cars as if they were fully self-driving.

The Model S is a level 2 on a self-driving scale of 0 to 5. Level 5 vehicles can operate autonomously in nearly all circumstances. Level 2 automation systems are generally limited to use on interstate highways, which don't have intersections. (Lucy Nicholson/Reuters)

While the NTSB was meeting to consider the Tesla crash, Transportation Secretary Elaine Chao was in Michigan unveiling new self-driving car safety guidelines for automakers. The guidelines encourage companies to put in place broad safety goals, such as making sure drivers are paying attention while using advanced assist systems. The systems are expected to detect and respond to people and objects both in and out of their travel path, "including pedestrians, bicyclists, animals, and objects that could affect safe operation of the vehicle," the guidelines say.

There is a 12-point safety checklist, but the government makes it clear that the guidelines are voluntary and not regulations.