Teslas with partially automated driving systems are a step closer to being recalled after the U.S. elevated its investigation into a series of collisions with parked emergency vehicles or trucks with warning signs.

The National Highway Traffic Safety Administration said Thursday that it is upgrading the Tesla probe to an engineering analysis, another sign of increased scrutiny of the electric vehicle maker and automated systems that perform at least some driving tasks.

An engineering analysis is the final stage of an investigation, and in most cases NHTSA decides within a year if there should be a recall or the probe should be closed.

Documents posted Thursday by the agency raise some serious issues about Tesla’s Autopilot system. The agency found that it’s being used in areas where its capabilities are limited, and that many drivers aren’t taking action to avoid crashes despite warnings from the vehicle.

The probe now covers 830,000 vehicles, almost everything that the Austin, Texas, carmaker has sold in the U.S. since the start of the 2014 model year.

NHTSA reported that it has found 16 crashes into emergency vehicles and trucks with warning signs, causing 15 injuries and one death.

Investigators will evaluate additional data, vehicle performance and “explore the degree to which Autopilot and associated Tesla systems may exacerbate human factors or behavioral safety risks undermining the effectiveness of the driver’s supervision,” the agency said.

A message was left Thursday seeking comment from Tesla.

In the majority of the 16 crashes, the Teslas issued forward collision alerts to the drivers just before impact. Automatic emergency braking intervened to at least slow the cars in about half the cases. On average, Autopilot gave up control of the Teslas less than a second before the crash, NHTSA said in documents detailing the probe.

NHTSA also said it’s looking into crashes involving similar patterns that did not include emergency vehicles or trucks with warning signs.

The agency found that in many cases, drivers had their hands on the steering wheel as Tesla requires, yet failed to take action to avoid a crash. This suggests drivers are complying with Tesla’s requirement to keep their hands on the wheel, the agency wrote, but hand contact alone doesn’t ensure they’re paying attention.

In crashes where video is available, drivers should have seen first responder vehicles an average of eight seconds before impact, the agency wrote.

The agency will have to decide if there is a safety defect with Autopilot before pursuing a recall.

Investigators also wrote that a driver’s use or misuse of the driver monitoring system “or operation of a vehicle in an unintended manner does not necessarily preclude a system defect.”

The agency’s documents all but say that Tesla’s method of making sure drivers pay attention isn’t good enough, and that this amounts to a safety defect that should trigger a recall, said Bryant Walker Smith, a University of South Carolina law professor who studies automated vehicles.

“It is really easy to have a hand on the wheel and be completely disengaged from driving,” he said. Monitoring a driver’s hand position is not effective because it only measures a physical position, he said. “It is not concerned with their mental capacity, their engagement or their ability to respond.”

Similar systems from other companies such as General Motors’ Super Cruise use infrared cameras to make sure a driver is looking forward. But even these may still allow a driver to zone out, Walker Smith said.

“This is confirmed in study after study,” he said. “This is established fact that people can look engaged and not be engaged. You can have your hand on the wheel and you can be looking forward and not have the situational awareness that’s required.”

In total, the agency looked at 191 crashes but removed 85 of them because other drivers were involved or there wasn’t enough information to make a definitive assessment. Of the remaining 106, the main cause of about one-quarter of the crashes appears to be running Autopilot in areas where it has limitations or in conditions that can interfere with its operation. “For example, operation on roadways other than limited access highways, or operation in low traction or visibility environments such as rain, snow or ice,” the agency wrote.

Other automakers limit use of their systems to limited-access divided highways.

The National Transportation Safety Board, which also has investigated some of the Tesla crashes dating to 2016, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate. The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has yet to take action on the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies.

In 2020 the NTSB blamed Tesla, drivers and lax regulation by NHTSA for two collisions in which Teslas on Autopilot crashed beneath crossing tractor-trailers. The NTSB took the unusual step of accusing NHTSA of contributing to the crashes for failing to make sure automakers put safeguards in place to limit use of electronic driving systems.

The agency made the determinations after investigating a 2019 crash in Delray Beach, Florida, in which the 50-year-old driver of a Tesla Model 3 was killed. The car was driving on Autopilot when neither the driver nor the Autopilot system braked or tried to avoid a tractor-trailer crossing in its path.

In a statement, NHTSA said there aren’t any vehicles available for purchase today that can drive themselves. “Every available vehicle requires the human driver to be in control at all times, and all state laws hold the human driver responsible for operation of their vehicles,” the agency said.

Driver-assist systems can help avoid crashes but must be used correctly and responsibly, the agency said.

NHTSA began its inquiry in August of last year after a string of crashes since 2018 in which Teslas using the company’s Autopilot or Traffic Aware Cruise Control systems hit vehicles at scenes where first responders used flashing lights, flares, an illuminated arrow board, or cones warning of hazards.


