
Another driver died in a Tesla that was on autopilot.



An advertisement for Tesla's Autopilot. Arnd Wiegmann/Reuters

Tesla's confirmation on Friday that its Autopilot mode was engaged during the fatal crash of a Model X SUV earlier this month is bad for Tesla, bad for the cause of self-driving cars, and certainly bad for anyone who drives a semi-autonomous or autonomous vehicle, or shares the road with one. On March 23, the vehicle hit a barrier on Highway 101 near Mountain View, California, then caught fire and was struck by two other vehicles. The driver died. In an earlier statement about the incident, issued before Tesla was able to retrieve the SUV's logs, the electric-car company was proactively defensive of its technology, stressing that while Autopilot cannot prevent all accidents, it makes them "less likely." Friday night's update offers a fuller, but by no means complete, picture of what happened:

In the moments before the collision, which occurred at 9:27 a.m. on Friday, March 23, Autopilot was engaged with the adaptive cruise control follow distance set to minimum. The driver had received several visual and audible hands-on warnings earlier in the drive, and the driver's hands were not detected on the steering wheel for six seconds prior to the collision. The driver had approximately five seconds and 150 meters of unobstructed view of the concrete divider with the crushed crash attenuator, but the vehicle logs show that no action was taken.

The reason this crash was so severe is that the crash attenuator, a highway safety barrier designed to reduce the force of an impact into a concrete lane divider, had been crushed in a prior accident without being replaced. We have never seen this level of damage to a Model X in any other crash.

According to the Wall Street Journal, the National Transportation Safety Board is investigating the incident.

The accident occurred five days after a self-driving Uber in Arizona struck and killed a pedestrian at night, leading Uber to suspend its vehicle tests everywhere and Arizona to suspend Uber's testing in the state. That first known pedestrian death caused by a self-driving car has sparked louder demands for strict regulation of the emerging technology even as companies race to perfect it.

Tesla's Autopilot is a less sophisticated technology than the fully autonomous systems that Uber, Google's sister company Waymo, and others are developing, and it is not supposed to be used without an alert driver at the wheel. But that has not stopped some Tesla owners from recklessly relying on the feature. That points to another important difference between Tesla's technology and Uber's: anyone who can afford a Tesla can take advantage of it.

The March 23 accident is the second known fatal incident involving Tesla's Autopilot, after a Model S collided with a semi-trailer in 2016, killing the car's driver. In that accident, as in last week's, the driver does not appear to have had his hands on the wheel, and the NTSB divided blame for the death between the flaws of Tesla's technology and the truck's driver.

We still do not know all the details of the crash on Highway 101, such as what the driver was doing and whether Autopilot failed in some way, but it has certainly put Tesla under scrutiny at a time when many parts of its business are showing weaknesses. The company's stock took hits throughout the week as bad news continued to mount for the automaker. On Thursday, it recalled 123,000 Model S vehicles, almost half of all the cars it has ever sold, over a problem with power-steering bolts. Most worrying for the company, it is still struggling with production delays for the Model 3, Tesla's mass-market electric car, whose production timeline CEO Elon Musk has repeatedly overestimated. And there is reason to believe the company is running out of money.

All of this could explain why Tesla's latest update on the March 23 crash stresses regret over the life lost, and contains more than a hint of exasperation at yet another case in which the company must defend a technology that it believes, probably rightly, will improve traffic safety once it is perfected:

In the past, when we have brought up statistical safety points, we were criticized for doing so, as if it implied we have no empathy for the recent tragedy. Nothing could be further from the truth. We care deeply for those who chose to put their trust in us. But we must also care about the people whose lives may be saved, now and in the future, if they know that Autopilot improves safety. None of this changes how devastating an event like this is or how much we feel for the family and friends of our customer. We are incredibly sorry for their loss.

But it is also possible to believe in the promise of self-driving cars while worrying about how the technology is being used, especially the notion that a human driver and a self-driving vehicle can compensate for each other's weaknesses. The latter is still unproven, while the former is prone to laziness and inattention. And it is becoming increasingly obvious that the combination of the two can sometimes be deadly.

