Self-Driving Uber Has First Fatal Accident

The push to create the first fully functional self-driving car seemed to be progressing quickly, but earlier this week it suffered a devastating setback when one of Uber’s self-driving prototypes struck and killed a pedestrian in Tempe, Arizona.

Although police are still investigating, it appears that the woman was struck after walking in front of the oncoming Uber. As of yet, there are no findings as to whether the vehicle applied any braking prior to the collision.

Not only is this a very sad moment for the woman’s family and friends, but it also presents an alarming turn of events for the automotive industry intent on launching the self-driving car in the near future.

There has been a lot of support behind this technology, especially since it is assumed that there will be a human behind the wheel, even when the car is moving autonomously.

Unfortunately, there was also a human behind the wheel when the car struck the woman, which means that even with human oversight, self-driving vehicles could still be potentially dangerous. However, until a full investigation is completed, it is unclear whether the collision would have been avoidable had the vehicle been driven by a human with no automation rather than operating fully autonomously, as it was in this instance.

This is not the first fatal incident involving an automated driving system. In 2016, a Tesla operating on Autopilot crashed into a truck, killing the Tesla driver.

Experts are now acknowledging that these high-tech cars may have been deployed too quickly, including Bryan Reimer of MIT, who says that, “until we understand the testing and deployment of these systems further, we need to take our time and work through the evolution of the technology.”

In addition, simply requiring a human behind the wheel ignores the human factors of having someone in the driver’s seat who is not actively engaged in the driving task. One needs only to observe a passenger who has nothing to do besides ride along: they soon tend to fall asleep.

The incident has also drawn criticism, with calls for those developing AI-based vehicles to consider how the cars could impact human lives, along with the broader legal and ethical implications.

Now comes the hard part: figuring out what went wrong. Some experts believe there may have been an issue with the vehicle’s sensors, which failed to recognize the pedestrian when they scanned her.

Of course, more testing is needed to uncover all of the potential, and sometimes dangerous, quirks of these self-driving cars before they are officially released to the public. However, another tragic incident like this could permanently derail any potential successes.

At 4N6XPRT Systems, our powerful accident reconstruction tools like Expert AutoStats have helped insurance companies, law enforcement agencies and attorneys to establish fault in car accidents for more than 20 years. Contact us to learn more about any of our software tools today.