The World’s First Fatal Self-Driving Car Accident: Machine and Human to Blame
Liability for accidents involving autonomous vehicles remains a developing area for protocols, regulations, and insurance policies, in anticipation of a future when fully autonomous cars will roam our streets.
How Do Cars Drive Themselves?
IoT sensors, algorithms, and connectivity are the leading technologies that make it possible for cars to drive with minimal to zero human intervention.
A combination of radar, cameras, and LIDAR guides the car and helps it navigate safely without collisions. An internet connection adds context about the surroundings, such as data on nearby cars, weather, and maps. When it comes to decision making, control algorithms and software analyze all the collected data to determine the best action to take.
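As a rough illustration of that sense-fuse-decide structure (a toy sketch, not any manufacturer's actual software; all thresholds and field names here are invented), the loop can be pictured as:

```python
from dataclasses import dataclass

@dataclass
class SensorReadings:
    """One snapshot from the car's main sensors (illustrative fields only)."""
    radar_range_m: float    # distance to the nearest object ahead, from radar
    camera_object: str      # camera classifier label, e.g. "car", "bicycle", "none"
    lidar_clear_path: bool  # whether LIDAR sees an unobstructed lane ahead

def decide_action(readings: SensorReadings, speed_mph: float) -> str:
    """Toy control policy: fuse the three sensor inputs into one action.

    Real systems combine probabilistic estimates from many sensors over time;
    this only shows the fusion-then-decision shape of the pipeline.
    """
    if readings.camera_object != "none" and readings.radar_range_m < 15:
        return "emergency_brake"   # object confirmed by camera and close on radar
    if not readings.lidar_clear_path:
        return "slow_down"         # lane not clear according to LIDAR
    if speed_mph < 40:
        return "accelerate"
    return "maintain_speed"

# Example: radar sees an object 10 m ahead that the camera labels a bicycle.
print(decide_action(SensorReadings(10.0, "bicycle", False), 38.0))  # emergency_brake
```

The point of the sketch is that no single sensor triggers an action on its own; the decision layer cross-checks them before choosing what the car does.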
Currently, there are plenty of cars on the road with self-driving features and technologies; but the apogee of the autonomous technology is yet to be reached.
The Society of Automotive Engineers (SAE) has defined six levels of driving automation. The globally recognized standard is as follows:
- Level 0: No automation.
- Level 1: Driving Assistance.
- Level 2: Partial Automation.
- Level 3: Conditional Automation.
- Level 4: High Automation.
- Level 5: Full Automation.
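The six levels above can be captured in a small enum. The supervision rule sketched below (driver supervises at Levels 0–2, must be ready to take over at Level 3, is not needed above that within the system's operating domain) follows the broad SAE J3016 distinctions; the function and its wording are illustrative:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """The six SAE driving-automation levels."""
    NO_AUTOMATION = 0
    DRIVER_ASSISTANCE = 1
    PARTIAL_AUTOMATION = 2
    CONDITIONAL_AUTOMATION = 3
    HIGH_AUTOMATION = 4
    FULL_AUTOMATION = 5

def human_role(level: SAELevel) -> str:
    """Who is responsible for monitoring the road at each level."""
    if level <= SAELevel.PARTIAL_AUTOMATION:
        return "driver must supervise at all times"
    if level == SAELevel.CONDITIONAL_AUTOMATION:
        return "driver must be ready to take over on request"
    return "no human supervision required within the design domain"

print(human_role(SAELevel.PARTIAL_AUTOMATION))
```

The key boundary is between Levels 2 and 3: below it the human is always driving, even when automation assists; above it the system takes primary responsibility in defined conditions.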
Level 1 capabilities have been around since the 1950s with single assistance functions like cruise control. Level 2 capabilities started to emerge in the 2000s, combining two or more simultaneous driving functions, such as adaptive cruise control with lane-keeping.
Most self-driving cars on the streets today correspond to Level 2 capabilities; Level 3 conditional automation, in which the car handles the driving under limited conditions while the driver stays ready to take over, is only beginning to reach the market.
Level 4 autonomous driving is still under rigorous testing: most of the driving is automated while the passenger enjoys the ride. Some experts believe that Level 4 automated cars may be publicly available within the coming decade, after passing all the regulatory checks.
Level 5 corresponds to a fully automated car in which the machine controls the driving with no human intervention. Giant auto manufacturers, tech companies, and ride-hailing service providers are racing to create the ultimate Level 5 fully autonomous vehicle.
Such a driving experience remains highly futuristic, and we might not see it before 2060!
While we are still far away from building a fully self-driving car, several accidents involving semi-autonomous vehicles have already proven that this technology needs a carefully crafted and comprehensive regulation to deliver on its promises, especially when people’s lives are at stake.
The first pedestrian death caused by a self-driving car became the first practical test of assigning accountability for autonomous vehicles.
A year after the 2018 fatal Uber crash in Tempe, Arizona, the investigation led by the National Transportation Safety Board (NTSB) in the US revealed that both a human mistake and a machine error caused the fatality.
The NTSB report said that the Uber vehicle, operating with a safety driver aboard, had been in autonomous mode for 19 minutes and was traveling at about 40 mph when it struck 49-year-old Elaine Herzberg as she was walking her bike across the street.
The car’s radar and lidar sensors detected Herzberg about six seconds before the crash. They first identified her as an unknown object, then as a vehicle, and then as a bicycle, each time adjusting its expectations for her path of travel.
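That sequence of reclassifications hints at why the detection did not translate into avoidance: if each new label restarts the object's predicted path, the tracker keeps discarding motion history. The following is a toy illustration of that failure mode, not Uber's actual code:

```python
# Toy object tracker: each time the classifier changes its label for an
# object, the motion history used for path prediction is rebuilt from
# scratch, so repeated reclassification starves the predictor of data.
class TrackedObject:
    def __init__(self):
        self.label = None
        self.history = []  # past lateral positions, used to predict a path

    def update(self, label, position):
        if label != self.label:
            # Reclassification: restart tracking, discarding prior motion.
            self.label = label
            self.history = [position]
        else:
            self.history.append(position)

    def predicted_crossing(self):
        # Needs at least two points to infer movement across the lane.
        return len(self.history) >= 2 and self.history[-1] > self.history[0]

obj = TrackedObject()
# Label flips at every step, as in the report's unknown → vehicle → bicycle
# sequence, while the object steadily moves across the road.
for label, pos in [("unknown", 0.0), ("vehicle", 1.0), ("bicycle", 2.0)]:
    obj.update(label, pos)

print(obj.predicted_crossing())  # False: history never accumulated
```

With a stable label, the same three updates would have produced enough history to predict a crossing; the instability of the classification, not the sensors, is what the sketch isolates.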
The investigators’ findings blamed the ride-hailing company, the safety driver in the car, the victim, and the state of Arizona altogether.
The investigation report mentioned “the failure of the Uber self-driving vehicle operator, Rafaela Vasquez, to monitor the road and the automated driving system,” and said that she was “visually distracted throughout the trip by her personal cell phone.”
At the same time, the investigation called out the “inadequate safety risk assessment procedures” at Uber’s Advanced Technologies Group, especially regarding the “lack of a safety division within ATG.”
It also blamed Uber’s “ineffective” monitoring of vehicle operators, which failed to ensure that drivers complied with the company’s policies.
The report also noted that the victim, Elaine Herzberg, was found to have methamphetamine in her system, and that her impairment may have led her to cross the street outside the crosswalk.
Lastly, Arizona’s “insufficient” policies to regulate automated vehicles on its public roads were found to have contributed to the crash, according to the investigation.