By Jessie Levin (PO ’18)
As a future that includes autonomous cars becomes a reality, the way car accidents are handled will need to be reworked. On March 18, 2018, a woman walking her bike along the road in Tempe, Arizona was killed when she was struck by an autonomous Uber. The accident occurred even though the Uber had a human in the driver’s seat as an extra precaution. In this case, it is ambiguous who bears responsibility for the death: should the “driver” be at fault, or the program that guided the autonomous vehicle? Although autonomous cars appear to be safer than human driving for both large-scale crashes and small fender benders, since they do not get distracted and can react faster than humans, the technology has not yet been perfected enough to ensure absolute safety on the roads. While some critics argue that this technology should no longer be commercialized because of this fatal mistake, federal transportation agencies and technology companies should instead learn from it and implement stricter regulations governing the procedures that follow accidents, making safer autonomous cars a reality for the future. In this way, an already safer technology can be improved.
In the eight years that autonomous cars have been tested on the roads, only one other fatal accident has occurred in the U.S. That accident happened in Florida in 2016, when a Tesla on Autopilot collided with a tractor-trailer after the system failed to distinguish the white trailer from the brightly lit sky. Excluding this instance, most accidents involving autonomous vehicles have been attributed to human error rather than to the cars themselves; for example, driverless cars have been involved in accidents where they were rear-ended by conventional vehicles.
Current regulations surrounding autonomous vehicles have recently become centralized under the SELF-DRIVE Act of 2017. This bill gives the National Highway Traffic Safety Administration (NHTSA) the authority to regulate autonomous vehicles’ design and performance. Within one year of the Act’s passage, NHTSA must issue performance standards, and within two years, safety rules for manufacturers to comply with. In the meantime, manufacturers must comply with state laws. Thirty states have passed legislation surrounding autonomous cars with varying degrees of strictness. Arizona, where this fatality occurred, is among the most permissive and allows cars to operate without a safety driver.
These autonomous vehicle programs work by taking inputs about the road and its surroundings and processing them into an action plan, which can also be communicated to other vehicles or to infrastructure. As part of this processing, the programs keep logs of the conditions leading up to accidents, which can then be reviewed to improve the program.
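In rough terms, the loop described above can be sketched as code: sensor inputs are mapped to a driving action, and the most recent conditions are retained in a log for post-incident review. This is a minimal illustration only; the class, thresholds, and field names are hypothetical and do not reflect any vendor’s actual software.

```python
from collections import deque

class AutonomousController:
    """Toy control loop: process sensor inputs into an action, keep a log."""

    def __init__(self, log_size=100):
        # Ring buffer of recent sensor readings, reviewable after an incident.
        self.log = deque(maxlen=log_size)

    def decide(self, sensors):
        """Map one sensor reading to an action and record the conditions."""
        self.log.append(sensors)
        # Illustrative decision rules based on distance to the nearest obstacle.
        if sensors["obstacle_distance_m"] < 10:
            return "brake"
        if sensors["obstacle_distance_m"] < 30:
            return "slow_down"
        return "maintain_speed"

controller = AutonomousController(log_size=3)
print(controller.decide({"obstacle_distance_m": 50}))  # maintain_speed
print(controller.decide({"obstacle_distance_m": 8}))   # brake
print(list(controller.log))  # the two readings above, oldest first
```

The bounded log mirrors the article’s point: because the conditions leading up to an event are recorded, engineers can replay them and adjust the decision rules afterward.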
Imposing stricter penalties for accidents involving autonomous vehicles would allow those vehicles to stay on the road while still protecting citizens’ safety. Ensuring that a program can be modified to reflect lessons learned from its mistakes is no different from the way new drivers learn to drive. It is also in companies’ best interest to learn from their mistakes, so logically they should not oppose stricter regulations: no company wants to be in the news for another mistake, so each will do its best to prevent future accidents.
Since 64% of the American public is concerned about sharing the road with autonomous vehicles, gaining the public’s support is a large but important task. Beyond becoming comfortable sharing the road with autonomous vehicles, the public would also have to accept that the future will include them, along with the social, economic, and political implications of their adoption.
While autonomous vehicles promise to change our transportation system and lead to safer roads, they need to be thoroughly tested and vetted. Unfortunate accidents such as the recent Uber collision highlight the need to find mistakes in the system and learn from them. But even the best autonomous vehicle software will go nowhere unless society accepts it, and that acceptance will take strong effort. While there are many ways to bridge the gap, giving people a sense of ownership and information about how the technology will affect their futures will help ease fears of the unknown.