Ainsley Schuler April 27, 2018

Submission by Jackson Miller

This story is about the first pedestrian death caused by a self-driving car, which occurred in Tempe, Arizona. 49-year-old Elaine Herzberg was struck and killed by the car on March 18th while walking her bike across the street. The major dilemma in this story is who is at fault for the death of the pedestrian. The car was in autonomous (self-driving) mode at the time of the incident. However, there was a person sitting in the driver's seat as it occurred. 44-year-old Rafael Vasquez was on his phone at the moment the car hit the woman. The family wants to press criminal charges against both Rafael and Uber.

This story has several immoral elements to it. First, it was immoral of Rafael to be on his phone while in the driver's seat of the vehicle. Although the car was in self-driving mode, Rafael should still have been aware of his surroundings and paying attention to the road. The car was being tested, and he was in it precisely so that he could take over if something went wrong. Another argument is that these self-driving cars shouldn't be on the road at all unless they are completely ready. The main flaw in the current technology is that it can't make split-second decisions the way a human can in sudden, unexpected situations. If it came down to it, the system couldn't weigh the choice between hitting a human and hitting a signpost. Many would argue that it is immoral to put these cars on the road if they aren't able to avoid hurting humans when avoidance is possible.

Clearly someone will have charges pressed against them in this case. However, the case can also be examined through the lenses of utilitarianism and ethical egoism. Was having the car on the road in a testing mode the best idea for everyone? Was it truly for the greatest possible good? Uber's intentions were utilitarian: according to many studies, they were testing technology that could potentially reduce accidents by a large margin. The person in the driver's seat, however, expressed ethical egoism. He wasn't paying attention and was acting in his own self-interest by being on his phone rather than doing his job.

In my opinion, the person in the driver's seat should bear the main responsibility. He knowingly wasn't paying attention in a vehicle that was being tested. The technology isn't yet where it needs to be for self-driving cars to operate on the road unattended. While the vehicle's malfunction caused the accident, the "driver" was the one who could have saved the woman, but did not.