Thursday, July 30, 2015

Module 5.4

How to Help Self-Driving Cars Make Ethical Decisions

      In an article from the MIT Technology Review website, author Will Knight delves into the ethical questions raised by self-driving cars.  Studies are now examining the decisions these cars will have to make in the future.  Chris Gerdes, a professor at Stanford University who is leading one such study, posed the question: "As we see this (driving) with human eyes, one of these obstacles has a lot more value than the other...What's the car's responsibility?"  Gerdes was referring to a scenario he had proposed, with "a child suddenly dashing into the road, forcing the self-driving car to choose between hitting the child or swerving into an oncoming van."
      That is the ethical dilemma that self-driving car programmers will have to resolve.  Ultimately the decision will come down to an algorithm programmed into the car: one outcome will have to be weighted as more or less valuable than the other, depending on the ethics (or intelligence) built into the car.  In the scenario above, how does the car decide which way to go?  Are the cars smart enough to talk to each other and avoid both the van and the child?  
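      To make the idea concrete, here is a minimal, purely hypothetical sketch of what such a value-weighting algorithm might look like. The obstacle categories, cost numbers, and function names are all invented for illustration; they are not from the article, and no real self-driving system is this simple.

```python
# Hypothetical sketch: a cost-weighted choice between maneuvers.
# All categories and numbers below are made up for illustration.

# Assumed "harm cost" of colliding with each obstacle type.
OBSTACLE_COST = {
    "child": 1000,        # hypothetical weighting
    "oncoming_van": 800,  # hypothetical weighting
    "empty_shoulder": 10,
}

def choose_maneuver(options):
    """Pick the maneuver whose expected harm cost is lowest.

    options: list of (maneuver_name, obstacle_hit, probability_of_collision)
    """
    def expected_cost(option):
        _, obstacle, p_collision = option
        return OBSTACLE_COST.get(obstacle, 0) * p_collision

    return min(options, key=expected_cost)[0]

# The dilemma from the article, with made-up probabilities.
options = [
    ("brake_straight", "child", 0.9),
    ("swerve_left", "oncoming_van", 0.7),
]
print(choose_maneuver(options))  # -> "swerve_left" under these invented numbers
```

      Even this toy version shows where the ethical weight lands: whoever assigns those cost numbers is, in effect, making the moral decision in advance.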
      Lots of questions still need to be raised and answered.  Before self-driving cars, we could blame the individual driver or call mishaps accidents.  In the future, I don't think we can call them "accidents" when the vehicles are programmed for a function.  Who would be held responsible in the event of a true accident?  The car's programmer?  

1 comment:

  1. Great article Derek. I never really thought about an autonomous car having to make decisions while in motion. It’s scary that a car like Stephen King’s “Christine” may actually become a reality. I have to admit it would be nice if a car could drive itself on a long road trip though.
