Background
The concept of morality, understanding the difference between right and wrong and guiding one’s behavior toward doing what is right and avoiding what is wrong, has long been of keen interest to psychologists. From the early work of Piaget and Kohlberg exploring how moral reasoning changes with age, the topic has attracted sustained attention. That attention has only grown as we enter an age in which “intelligent machines” play a greater role in guiding and, in some cases, fully determining an individual’s behavior.
One class of intelligent machines where this issue is receiving increased focus is “self-driving” or autonomous vehicles (AVs). All major automobile manufacturers now offer features to assist drivers in operating their vehicles, ranging from proximity sensors to automatic parking systems to lane-guidance alerts and corrections. Some current automobiles go a step further, monitoring how often the driver strays from the driving lane and sending an alert suggesting the driver stop for rest soon – even providing navigational help to find a nearby rest area.
The ultimate goal appears to be a fully autonomous vehicle, one where all basic operations as well as critical driving decisions are made by the car. Most decisions a driver makes are based on agreed-upon rules of the road (e.g., stopping for a red light, keeping a safe distance from the car in front, parking between marked lines, obeying the speed limit). Knowledge of these rules is universally required to obtain (and keep!) a driver’s license. However, other driving decisions are not “rules-based.” Sometimes drivers must make split-second decisions where the lives of the driver, their passenger(s), other drivers, and/or pedestrians may be at stake. For example, if you are driving at night along a tree-lined road and see a person in the road ahead whom you cannot avoid hitting, what do you do? Do you swerve off the road, risking your life by hitting a tree and/or going over an embankment – or stay the course and save yourself but risk the pedestrian’s life by driving directly into them? These situations require you to make a moral decision about the value of lives in a context that may involve many variables. Some examples:
• The pedestrian is a child.
• You have your spouse in the car with you.
• A friend is in the car with you.
• You have your spouse and your two children in the car with you.
• The pedestrian is an old man.
• There are three pedestrians in the road.
• You are an elderly driver.
• You have your grandparents in the car with you.
The list of such variables is long – even longer when you consider combinations. And there is research suggesting that responses to these variations differ for people from various cultural, ethnic, and socioeconomic backgrounds.
A key advantage of AVs is that they will be able to react to critical, life-threatening driving situations in a split second – much faster than vehicles under human control, regardless of how attentive the driver is. Because of this, it is likely that decision choices will need to be “programmed” into the AV. In cases like the swerve-or-stay situation above, the programming may have to make a moral judgment about the value of the lives of the people involved. Realistically, the programming should follow some moral code that society agrees is the best option. However, is there such a “universal moral code” that all AVs should be programmed to follow? And, if not, who should be responsible for determining the “proper” moral code for a specific situation? Research suggests that this intersection of AVs with questions of right and wrong behavior in a driving accident involving potential personal injury and/or death will be one of the most challenging issues on the path to making fully autonomous vehicles a reality.
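To make the difficulty concrete, consider a deliberately oversimplified sketch in Python. It is not any real AV system: every name, weight, and rule below is a hypothetical assumption. What it illustrates is that any programmed decision rule must answer the moral questions above with explicit numbers.

# Toy illustration only: a "programmed" moral code reduces to explicit
# numeric choices. All names and weights here are hypothetical assumptions,
# not taken from any real autonomous-vehicle system.
from dataclasses import dataclass

@dataclass
class Outcome:
    occupants_at_risk: int    # people inside the vehicle
    pedestrians_at_risk: int  # people outside the vehicle

def expected_harm(outcome: Outcome, occupant_weight: float,
                  pedestrian_weight: float) -> float:
    # The weights ARE the moral code: equal weights assert that all lives
    # count the same; any other choice privileges one group over another.
    return (outcome.occupants_at_risk * occupant_weight
            + outcome.pedestrians_at_risk * pedestrian_weight)

def choose_action(stay: Outcome, swerve: Outcome,
                  occupant_weight: float = 1.0,
                  pedestrian_weight: float = 1.0) -> str:
    # Pick whichever action has the lower expected harm under the given weights.
    stay_harm = expected_harm(stay, occupant_weight, pedestrian_weight)
    swerve_harm = expected_harm(swerve, occupant_weight, pedestrian_weight)
    return "stay" if stay_harm <= swerve_harm else "swerve"

# The swerve-or-stay example above: staying risks one pedestrian; swerving risks the driver.
stay = Outcome(occupants_at_risk=0, pedestrians_at_risk=1)
swerve = Outcome(occupants_at_risk=1, pedestrians_at_risk=0)

print(choose_action(stay, swerve))                       # equal weights: a tie, broken here in favor of "stay"
print(choose_action(stay, swerve, occupant_weight=0.5))  # weighting occupants less: "swerve"

Even this toy version forces its author to decide, in numbers, whose life counts for how much – which is exactly the “who decides” question raised above.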