Write 1.5-2 pages, typed and double-spaced, on the ethical dilemma below. The assignment is worth 25 possible points, graded in 5-point increments (25, 20, 15, 10, 5, 0), since the assessment will be a general impression of the overall work. Any points you earn will be added as extra credit to your exam point total for the semester. You do not need to cite sources, but if there is ANY hint of copying and pasting from any source (this includes quotes), you will receive 0 credit for the assignment. In other words, this must be your completely original understanding and writing from start to finish. Upload your assignment using one of the following file types only: doc, docx, or pdf.
The Ethical Dilemma of Autonomous Cars
The dilemma is this: autonomous cars are said to reduce the number of accidents; however, if a life-and-death situation arises, the car must be programmed to make a moral decision. That decision is utilitarian in nature, since the program will seek to do the least amount of harm. Utilitarian decisions normally rest on judgments about the value of human life, for example, that person X is not as valuable as person Y. Given the way humans make decisions, with perception, values, experience, and so on all playing a role, there seems to be no way for a computer program to model human decision making, so the program will be forced to judge strictly by the numbers. There is also the question of who bears responsibility when these decisions are made.
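To make the contrast concrete, here is a minimal, purely hypothetical sketch (in Python; it is not taken from any real vehicle software, and the outcomes and numbers are invented for illustration) of what "judging strictly by the numbers" might look like: the program simply ranks possible outcomes by expected harm and picks the smallest.

    # Hypothetical illustration only: a "utilitarian" chooser that ranks
    # outcomes purely by expected harm counts. It has no notion of
    # perception, values, experience, or whose lives are involved.

    from dataclasses import dataclass

    @dataclass
    class Outcome:
        description: str
        expected_deaths: float    # estimated lives lost
        expected_injuries: float  # estimated people injured

    def least_harm(outcomes):
        # "Strictly on numbers": fewer deaths first, then fewer injuries.
        return min(outcomes, key=lambda o: (o.expected_deaths, o.expected_injuries))

    if __name__ == "__main__":
        options = [
            Outcome("swerve left into barrier", expected_deaths=1.0, expected_injuries=0.0),
            Outcome("brake straight ahead", expected_deaths=0.0, expected_injuries=2.0),
        ]
        choice = least_harm(options)
        print("Chosen action:", choice.description)

Notice everything this leaves out: who the people are, how the estimates were produced, and who answers for the result. Those gaps are exactly what your essay should explore.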
Here are a few articles on the ethical dilemma of self-driving cars (feel free to read more):
https://www.theatlantic.com/technology/archive/2013/10/the-ethics-of-autonomous-cars/280360/
https://theconversation.com/the-everyday-ethical-challenges-of-self-driving-cars-92710
https://www.theglobeandmail.com/globe-drive/culture/technology/the-ethical-dilemmas-of-self-driving-cars/article37803470/
For this assignment, discuss the moral decision-making process of autonomous cars, especially as contrasted with human decision making. Then, discuss whether you think we should have them and why. Regardless of your answer, consider the question of responsibility. This is not just legal responsibility. For instance, should an autonomous car be more responsible to its owner than to others (egoism)? Should autonomous cars be entirely public? Does responsibility fall on the owner, the manufacturer, or the algorithm writer? In short, write three good-sized paragraphs: decision making, whether we should have them or not, and responsibility.