Abstract
This article addresses a dilemma about autonomous vehicles: how should they respond to trade-off scenarios in which every possible response involves the loss of life, but there is a choice about whose life or lives are lost? I consider four options: kill fewer people, protect passengers, equal concern for survival, and recognize everyone's interests. I solve this dilemma via what I call the new trolley problem, which seeks a rationale for the intuition that it is unethical to kill a smaller number of people to avoid killing a greater number based on numbers alone. I argue that doing so is unethical because it disrespects the humanity of the individuals in the smaller-numbered group. I defend the recognize-everyone's-interests algorithm, which will probably kill fewer people but will not do so based on numbers alone.
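The four candidate algorithms named in the abstract can be contrasted as decision rules over a forced trade-off scenario. The sketch below is purely illustrative and is not the author's formalization: the `Maneuver` class, its field names, and in particular the weighted-lottery reading of "recognize everyone's interests" are assumptions introduced for the example, chosen only so that the option that kills fewer people is probably, but not automatically, selected.

```python
import random
from dataclasses import dataclass


@dataclass
class Maneuver:
    """A candidate response in a forced trade-off scenario (hypothetical model)."""
    name: str
    passenger_deaths: int   # expected fatalities among the vehicle's passengers
    pedestrian_deaths: int  # expected fatalities among people outside the vehicle

    @property
    def total_deaths(self) -> int:
        return self.passenger_deaths + self.pedestrian_deaths


def kill_fewer(options: list[Maneuver]) -> Maneuver:
    """'Kill fewer people': choose strictly by body count."""
    return min(options, key=lambda m: m.total_deaths)


def protect_passengers(options: list[Maneuver]) -> Maneuver:
    """'Protect passengers': minimize passenger deaths first, total deaths second."""
    return min(options, key=lambda m: (m.passenger_deaths, m.total_deaths))


def equal_concern(options: list[Maneuver]) -> Maneuver:
    """'Equal concern for survival': a plain lottery that ignores the numbers."""
    return random.choice(options)


def recognize_everyones_interests(options: list[Maneuver]) -> Maneuver:
    """One hypothetical reading of 'recognize everyone's interests':
    a weighted lottery in which each maneuver's chance is proportional to the
    number of people it spares, so the lower-fatality option is probably chosen,
    but the choice is not made on the basis of numbers alone."""
    worst_case = max(m.total_deaths for m in options)
    weights = [worst_case - m.total_deaths + 1 for m in options]
    return random.choices(options, weights=weights, k=1)[0]


if __name__ == "__main__":
    scenario = [
        Maneuver("swerve", passenger_deaths=0, pedestrian_deaths=1),
        Maneuver("stay course", passenger_deaths=0, pedestrian_deaths=3),
    ]
    for policy in (kill_fewer, protect_passengers, equal_concern,
                   recognize_everyones_interests):
        print(policy.__name__, "->", policy(scenario).name)
```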
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 450-473 |
| Number of pages | 24 |
| Journal | Business Ethics Quarterly |
| Volume | 31 |
| Issue number | 3 |
| DOIs | |
| State | Published - Jul 2021 |
All Science Journal Classification (ASJC) codes
- Business, Management and Accounting (all)
- Philosophy
- Economics and Econometrics
Keywords
- algorithms
- autonomous vehicles
- respect for humanity
- trade-off problems
- trolley problem