Causes of Social Dilemma: A Case of Autonomous Vehicles

Abstract

The artificial intelligence (AI) in fully autonomous vehicles (AVs) may have to face dilemmas in which it must choose between two evils, such as running over pedestrians versus sacrificing the vehicle and its passengers in order to save them. This moral dilemma may in turn cause a social dilemma: if the AVs people regard as moral differ from the AVs they wish (or demand) to buy, and the latter are immoral, the market will eventually be filled with immoral AVs regardless of people's ascribed moralities. Using an online survey that yielded 14,829 valid responses from all over Japan, we find that this social dilemma will occur in Japan as well as in the U.S. We also examine the relationship between morality and the relative intention to buy “moral AVs,” and explore the factors behind the social dilemma. We find that the more credible AVs are, the more likely the social dilemma is to occur. This finding implies that such social dilemmas may not be resolved until the number of car accidents reaches zero.
