How can we teach machines a lesson in morality? It is one of the most interesting questions of our time.
The discussion about how autonomous driving could change our society, save us time, and improve traffic safety is at the heart of current media coverage. But consider a scenario in which your vehicle has no steering wheel and something malfunctions: what does the car do? Does it come to a complete halt? And if the brakes fail, the car must choose: do I crash into a wall, killing the driver, or do I swerve toward the nearest shopping mall and hope I hit no one?