In robot ethics, the trolley problem is a thought experiment that raises questions about the morality of an autonomous machine's decisions.
Imagine a runaway trolley (a large, autonomous tram) is headed toward a group of five people. You are the trolley's programmer, and you can divert it onto a side track, saving the five but sacrificing the one person standing on that track.
Do you divert the trolley?