The Trolley Problem

I’m sure most of you are familiar with the trolley problem. It’s fairly famous in our cultural imagination by now, so even if you think you aren’t aware of it, I bet it will sound familiar.

Briefly: imagine an out-of-control trolley speeding down the track. In front of it, tied to the line, are five people. Between the trolley and the five is a switch. You aren’t close enough to untie anyone, but you are close enough to flip the switch before the trolley gets there and save all five.

The problem is, the line you’d switch the trolley to has one person tied to it. So if you flip the switch, you are condemning that person to death.

Do you choose to save the five and condemn the one, or do you choose not to act and let five die? Surprisingly, many people choose not to act, sparing themselves the knowledge that they directly caused someone’s death, even if their inaction allows five to die.

There are countless variations meant to complicate the choice: what if the one person is a child and the five are elderly? What if the one is a woman and the five are men? What if the one is an innocent person while the five are criminals? And on and on.

In the end, it always boils down to the same brutal question: who lives and who dies? Will you directly kill one, or indirectly allow many to die? It is the quintessential no-win scenario.

And far from being a pointless philosophical exercise, the trolley problem has real-world applications. Law enforcement officers can be thrust into sudden situations like this, where there is no time to sit and think it through; without the luxury of abstract debate, they must act instantly.

Driverless cars bring the problem into our own lives. Imagine your car is cruising along when an unavoidable accident unfolds. Staying the course will kill you, slamming on the brakes will kill the driver behind you, and veering left or right will kill pedestrians. Someone dies no matter what the car chooses. So what should the AI decide? Should it scan ages, calculate probabilities, and sacrifice the oldest? Or something else?

This is real life: there is no comic book cheat code. You can’t somehow stop time or speed up and save everyone. Someone has to die.

Most people would rather not think about the trolley problem. It’s uncomfortable, and it feels safer to believe we’ll never face it. But technology and circumstance suggest otherwise.

So — what is your answer to the trolley problem?

Hi there! David is an American teacher and translator lost in Japan, trying to capture the beauty of this country one photo at a time and searching for the perfect haiku. He blogs here and at laspina.org. Write him on Bluesky.




25 comments

You figure out how to save everyone... fast.


The "superman" solution isn't really a solution.


You have to ask, "Why are they tied to the rail in the first place?" That means there are people there. If you have time to ask why they are there, then you probably have time to save them all.

Been there, done that in real life. Everyone lived.


Humm! Like you said already, you can't pause time. In situations like this, everything happens so quickly. But hey! I can't think of what I would do at the moment, and of course I really hope I never encounter such a situation.


That reminds me of I, Robot with Will Smith, and the robot that saved him instead of saving the little girl. It was all based on probabilities. Numbers can be so cold.


Yeah! I watched that movie recently and, to be honest, I think the robot made the right choice too. If Will had died, goodness only knows what would have happened with the robots in the end, since he played a major role in saving the world from the control of AI.

I just think it was neither a right nor a wrong choice, as long as it wasn't within the robot's power to save both.

@hopestylist here!


Right! Then it would have been the whole world getting screwed, not just a little girl dying.


Exactly! And that wouldn’t have been something anyone would have wanted.

It’s sad the little girl died, but that wasn’t the robot’s fault. If she had had a better chance of survival, the robot wouldn’t have hesitated to save both of them. It was simply beyond its capabilities.


They can be. But what else is there to go on?


Yeah, just look at insurance actuaries or whatever they are called.


I am sure you have heard of the Kitty Genovese story, the event most often used to illustrate the "bystander effect".

https://www.gothamcenter.org/blog/a-new-york-story-kitty-genovese

https://pmc.ncbi.nlm.nih.gov/articles/PMC8692770/

https://en.wikipedia.org/wiki/Murder_of_Kitty_Genovese

The horrific nature of the event is perhaps more relevant today.


Yep, I remember reading about that at university. A terrible case. It's crazy to me just how often the bystander effect is shown to be a thing.


Hmm... some hard choices. I would think it'd be an easier choice if the one was a kid and the five were adults. Still, making a choice would be hard.


When it comes to something that is beyond my control, I just learn to live with the consequences. I’m sure no one will ever be happy that someone dies in such a situation, but since someone has to die anyway, no choice is right or wrong.


@hopestylist here!


Living with the consequences would be the real hard part.


Of course. Facing consequences is something we will always have to do, whether or not it's our fault. That guilt is there to bring us down, and the sooner we forgive ourselves, the better for everyone, because there was little or nothing we could have done.

Anyways, we just hope that we don't find ourselves in such difficult situations.


What came first, Morality or Trolleys?

What was the Trolley problem like, before trains were invented? That's what I want to know. 😋
