Would you rather... [moral dilemma ver.]
There are a lot of strands to this.
Firstly, the whole thing relies on putting complete faith in destiny (five of the people *must* become inventors, the other five *must* become mass murderers), which I'm not so sure about. The story requires some suspension of disbelief from me, but I'll do my best.
The relative probably doesn't hold any greater value than a single average stranger. I might have a greater emotional attachment to them, but it would be very self-centred to make my decision based on that. If I want to make the best moral decision, I have to cast aside my own personal attachment to all the people involved.
I don't believe in the importance of individual revolutionary people in the greater scheme of things. Especially in the modern world, there's a limit to how much a lone genius can accomplish. You need a large team of people to achieve anything truly revolutionary, and the thing about large teams is that any one member is dispensable without impacting the whole. So the revolutionary inventors aren't really much more important than the average person in terms of contributing to the common good (and that's assuming technological progress even constitutes the common good in the first place), although killing them is a significant instance of suffering in itself.
The mass murderers, likewise, make only a small contribution to overall suffering. By the FBI's definition, a mass murder involves four or more victims, so five mass murderers will kill at least 20 people between them. 20 lives aren't many out of the over 7 billion people on Earth, but it's still suffering.
You can write it as an equation: minus 10 lives, plus at least 20 lives, plus 1 life for the relative you decided not to kill, minus 'x' technology. -10L + (≥20)L + 1L - xT = (≥11)L - xT. That means saving at least 11 lives at the cost of an unknown amount of technological progress. So the moral question, once you unload it from personal emotional attachment, is this: what do you value more, 11 or more human lives or an unknown (but probably small) amount of technological progress? That's a difficult question, because you can't objectively compare technology to human lives. They're fundamentally different things. But I think those human lives are more valuable. So I'll run over the 10 people, please.
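The tally above can be sketched as a toy calculation (variable names are my own; the technology loss is left as a comment because it can't be put on the same scale as lives):

```python
# Toy tally of the dilemma's lives ledger.
people_run_over = 10      # 5 future inventors + 5 future mass murderers
victims_spared = 5 * 4    # FBI definition: at least 4 victims per mass murder
relative_saved = 1        # the relative you chose not to kill

net_lives = -people_run_over + victims_spared + relative_saved
print(net_lives)  # at least 11 net lives saved, minus 'x' technology
```

Running it gives 11, the lower bound used above.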
--
Anonymous Post Author
9 years ago
This is some pretty fair logic, too. What if the inventors came up with technology that later saved millions of lives, though?
Just food for thought.
--
dom180
9 years ago
If the inventors whose lives I'm considering ending were indispensable to their projects, and I knew all that to start with, then the good utilitarian decision would probably be to spare the inventors (and, with them, the murderers). That means I would have to kill my relative, which sucks for me but is great for the good of humanity.