Yes, you should save 10^100 shrimp instead of one human
Unless you're irrational or evil or both
Please free your mind of bias for a moment: do not think about what I will eventually argue; just read what I say with your sole focus on whether it is correct. I will tell you when you can be irrational again.
Here are two thought experiments:
Candy: You really want some skittles, and perhaps a soda. You can get these by driving to the convenience store a few miles away.
Shrimp Savior: For whatever reason, you can save 10^10 shrimp from 10 minutes of unbearable torture. All you have to do is sacrifice a bag of skittles, and perhaps a soda.
Merriam-Webster defines torture as “the infliction of intense pain (as from burning, crushing, or wounding) to punish, coerce, or afford sadistic pleasure.” Torture presupposes the capacity to feel intense pain, so I remind the reader that it is part of the second thought experiment that the shrimp are sentient.
Philosophy is hard, but I will hazard a conjecture about the correct answers in our thought experiments. I conjecture that in Candy, either choice is permissible—in particular, it is permissible to go drive to the store and get some skittles, and perhaps a soda. I conjecture also that in Shrimp Savior, it is obligatory to sacrifice your skittles to save the shrimp. If you disagree with the latter, then you are radically evil.
Now, this is getting super complicated, but consider a third thought experiment:
Shrimp Savior 2: The Shrimple Choice: 10^10 shrimp are going to be unbearably tortured for ten minutes, but there’s a button a few miles away that will prevent this. You can drive over to the button and press it.
I know, I know, this is so hard, but after months of deliberation, I have concluded that it is obligatory to take a nice little drive and press the button in Shrimp Savior 2. Indeed, our verdicts in Candy and Shrimp Savior provide a strong presumptive case in favor of this third verdict. In Shrimp Savior, is the verdict only plausible because we don’t assume you drove to the store to get the skittles? No. I posit that even if those skittles result from a drive to the store, you should nevertheless sacrifice them to save 10^10 shrimp from 10 minutes of unbearable torture each. So—and let us be very careful here—it appears we can even go so far as to say that it would be obligatory to drive to the store to get some skittles to sacrifice for the sake of the shrimp in Shrimp Savior. But if you should even drive to the store to get yummy skittles to sacrifice, how much clearer is it that you should drive a few miles just to press a button? We can cut out the middleman. Therefore, I claim, you are obligated to drive a few miles to press the button in Shrimp Savior 2.
Okay, just one more case until I give you permission to be irrational again. Bear with me here.
Shrimp Savior 3: The Shrimpossible Dilemma: You are the sole immortal being in a strange world. This is a world that will exist until the end of time, with largely the same political situation, economic and technological development, etc. as the United States in 2010. Everyone comes into existence and lives a normal life for a normal amount of time, but you are immortal for some reason.
As happens once every eon or so, 10^10 shrimp are going to be unbearably tortured for ten minutes, but there’s a button a few miles away that will prevent this. You can drive over to the button and press it.
I wager that the strange circumstances I added do not affect the verdict. The choice at hand is the same. Sure, sometimes external circumstances can affect a choice. But this doesn’t seem like one of those cases. So, I say, you are obligated to save the shrimp, and if you say otherwise, you are radically evil and you should be given the death penalty. If you disagree, we can run the same argument as the one for Shrimp Savior 2. I wager that if you were the immortal human from Shrimp Savior 3, you would occasionally drive a few miles to the store to get some skittles, and you would be obligated to sacrifice those skittles to save 10^10 shrimp if faced with the choice, and so we should cut out the middleman and say you’re obligated to drive to the button and press it.
Okay, you can be irrational again. I mean, you shouldn’t, but I doubt you listened to me the first time.
People say they value human life infinitely more than the lower goods, like skittles and not getting tortured. Nobody believes this. They say this because saying “Saving a life only has a finite amount of good” makes them feel icky and utilitarian and saying “A human life is infinite in value” makes them feel principled and noble. They say this because they do not know the difference between reasons that establish something to be true and reasons that make it feel good to say something.
Why do I say this? Because these people drive to the store to get skittles. According to my calculations, conservatively, one person in the US dies per ~10^8 miles driven. But math is hard, and maybe you’re a super good driver like everyone else, and maybe my empirical reasoning is wrong for some other reason, so let’s assume I’m off by a factor of a trillion. So, we may assume, deciding to drive one mile means incurring a 1 in 10^20 chance of someone dying when they otherwise would not have. Sure, you’re not a murderer if someone ends up dying—you reasonably believed the chance was minuscule. And the person decided to accept the risk by getting on the road. Well, no. Sometimes people in cars are babies. I don’t know how many people in cars are babies, so let’s say 1 in 10^10. So for each mile you drive we can say there’s a 1 in 10^30 chance a baby dies when they otherwise wouldn’t have. And some of those babies would have gone on to live full lives! I don’t know how many, so let’s say 1 in 10^10. Let’s also throw in another factor of ten billion for fun. So, when you drive a mile, you incur a risk of at least 1 in 10^50 of killing someone who didn’t choose to be in this situation and who would have otherwise lived a full life. Therefore, unless you value skittles infinitely, if you’d drive a mile to get some skittles, you do not value human life infinitely. You do not value a human life at more than 10^50 times the value of skittles. Now, 10^50 is a really big number, so this is hardly any defect of your character. But 10^50 is, famously, less than infinity.
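If you don’t trust me, check the arithmetic yourself. Here is a minimal Python sketch that chains the factors above; every exponent is one of my deliberately over-generous assumptions from this paragraph, not an empirical estimate.

```python
# Minimal sanity check of the per-mile risk chain above.
# Each exponent is a deliberately over-generous assumption from the
# paragraph, not an empirical estimate. Working in powers of ten
# avoids floating-point noise at these magnitudes.
factors = {
    "deaths per mile driven in the US (~1 per 10^8 miles)": -8,
    "fudge factor ('assume I'm off by a factor of a trillion')": -12,
    "fraction of car occupants who are babies (assumed)": -10,
    "fraction of those babies who'd have lived full lives (assumed)": -10,
    "'another factor of ten billion for fun'": -10,
}

exponent = sum(factors.values())
print(f"per-mile risk of killing a baby who'd have lived a full life: 10^{exponent}")
# -> per-mile risk of killing a baby who'd have lived a full life: 10^-50
```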
A potential objection: “I value human life infinitely because I would never choose a sure chance of someone dying over some merely finite good. This does not by itself commit me to a claim about what to do in the cases where the chance of someone dying is minuscule.” There is a decisive response to this objection that I have put you in a very good position to discern. I’m afraid some harsh words are in order: if you do not anticipate the reply I’m about to give (or at least something vaguely similar to it), you have no hope of being able to engage in ethical reasoning in cases that are actually hard. And, honestly, if you want to be good at ethical reasoning, the objection I am replying to should never have occurred to you without being immediately dismissed. If you want to be good at ethics, many, many thoughts more difficult than what I am about to explain will have to occur to you automatically, without having Flo there to explain them.
So: I made a claim about probabilities, but what about cases where a death is certain? Well, that is precisely the situation in Shrimp Savior 3: The Shrimpossible Dilemma and the analogous case where you decide to drive a few miles to get skittles. If you go for the drive to get skittles once a week, or once a year, or once an eon, then after 10^51 such periods it is almost certain that someone will die, someone who didn’t choose to be in a car and would have otherwise lived a full life, all for ~10^51 bags of skittles. And if on each individual occasion you’d sacrifice your skittles to save 10^10 shrimp from torture, that means you’d do something that leads to an almost certain death (though not in a way that amounts to killing someone, just a tragic accident) that wouldn’t have otherwise happened, all to save ~10^61 shrimp. If my math wasn’t sufficiently conservative, or if we just want to have more fun, then let’s throw in another factor of 10^40.
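To make “almost certain” concrete: with independent trips at per-trip risk p, the chance of at least one death after N trips is 1 - (1 - p)^N, which for tiny p is well approximated by 1 - e^(-pN). A minimal sketch, assuming the numbers above:

```python
import math

# "Almost certain," made concrete. With independent trips, the chance of
# at least one death after N trips at per-trip risk p is 1 - (1 - p)^N,
# which for tiny p is well approximated by 1 - exp(-p * N).
p = 10.0 ** -50   # assumed per-trip risk from the sketch above
N = 10.0 ** 51    # assumed number of skittle runs for the immortal driver

prob = 1 - math.exp(-p * N)
print(f"P(at least one death) ~ {prob:.5f}")  # ~ 0.99995
```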
“But that’s not one decision which leads to a death. It’s a bunch of individually justifiable decisions which collectively lead to a certain death.” Does your decision to be the kind of person who occasionally drives to the store for some skittles count as one decision? Would it be any different if, one day, the immortal man programmed his robot butler to drive to the store and get some skittles every now and then? Come on.
I posted this dilemma on X (formerly Twitter). Most people failed miserably—and not because they thought about the matter dispassionately and came to some counterintuitive but defensible conclusions about aggregation. You, dear average detractor, do not say you’d save a human instead of 10^100 shrimp from torture because you’ve thought about the cost you attach to saving the life of a human, seen what your actual actions commit you to with regard to its value relative to merely finite goods, and wound up getting something bigger than 10^100 as your answer—the sort of thing that would happen if you cared about being right. You say that because not saying it would make you feel icky. Unless you’re a remarkable ascetic who doesn’t make short drives for trivial reasons, you’ve already put a price on a life far lower than 10^100 shrimp.
If you think I’m presupposing utilitarianism or utilitarianism lite, read this post again and actually think this time.
It’s fairly obvious that most people’s goal in responding to these questions is avoiding icky words, not being right. They hear “Save a person or big number; caring about big number at the expense of a person is bad; therefore save the person,” and there is little sign of anything more sophisticated happening. They do not think any thoughts in response to the question that they would also think if one replaced 10^100 with 100. This is why people refuse to answer thought experiments, and why people refuse to engage in the reasoning laid out in this post without derailing the conversation into vague ethical claims that feel nice to say. It’s not because people are incapable of hypothetical reasoning; it’s because hypothetical reasoning is not a good tool for what they actually want.
It may seem weird that I care about this matter; there will never be a case where 10^100 shrimps’ happiness is on the line in this manner. But do I need to remind you why it matters that people say things that make them feel nice without regard for ethical truth, and how much blood is on the hands of those who operate like this?