Tuesday, January 2, 2024

Rationality Bros, Expected Value (EV) Calculations, and The Real Woo-Woo

There’s a certain type of person who considers it “objective rationality” when they can enter measurable quantities into a formula, and just “vibes” or even “stupidity” when decisions are made any other way. The implication is that we’re out here, doing woo-woo and getting the world into trouble, while they’re in there, heroically calculating.

But in reality, even the rationality lovers’ own rationality theory says you should base decisions on what you want and care about. Those “measurable quantities” are where the real woo-woo is actually happening.

My most recent encounter with magical rationality thinking was reading about Sam Bankman-Fried in Zeke Faux’s cryptomania romp Number Go Up. Like a lot of people, I got obsessed with SBF while he was in the news. It had never occurred to me that I would see the ins and outs of utilitarian ethical theory debated in the popular press, but right there in the middle of the trial, his ex-girlfriend testified that “he believed that the ways that people tried to justify rules like ‘don’t lie’ and ‘don’t steal’ within utilitarianism didn’t work.”

Ha, because me too! But I decided utilitarian ethical theory is wrong, while I guess SBF went more in the pro-lying-and-stealing direction.

To extract more entertainment from the trial, I read Number Go Up. Toward the end, Faux spends some quality time with SBF and they talk about decision-making. SBF invokes the concept of “Expected Value” or “EV”. EV is like a weighted average: in a bet, it reflects the average amount you’d expect to win or lose if you played the same game over and over. SBF likes to make decisions that are EV+ — that is, the expected value is positive.

There are contexts where this is straightforward. Suppose you’re deciding whether to take a bet where you put up a dollar and, at even odds, you either lose your dollar or win three dollars. The expected value of the bet is (-1)(.5) + (3)(.5) = +1 dollar. On average, you’ll gain a dollar each time you play. Since this is positive, it’s a good bet. If your only goal is to maximize your dollars, you should go ahead and take it.
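If you like seeing the machinery, here’s that same arithmetic as a quick Python sketch. (Nothing from the book; the little helper function is just my illustration.)

```python
# Expected value of a bet: sum of probability-times-payoff over the outcomes.
def expected_value(outcomes):
    """outcomes: a list of (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# The bet above: 50% you lose your dollar, 50% you win three dollars.
bet = [(0.5, -1), (0.5, 3)]
print(expected_value(bet))  # 1.0 -- up a dollar per play, on average
```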

But there are difficulties applying EV to life, because the V in EV — “value” — represents outcomes, and it is not the same as U, utility, which represents how desirable or good those outcomes are. And it’s the U you need when you’re trying to make decisions. That’s why the standard decision-theory concept is called “expected UTILITY theory.”

U and V are not the same, because how good things are is a judgment call and varies from one situation to another. Suppose you have a million dollars and someone offers you a 50-50 bet: if you win, you get two million more dollars, but if you lose, you lose everything you have. If the “V” is measured in money, this bet is strongly EV+: the probability-weighted average outcome puts you up (-1,000,000)(.5) + (2,000,000)(.5) = $500,000.

But utility-wise, taking the bet could be wildly irrational. If that million is all your money, and it’s earmarked for your family’s well-being, you’d care far more about keeping it than about failing to gain more. For you, losing $1,000,000 means destitution and disaster, something you’d do almost anything to avoid. Failing to gain another $2,000,000 is merely too bad, not even comparable to the badness of the gap between having your $1,000,000 and having nothing. If your U is what matters, it’s a terrible bet.
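To make the V-versus-U gap concrete, here’s a rough sketch of the same bet scored both ways. The log utility function is a Bernoulli-style illustration I’m supplying, not anyone’s actual preferences; I use log(1 + w) only so that zero wealth has a defined utility.

```python
import math

def u(wealth):
    """Illustrative Bernoulli-style utility: each extra dollar matters less.
    The +1 just keeps u(0) defined."""
    return math.log(1 + wealth)

p = 0.5
keep, win, lose = 1_000_000, 3_000_000, 0   # total wealth under each outcome

# Scored in dollars (V), the bet looks great:
ev = p * (win - keep) + p * (lose - keep)
print(f"EV in dollars: {ev:+,.0f}")            # +500,000

# Scored in utility (U), it's a disaster:
eu_take = p * u(win) + p * u(lose)
eu_decline = u(keep)
print(f"EU if you take it:  {eu_take:.2f}")    # about 7.46
print(f"EU if you decline:  {eu_decline:.2f}") # about 13.82 -- declining wins
```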

You don’t have to take my word for it. “The Expected Utility Hypothesis is that rational agents maximize utility, meaning the subjective desirability of their actions.” Desirability — i.e., how much you want or care about a thing. Bernoulli argued as far back as 1738 that the very same bet can be rational for a rich person and irrational for a poor one.

At one point, SBF tells Faux that when starting up his business, he estimated an 80 percent chance of financial failure. But the magnitude of the potential financial gain was so great, he said, that even with the smallish 20 percent chance of success, the decision overall was “EV positive.” He describes this perspective as “risk neutral,” which I guess means each dollar counts for the same amount, whether it is lost or gained.
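Here’s the shape of that risk-neutral calculation, for the record. The dollar figures are my inventions, not numbers reported in the book.

```python
# Invented figures, just to show the shape of a risk-neutral EV argument.
p_success = 0.20
upside = 10_000_000_000    # hypothetical payoff if the business succeeds
downside = -1_000_000      # hypothetical loss if it fails

ev = p_success * upside + (1 - p_success) * downside
print(f"{ev:+,.0f}")  # +1,999,200,000 -- hugely EV+ despite 80% odds of failure

# "Risk neutral" means utility is linear in dollars (u(x) = x), so the
# billionth dollar weighs exactly as much as the first. Drop that assumption
# and the same probabilities can point to the opposite decision.
```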

Faux says such calculations lead to decisions that “normal people” would find “insane.” And “normal people” find these decisions make no sense because most people do not value their millionth dollar as much as their first — the utility money brings varies from person to person and depends on how much you already have. SBF says the calculations make sense within “effective altruism” — that is, he is considering not his own personal utility but rather what utility the money would hypothetically bring if he hypothetically gave it away to a cause that would hypothetically make the world a better place in proportion to the hypothetical dollar amount. There may be something to that, depending on how you interpret all those hypotheticals. But it doesn’t mean you’re being objective or rational if you just count up the dollars and ignore why the dollars matter.

Later, it gets weirder: SBF is quoted as saying he’d take a bet in which “51 percent you double the earth out somewhere else, forty-nine percent it all disappears.” Now instead of maximizing “dollars” you’re maximizing “earths.” Using standard decision theory, that is a rational perspective only if you have roughly as much positive feeling about gaining an extra earth as you have negative feeling about having no earth at all. How is that state of mind possible? I’m not even sure I want two earths.
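You can even put a number on how extreme that state of mind has to be. A back-of-the-envelope version, in my framing, with utilities normalized so that no earth counts as zero:

```python
# Normalize: u(no earths) = 0 and u(one earth) = 1. Taking the 51/49 bet
# maximizes expected utility only if
#     0.51 * u(two earths) + 0.49 * 0  >  1
threshold = 1 / 0.51
print(f"u(two earths) must exceed {threshold:.2f}")  # about 1.96

# So the second earth has to be worth at least ~96% as much to you as the
# first one. Value it any less than that, and by your own lights the bet
# is irrational.
```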

My point is that rational decision making always rests on evaluation, either of your subjective utility or, if you’re someone who believes in objective value, of your assessment of that value. Yes, you can try to quantify your evaluations, and then you can use EV (or, rather, EU) theory to hone your decision-making if you are good at that kind of quantification. But putting in numbers that correspond to the desirability of the outcomes is the hard part. Just mapping the numbers over — “two earths is twice as good as one earth” — ignores the most important pieces of information. With respect to money, to say that your millionth dollar is as important as your first requires assuming that money has some fixed inherent value and thus denying basic economic principles like the law of diminishing marginal utility.

In a scientific approach to the world, it’s people who value things. Our valuing of them determines their worth. This is how all of contemporary economics operates. So acting like you can just count up the obvious numbers, and set aside your personal evaluations — that is magical rationality thinking and the real woo-woo. Underneath any calculation, it’s utility — desirability and goodness — that makes decisions rational or not.

Of course, people evaluate the desirability of outcomes in different ways, which makes different decisions rational from different points of view. This is one reason social and public decision-making is so fraught and complicated and living together is hard. But we’re not going to make it easier by being more hard-headed about numbers — especially if the numbers themselves don’t make any sense.
