A Short Piece

Rationality Is Mostly About Making Good Bets

O.G. Rose
Feb 1, 2021

On Logic, Probability, and Scope

Photo by Dylan Clifton

If A is B, and B is C, is C equal to A? Yes, that would be a rational conclusion. Now try this one: if A is B 20% of the time, and B is C 15% of the time, what percentage of the time is C equal to A? That’s a lot trickier, isn’t it? (5% sounds right, no?) Well, too bad: we live mostly in a world of probabilities, though given how rationality and logic are often discussed, you’d think we live in a world composed mostly of basic syllogisms.
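
For what it’s worth, the naive answer is 3% (multiplying 0.20 by 0.15), not 5%, and even that holds only if the two “links” are independent, which the setup never promised. Here is a minimal sketch of the chain, with independence assumed and the figures taken from the example above:

```python
import random

# A minimal sketch: chain two probabilistic links and count how often
# the full chain A -> B -> C holds. The 20% and 15% figures come from
# the example above; independence between the links is an assumption.
P_A_IS_B = 0.20
P_B_IS_C = 0.15

trials = 100_000
chain_holds = sum(
    random.random() < P_A_IS_B and random.random() < P_B_IS_C
    for _ in range(trials)
)

print(f"Naive product:  {P_A_IS_B * P_B_IS_C:.1%}")   # 3.0%
print(f"Simulated rate: {chain_holds / trials:.1%}")  # ~3.0%
```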

What it means to be rational is talked about like it’s simple and clear-cut, but I think that’s caused a lot of trouble. If I think it’s going to rain today and I bring an umbrella, but then it doesn’t rain, did I act irrationally? No, but I was wrong: it turns out it’s possible to be both rational and wrong, even though “rational” is often conflated with “right.” Similarly, if there is a 95% chance of event x happening and I bet against x and end up right, did I act rationally? Well, generally, if something is only 49% likely to happen, let alone 5%, I’m irrational to bet in favor of it (at even odds, anyway), but that doesn’t mean the bet won’t end up in my favor.
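
A minimal sketch of this “rational but wrong” point, reusing the 95% figure from above: the bettor who always takes the favored side still loses one time in twenty, and the bettor who takes the long shot occasionally wins, but neither single outcome tells us who was being rational.

```python
import random

# A minimal sketch of "rational but wrong": always betting the 95% side
# still loses 5% of the time, yet as a policy it dominates the long shot.
P_EVENT = 0.95  # the 95% figure from the example above
BETS = 10_000

wins_with_odds = sum(random.random() < P_EVENT for _ in range(BETS))
wins_against = BETS - wins_with_odds

print(f"Betting with the odds won  {wins_with_odds / BETS:.1%} of bets")
print(f"Betting against them won   {wins_against / BETS:.1%} of bets")
# The long-shot bettor does win sometimes; being right once doesn't
# make the bet rational, and losing once doesn't make it irrational.
```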

People talk as if there are two camps — irrational people and rational people — and act like problems in life have solutions, and that if we can’t figure those solutions out, it’s because we’re not thinking hard enough. But often problems in life are trade-offs and bets for one outcome versus another: we don’t live in a world of syllogisms but probabilities. To allude to another paper of mine, this isn’t a world of “A is A” but of “‘A/(A-isn’t-A)’ is ‘A/(A-isn’t-A)’ (without B),” and metaphysical schemas seem to always entail logical consequences.


As Lorenzo Canonico notes, beliefs should be treated like an investment portfolio. In his work “Is Investing the Best Model to Deal with Uncertainty?” Lorenzo writes:

I consider myself a fallibilist, because I believe truth exists even if we may never be certain of it. Specifically, I think most if not all of what I know will eventually be disproven or proven inadequate to explain major phenomena. I base this belief not just on the philosophical arguments of people like Karl Popper, Chuck Klosterman, and Annie Duke, but also on the observation of 2 trends: a) history is filled with paradigm shifts and b) our complex world is full of black swans.

Think about it this way: if you had to bet money on each aspect of your worldview, would your portfolio eventually go bust? Eventually, probably yes, since on a long enough timeline paradigm shifts and black swans undermine almost every plan.

Under “financial epistemology”, our worldview becomes a portfolio: a collection of bets on what will be disproven and when in order to achieve some material advantage. Even though we know that our investments can and eventually will fail, the key is to make our “trades” at just the right time (which is impossible) to maximize our goals.
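
Lorenzo’s “portfolio goes bust” point can be given a back-of-envelope form. The 2% annual chance of a paradigm shift below is an invented number, purely for illustration; the takeaway is that even a small per-year risk compounds toward near-certain revision:

```python
# A minimal sketch of the "eventually go bust" claim. The 2% annual
# chance of a paradigm shift is hypothetical, chosen only to illustrate
# how a small per-year risk compounds over a long enough timeline.
P_SHIFT_PER_YEAR = 0.02  # invented for illustration

for years in (10, 50, 100, 200):
    survives = (1 - P_SHIFT_PER_YEAR) ** years
    print(f"After {years:>3} years: {1 - survives:.0%} chance the belief was overturned")
```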

Something similar applies to rationality: the degree to which it is rational to do or think x is relative to probability and costs/benefits; it is not simply a matter of “is it true?”, because that’s not always determinable, and many problems aren’t so straightforward. Now, that said, if x can be determined to be true, then it should be believed regardless of the costs/benefits and risks/rewards: please note that we are approaching the question of rationality here relative to uncertainty (this is not pure pragmaticism). To the degree uncertainty isn’t present, we don’t have to approach “rationality as making a good bet,” but if we believe certainty is mostly impossible, then this role of rationality will almost always be present to some degree.
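
One way to cash out “rationality relative to probability and costs/benefits” is as an expected-value test, a standard decision-theory tool rather than anything unique to this piece. A minimal sketch, with hypothetical stakes:

```python
# A minimal sketch of betting on probability *and* costs/benefits: a bet
# is worth taking when its expected value is positive, which is why the
# same probability can be rational or irrational depending on the payout.
def expected_value(p_win: float, payout: float, stake: float) -> float:
    """Average result per bet: win `payout` with probability `p_win`, lose `stake` otherwise."""
    return p_win * payout - (1 - p_win) * stake

# Hypothetical numbers, purely for illustration:
print(expected_value(p_win=0.49, payout=3.0, stake=1.0))  #  0.96 -> worth taking
print(expected_value(p_win=0.49, payout=1.0, stake=1.0))  # -0.02 -> pass
```

Note how this squares with the earlier claim: at even odds, a 49% bet is a losing one, but shift the costs/benefits and the same probability can become worth betting on.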

Rationality and probability are mostly indivisible, and it’s likewise impossible for rational activity to occur outside a concert of other rational agents whose decisions can change what is rational for us (John Nash’s great insight). And yet we continue to teach and discuss rationality as if it were the ability to discern what’s right and wrong, true and false. In a universe of syllogisms (where “A is A” was the only metaphysics, per se), this would follow, and certainly there are syllogistic dimensions of the universe, but mostly we live with probabilities.
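
Nash’s insight can also be shown in miniature. In the sketch below (with an invented payoff table, purely for illustration), my rational move flips depending on what the other agent does:

```python
# A minimal sketch of Nash's insight: my rational choice depends on yours.
# Payoffs are invented for illustration; my_payoff[(my move, your move)].
my_payoff = {
    ("drive", "drive"): 0,  # we collide
    ("drive", "yield"): 3,  # I get through first
    ("yield", "drive"): 1,  # I wait, then go
    ("yield", "yield"): 0,  # we both sit there
}

def best_response(your_move: str) -> str:
    """The rational move for me, given what you do."""
    return max(("drive", "yield"), key=lambda mine: my_payoff[(mine, your_move)])

print(best_response("drive"))  # yield; rationality depends on your choice
print(best_response("yield"))  # drive
```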

Considering all this, to use a thought from Kennan Grant, rationality encourages a bias toward the individual more than individualism encourages a bias toward rationality (this is often gotten backward). We can make better bets regarding ourselves than regarding things beyond us, for, following Friedrich Hayek, there are too many second-, third-, fourth-, and higher-order effects, which are too complex to be well analyzed and therefore too complicated to be compellingly articulated in an exclusively rationalist framework. If we want to make good bets, it’s rational to zoom in on the individual, because the individual is more comprehensible (though not perfectly so).
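
Hayek’s worry about higher-order effects has a back-of-envelope form too. The 80% per-step reliability below is an invented number, but it shows why bets “zoomed out” beyond the individual degrade quickly:

```python
# A minimal sketch of why nth-order effects resist analysis. The 80%
# per-step reliability is hypothetical, chosen only for illustration:
# even decent per-step foresight compounds away as effects ripple outward.
RELIABILITY_PER_STEP = 0.80  # invented for illustration

for order in range(1, 7):
    chance = RELIABILITY_PER_STEP ** order
    print(f"Order-{order} effect predicted correctly: {chance:.0%}")
```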

In light of this, perhaps the emphasis on individualism in Conservatism comes from its commitment to effective, more-likely-to-be-right rationalism, versus a commitment to individualism for the sake of “self-centered” individualism. And perhaps this suggests there is little status to be won by being a Hayekian Conservative: it is a worldview that suggests we rarely can make good bets beyond ourselves, which makes us out to be helpless and self-centered out of epistemological necessity. This seems regressive and the opposite of progress, which feels like it must go outward, away from where we are to somewhere else. This isn’t to say Conservatism is right, but it is to say that an understanding of rationality that is based on probability tends to lean toward smaller systems, while a rationality based on theory can favor larger systems. The world needs theory, certainly, but theory that fails to be probabilistic is not theory I would bet on.

In conclusion, we shouldn’t judge the quality of a decision by its consequences but by the information available at the time the decision was made (and the smaller the system, the higher the likelihood the data is reliable). This is because we are dealing with bets, not guarantees, and luck can prevail against impossible odds. Rationality is mostly probabilistic; syllogistic rationality leaves the wrong impression. Rationality is about making good bets, and if we’re always right, we’re probably not being rational.

.

.

.

For more, please visit O.G. Rose.com. Also, please subscribe to our YouTube channel and follow us on Instagram and Facebook.

