A Short Piece

On Justification and Consequences to Others

O.G. Rose
5 min read · Jan 30, 2021

A Reflection on Evidence Relative to Contained Versus Uncontained Risk


As considered in “The Conflict of Mind” by O.G. Rose, the amount of justification an argument needs to be accepted should be relative to the degree that the consequences of the argument are contained and individuated versus uncontained and nonindividuated. There are “nonindividuated consequences” — consequences that I suffer because of the choices of others — and “individuated consequences” — consequences that I suffer because of my own choices (we could also say “contained consequences” versus “uncontained consequences”).

If I am arguing, “I should take a walk to the mailbox,” I am the only one who will face consequences for my choice, and thus the amount of evidence I need to justify my position can be much lower than if I argue, “We should take a walk to the mailbox.” That’s not to say the bar for the second claim must be incredibly high or unreachable, but it will be higher since it affects you and it wasn’t your idea that we take a walk. To justify taking a walk to the mailbox by myself, for example, maybe I need one good reason, while I need three good reasons for us both to do it (assuming you need to be convinced to take the walk).

Perhaps it could be argued that if I take a walk alone, I deny you time with me, but this consequence to you is very small and only a problem if you "choose" to be hurt by it. Thus, the consequences of my choice are contained unless you "un-contain" them, which is your choice and responsibility, not something that should change my standards of justification.

(Admittedly, this short reflection does not completely solve “the problem of justification” that is outlined in “The Conflict of Mind” by O.G. Rose, for we still must ask why three pieces of evidence should be needed to justify x versus four, and even if we can say “x needs less justification than y,” we still have to determine how much. However, I do think the reflection can at least help decision making, so we will proceed.)

If I am deciding, "I should buy x payment plan," the justification needed for this position is much less than if I argue, "Everyone should buy x payment plan." This doesn't mean the second argument can't be made, but it means the standards of justification are different (though by how much exactly isn't easy to say). Additionally, I believe the evidence needed to justify, say, Libertarianism is less than the evidence needed to justify Socialism (even if Socialism is superior), because Libertarianism basically wants to leave people alone (though this is a crude simplification), while Socialism basically wants to optimize humanity by not leaving people alone and instead helping them (and again, perhaps Socialism is superior — we should not conflate "leaving alone" with "necessarily better," even if a different standard of justification is appropriate).

Some may argue that the fact that argument x seeks to help many people while y only helps an individual means it is actually x that requires less evidence than y, but this first assumes that "the help is actually helpful," which is always something I'm very hesitant to assume (admittedly influenced by Thomas Sowell). Additionally, if x impacts many people and there are unintended consequences, those consequences will be far more dire (a reason to take Leopold Kohr seriously). I do not believe standards of justification should be determined relative to "intentions" but rather relative to "consequences," and to whether those consequences are contained or uncontained (though this isn't to say such things are always easy to determine). (However, this doesn't necessarily lead to Utilitarianism, as discussed in "The Value Isn't the Utility" by O.G. Rose.)

An advantage of individuated positions is that damages and unintended consequences are better contained, while nonindividuated positions create more good if they work out but more trouble if they don't. Personally, I believe this means that while an individuated position may need three pieces of evidence to justify its premise, a nonindividuated position will need, say, six (some of which must provide reason to believe that there will not be unintended consequences, or at least not severe ones).

But what if a nonindividuated position will contribute to the preservation of injustice? Shouldn't the fact that x will preserve injustice while y won't mean that y should get the lower standard of justification? No, but if we can prove that y will in fact increase justice, then y should indeed be done while x is not. Do note that saying y (a nonindividuated position) requires higher justification does not mean y should never be done; indeed, if y can justify itself and provide strong reason to believe that "its help is in fact helpful," then y should be realized.

That said, I do think the standard of justification must remain high for nonindividuated positions, because history offers many examples of people who truly believed their efforts would make the world a better place and decrease injustice, only to find out later that their efforts did the exact opposite.

I believe that if "good intentions" are used as reasons for lowering standards of justification, then tragic episodes with unintended consequences will happen more frequently. However, if standards of justification are kept high for nonindividuated positions, then the likelihood that policies intended to help will in fact help will be higher, and the likelihood of severe unintended consequences will be lower. Yes, this leaves open the question of "how high" the bar of justification must be for "non-contained" politics, positions, etc., but that is a question that must be left for "The True Isn't the Rational."




For more, please visit O.G. Rose.com. Also, please subscribe to our YouTube channel and follow us on Instagram and Facebook.
