
Truths, Falsities, and Blurs

O.G. Rose
18 min read · Oct 5, 2021

Internally Consistent Systems, Suskind Donkeys, and a Needed Third Category


We tend to think of informative statements as either truths or lies, but what about something we can’t identify as either a truth or a lie? What about something that makes identifying “what is the case” harder, that blurs the line between “truth” and “falsity” beyond recognition and/or that convinces us that we cannot recognize the difference? That doesn’t seem to be a “falsity,” and yet right now that is the only term we seem to have at our disposal, suggesting the need for something else. In this short paper, the term I suggest is “blur,” and such a term is especially needed in our Internet Age.

Some characteristics of a “blur”:

1. Is not clearly “true” or “false.”

2. Makes the difference between “truth” and “falsity” (more) difficult to determine.

3. If the truth or falsity of the claim can be determined at all, it can be confirmed or denied only from a position of expertise, authority, radical investment, etc. Relative to the people in the discussion, the premise is thus theoretically falsifiable but not practically falsifiable.


A “blur” is not merely “between truth and falsity,” for something “between truth and falsity” is false. It is “indeterminable” at the time, but might not be “ultimately indeterminable,” hence why I wanted to avoid the term “indeterminable.” Yes, there is something about a “blur” which is indeed “indeterminable,” but it is also not clearly “indeterminable” when it is (first) uttered; in fact, “blurs” can seem very determinable and plausible (only for it to be determined later that this isn’t the case). But practically “blurs” are not determinable (or at least we have every reason to think they are not), and it is this gap between “the theoretically possible” and “the practically impossible” that “blurs” mostly occupy. Unfortunately, it is by residing in this gap that “blurs” are so powerful, problematic, and easy to use in service of manipulation by politicians, corporations, cults, and us.
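For readers who think in code, here is a minimal sketch of the three-way distinction (my own illustration, not anything from the essay itself; all names are hypothetical). The point is that a “blur” is a third verdict about our practical position relative to a claim, not a degree of falsity:

```python
from enum import Enum
from typing import Optional

class Assessment(Enum):
    """A hypothetical three-valued assessment: BLUR sits beside
    TRUE and FALSE rather than 'between' them."""
    TRUE = "true"
    FALSE = "false"
    BLUR = "blur"  # theoretically falsifiable, practically unverifiable (for us, now)

def assess(practically_verifiable: bool, holds: Optional[bool]) -> Assessment:
    """Toy classifier: if we cannot practically verify a claim at the
    time of discussion, it is a BLUR, even if 'ultimately' determinable."""
    if not practically_verifiable:
        return Assessment.BLUR
    return Assessment.TRUE if holds else Assessment.FALSE
```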

I

It’s mistaken to call a “blur” a “lie,” for the word “lie” implies intention and maliciousness, which doesn’t have to be the case with a “blur” — speakers might genuinely believe they are making valid points, and people who manipulate us may themselves be manipulated by their ideas and not even realize they are manipulators. The word “lie” suggests too much intention, whereas “blur” suggests that the intentions behind the “blurs” are just as hazy and “indeterminable” as the ideas themselves. It also seems to be a mistake to call a “blur” a “falsity,” for we can’t tell if the premise is true or false. But words like “lie” and “falsity” are generally all we have “at hand” today, which is especially problematic in our Internet Age of infinite information (and countless “Pynchon Risks,” as will be explained). We are currently stuck in “the dichotomy of truth and falsity,” and the dichotomy is not only inadequate but dangerous. We need Derrida.

When a politician makes numerous accusations that we can’t prove or disprove, what do we do? This is a tough question, especially if the politician has framed us as “having no right to advance” in the conversation, in our thinking, etc. until we prove or disprove the accusations. This problem is discussed in “The Conflict of Mind” by O.G. Rose, and, horrifically, even if we could “figure out” a few of the accusations, the politician could two minutes later just throw out a hundred more. With every step up the mountain, we could slide three steps back, as the politician perhaps knows — it could easily be a corrupt method for maintaining power. Moving forward, please note that what is argued about “politicians” could just as easily apply to corporations, special interest groups, authorities, friends — this is not a problem that exists exclusively in the political realm, though perhaps that is the realm where the problem is most “vivid.”

To the politicians making countless “unfalsifiable accusations,” what if we could say, “That’s a blur,” and instantly stop the politicians in their tracks? What if there were a “third category” of “sense-making” (which the society at large also understood), of assessing “the weight” of a claim, that did not lure us into “a rabbit hole” out of which we might never escape? The works of O.G. Rose call this “rabbit hole” a “Pynchon Risk,” a term which is defined in “The Conflict of Mind,” as discussed in these sections:

What is “The Pynchon Risk” (a notion inspired by The Crying of Lot 49 by Thomas Pynchon)? […] [I]t is “to investigate if x is true when there is no guaranteed point in which we’ll be able to say for sure that x is true/false.” For example, if we begin investigating the question “Does God Exist?” there is no guarantee that if we read every theology book, every religious text, every New Age Atheist book against religion, and study the question for eighty years, that in the end, we’ll be able to say for sure whether or not God Exists. To investigate questions about God’s Existence and God’s Identity is to take a “Pynchon Risk.”

A good image for understanding “Pynchon Risks” can be found in Made in Abyss by Akihito Tsukushi. In the series, there is an abyss which, once entered, places a “curse” on those who descend that makes it hard to come out. At a certain depth, it’s nearly impossible to climb out without losing either your life or your humanity. Characters enter the abyss in hopes of adventure and of accomplishing certain goals, and if they fulfill their dreams, then perhaps entering the abyss was worth it (seeing as they might never be able to return home). But if they don’t, not only does the quest prove to be a waste, but now they are stuck, and so the deeper they travel, knowing they can’t go back, perhaps the more desperate they become to find evidence that the journey was worth it, that in the end, it will all add up. The only way characters could have ever known if entering the abyss would be worth it was by entering the abyss, and standing outside the abyss at the top, before any journey ever started, longing, curiosity, and perhaps even epistemic responsibility ate at them.

If a politician says, “The media is controlled by Conservatives,” how could we verify this claim? If we replied, “That’s false,” we wouldn’t be “epistemologically justified” in making this claim, for we haven’t investigated the claim enough to conclude “it’s false.” Have we investigated “the corporate media” structure like Noam Chomsky ourselves? And even if we had, are we sure we’re right? Maybe we should look into the case for another decade — wouldn’t that be responsible? And so on — as hopefully this example makes clear, “the dichotomy of true and false” is inadequate for the task. Something more is needed.

Stuck with only the categories of “true” and “false,” all a politician must do (to control us, to maintain self-deception, etc.) is keep making claims that would require the research of a Ph.D. student to figure out, and we’re stuck. We can’t say the claims are outright false, but we also can’t say they’re outright true, and yet those seem to be the only options presently available to us. What can we do? Well, currently, it seems all we can do is start our research, during which the politician likely moves on to other things. The topic of the “reality-based community” comes to mind here, a phrase Ron Suskind reported hearing from an aide in the Bush administration:

The aide said that guys like me were ‘in what we call the reality-based community,’ which he defined as people who ‘believe that solutions emerge from your judicious study of discernible reality.’ […] ‘That’s not the way the world really works anymore,’ he continued. ‘We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality — judiciously, as you will — we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors…and you, all of you, will be left to just study what we do.’

It’s not exactly the same, but Suskind’s idea is that leaders can change reality quicker than we can study it, which means that by the time we finish our “first round of studying” to justify our position, worldview, etc., we find ourselves having to study something else, then something else, then something else — paralyzed like “Buridan’s Donkey,” but this time it’s “Suskind’s Donkey” (which can also be associated with “The Red Queen Effect”). And leaders may very well know this, and so, to maintain power, keep “changing reality” quicker than we can catch them, and just when we’re about to catch them, they hurl another “blur” at us that we don’t have the language to call “a blur.” And so we’re trapped in “the dichotomy of true and false,” unable to ever catch up (like how Zeno’s Achilles can never catch the tortoise).

We desperately need a new category of assessment beyond “the dichotomy of true and false” if there is any hope of escaping our age of endless conspiracies, fringe movements, manipulative politicians, epistemic responsibility being used against us, “Suskind Donkeys,” and the like. Right now, our language keeps us stuck in a “dichotomy of true and false,” which is proving radically inadequate for our current age. Derrida warned us about the dangers of being “stuck in a dichotomy” and about the need for deconstruction, and “the dichotomy of true and false” has proven particularly problematic for our “Internet Age.” Why? Well, again, as discussed throughout The True Isn’t the Rational by O.G. Rose, we find ourselves today presented with countless “internally consistent systems” whose validity we cannot determine “on their face.” Armed with only the language of “true” and “false,” we find ourselves having to investigate every claim that is presented to us, lest we otherwise be “epistemically irresponsible.” And so we find ourselves having to take countless “Pynchon Risks,” as those in power may know and use to their advantage. Alluding to “Deconstructing Common Life” by O.G. Rose, perhaps a role of the “common life philosopher” today is to defend people from manipulation by introducing and using the category of “blurs”? Hard to say, but certainly philosophers today need to be aware of “Pynchon Risks” and the danger they pose.

II

A “Pynchon Risk” makes “knowing what to believe” much harder if not impossible, and by extension it can make it more difficult to trust and believe in anything. The existential uncertainty and anxiety a “Pynchon Risk” can make us feel in one area is easily “transferred over” to all the other areas of our lives (seeing as all our experiences are “unified” in us). In addition to this anxiety, the internet already overwhelms us with a feeling that we are behind and need to see, watch, learn, read, etc. more, and then, on top of this feeling, we have politicians, groups, and even friends hurling at us a thousand claims we cannot determine to be “false or true” without incredible amounts of research (and even then, there are no guarantees). The wonderful book Algorithms to Live By claims our problem today is “bufferbloat”:

‘The most prevalent critique of modern communications is that we are “always connected.” But the problem isn’t that we’re always connected; we’re not. The problem is that we’re always buffered. The difference is enormous.

‘The feeling that one needs to look at everything on the Internet, or read all possible books, or see all possible shows, is bufferbloat.

‘It used to be that people knocked on your door, got no response, and went away. Now they’re effectively waiting in line when you come home.

‘You are never, ever bored. This is the mixed blessing of buffers, operating as advertised.’¹

“Bufferbloat” is a great description of how we all feel today, and with us always already feeling like “we are behind,” when a manipulative politician comes along and claims, “You don’t know if x is true,” it’s easy for us to accept this accusation; after all, habituated, we “always already feel behind.” And so, a politician can easily add a few “Pynchon Risks” to our plate, “risks” which we are primed to accept given our “bufferbloat” lifestyle. A single “Pynchon Risk” could cost us hundreds of hours to investigate (maybe thousands), let alone five, as the manipulative politician or “special interest leader” very well might know. Manipulators who want to control us realize we’re all already “bufferbloated” and so unlikely to “look deeply” into their claims: we’re already just too overwhelmed. Thus, most people will either just accept or dismiss the claims, but regardless the politicians will create uncertainty, and uncertainty is all the manipulative people need to expand control and influence. Sure, perhaps 99% of people won’t be fazed, but in a world of billions, if 1% are won to the manipulators’ side or at least “moved out of the way with doubts,” that’s likely plenty.

Though “The Authority Circle” by O.G. Rose suggested that we should introduce “authorities” into a conversation less rather than more, please note that it is not the case that “all introductions” of authority into a conversation mean that “blurs” have been introduced. In fact, authorities could help us avoid “blurs” by indeed making it clear if x is true or false — the problem comes when authorities themselves are discounted. If we cannot trust any authority (or so we’ve been told), and x premise is introduced that we’d have to be an authority to either prove or disprove, then x is a “blur.” In line with the “Legitimation Crisis” that Habermas discussed, this suggests why the widespread distrust of authorities is so problematic: if most of what we know is thanks to authorities (as “shown” in “Ludwig” by O.G. Rose), then “most of what we know” suddenly becomes “a blur.” And in the resulting “hazy and confused state,” we find ourselves paralyzed and manipulatable, as politicians and opportunists likely realize.²

III

In Logic and Epistemology, we are taught that we aren’t responsible for “proving a negative” or for investigating a position for which a person doesn’t provide supporting evidence. Bertrand Russell famously used the example of a teapot orbiting the planet to suggest that, though we don’t “know for sure” that there isn’t a teapot orbiting the planet, we can’t reasonably be expected to therefore believe there’s a teapot orbiting the planet simply because we can’t disprove it. The logic Russell invokes here is used regularly in debates about God’s Existence, but it could just as easily be used regarding conspiracies, the assertions of politicians, and the like.

And yet, at the same time, “Russell’s Teapot” doesn’t feel up to the job, for whether or not Trump was working for Russia during his presidency is theoretically knowable, but problematically it seems only “theoretically knowable” to those in positions of power who have the ability to investigate such a claim. It’s not like a teapot that even scientists cannot see, but rather it’s like a teapot that only scientists can see. This is very different from “Russell’s Teapot,” even if it seems similar. Epistemology and philosophy provide tools by which we can “avoid proving negatives” (in other words, we’re never responsible for proving “x is not real,” only for proving “x is real,” per se), but I think we are desperately missing a category for avoiding “investigating premises that are theoretically falsifiable but practically unfalsifiable (for the majority)” — this is what I want to call a “blur” (which corresponds with “Pynchon Risks”).

If we don’t trust the scientists (which perhaps is part of the claims of the manipulative politicians), that means we must become scientists ourselves, and then look into the claims. But even if it were possible for us to do this regarding one “Pynchon Risk,” it’s not possible that we could do this for all of them, and thus we are ultimately helpless. But do note that our “ultimate helplessness” here doesn’t render the claims “necessarily false” — that’s the problem, and thus why all the existential anxiety comes pouring in if all we have is “the dichotomy of true and/or false.” If we can theoretically find out if x is true or false, then why don’t we? Because we’re suffering “bufferbloat”? Because it’s “practically impossible”? Even if that’s true, it existentially feels like excuses (and certainly the manipulative politicians are going to tell us we’re making excuses). And faced with this tension, lacking the language of “blur,” the existential tension roars in, perhaps exactly as those in power want, because when we feel existential anxiety, totalitarianism becomes appealing (as discussed throughout “Belonging Again” by O.G. Rose).

In a conversation, a claim or “blur” that requires us “all to become scientists,” per se, is a claim that should be treated with less seriousness than a claim that can be examined without us launching a ten-year investigation, going off and reading ten books, and so on. No, this isn’t because “blurs” are necessarily false, but because “the conditions” we must meet to determine if they are true or false are so demanding that we can’t possibly meet them without incredible effort over incredible amounts of time. This means the claims make us vulnerable, for as we investigate the claims, the politicians can go off and do what they want, unimpeded while we are busy investigating. It is completely possible that a politician could utter “blurs” precisely to keep us occupied while they abuse their power, and since this is a realistic possibility, we are justified not to investigate the “blurs” and to instead put the burden of “epistemic responsibility” on the shoulders of the politicians. Especially if the points they are making “demand something of us,” then they must bring us better evidence, present us with “falsifiable” arguments, etc.; otherwise, we’re justified to move on.

IV

We might be tempted to speak “blurs” in our own lives precisely to avoid critique — a fear of vulnerability could be a reason we use them, as a politician might use them as a way to maintain power. Either reason is a problem, though please note that I’m not claiming “a blur” should always be outright dismissed “as false,” for that would be to treat a “blur” as a “falsity,” which would put us right back into the “dichotomy of true and false” that we’re trying to evolve and escape. To identify “a blur” must never become practically equivalent to saying, “That is false,” which I think is a mistake that’s happening right now: it seems like it’s helping us escape the problem, but it is in fact only embedding us deeper into “the dichotomy of true and false” (which is perhaps especially problematic, for it feels like a solution when it worsens the problem). Rather, when we say, “That’s a blur,” we must mean something like “That is a claim that will end this discussion if we treat it with the same weight as the other claims we’ve made, so for now the claim must be off the table.” Also, at the end of a discussion, we could review and see that one person’s argument entailed two blurs (for example), while the other person’s argument entailed ten, and so we could decide to treat the argument of the first person “with more seriousness” than the argument of the second person. This doesn’t mean the second person was wrong, but it does mean the second person did not meet the same “epistemological standard,” and we simply must give the first person “more weight” if we are to function in our Internet Age. Otherwise, we’re just too vulnerable to manipulation — such a “weighing system,” even when wrong, is “practically necessary.”
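As a sketch of the “weighing system” just described (my own illustration, assuming claims have already been tagged as blurs; the function and data are hypothetical, not anything from the essay):

```python
def weigh_speakers(claims_by_speaker: dict[str, list[str]],
                   blurs: set[str]) -> list[tuple[str, float]]:
    """Toy heuristic: rank speakers by the share of their claims that
    are not blurs. A lower score does not mean a speaker is wrong,
    only that their argument met a lower epistemological standard."""
    scores = []
    for speaker, claims in claims_by_speaker.items():
        non_blur = sum(1 for claim in claims if claim not in blurs)
        scores.append((speaker, non_blur / len(claims) if claims else 0.0))
    return sorted(scores, key=lambda pair: pair[1], reverse=True)

# Example: two speakers who each make twelve claims; A utters two
# blurs, B utters ten, so A's argument is given "more weight."
```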

Please note that if a person utters “a blur” but then proceeds the next day to go write out the case, gather the evidence, etc., to unveil the “blur” as a truth or a falsity, then suddenly the “blur” becomes a “truth” or a “falsity” — this “flip moment” is always possible, and why we can’t treat “a blur” as a synonym for “a falsity.” People who utter “blurs” are not necessarily liars — they may frankly be appealing to completely valid authorities who are completely right (though it’s just not investigable at the moment of the discussion) — but it is still the case that “blurs” stifle discussions. “Blurs” should be avoided as much as possible, or at least acknowledged as “blurs” so that people in the discussion understand “the rules” of what is being discussed, and thus can all be on “the same page.” Furthermore, people after the discussion can know what they have a fair right to ignore, what they are responsible for proving, and so on. All in all, the language of “blurs” will help organize the discussion and emotionally stabilize the participants.

As has already been mentioned, if I utter “a blur,” I’m responsible for proving it: those whom I’m speaking with are not responsible for sorting out my “blurs.” When a manipulative person utters a “blur,” he or she may act like the listeners are responsible for investigating the “blur,” shifting the burden of proof and epistemological responsibility onto others, but the listeners should never allow this, and hopefully the term “blur” will help them not fall into being responsible for a “Pynchon Risk” because of someone else. If I recognize that a person is uttering “a blur,” then I can call it such and keep the “epistemic responsibility” on the shoulders of the utterer. If that person won’t do the work and can’t present the evidence, then I’m justified to ignore “the blur” and assume it was (practically) manipulation (otherwise, I’d fall into a “Pynchon Risk”). Maybe it wasn’t, but I am epistemically and “practically” justified to act “as if” that was the case, given that otherwise I would be vulnerable to epistemic manipulation.

Those who utter “blurs” are responsible for those “blurs,” and though it is the case that we cannot entirely avoid “blurs” (as we can’t entirely avoid “relying on authorities”), at least by keeping the responsibility for “blurs” “squarely on our shoulders,” we’ll be more careful when we introduce them into a discussion. We certainly won’t introduce them unless we’re prepared “to put in the work” they demand, and when we do, we’ll also do so with some healthy “reverential fear,” seeing as we toy with a “Pynchon Risk.” Sometimes, taking this risk can’t be helped, but if we realize how serious the risk is, we’ll be more likely not to take it unless it’s utterly necessary.

V

Is there a better term than “blur” for what I’m trying to describe? Maybe, but I like the term “blur” because it describes a claim which “blurs together truth and falsity,” or else makes things “too hazy to determine truth and falsity.” Also, “blurs” tend to “blur past us,” meaning they are said quickly and then the speaker moves on: while we’re stuck investigating “the blur,” the manipulative politician who dropped it has moved on to fulfill his/her agenda (“Suskind Donkeys,” compelled by “epistemological responsibility”). Derrida can save us here, but we have yet to realize we need him, too busy descending into “the abyss.”

“Blurs” make it hard to “see clearly,” and in order to regain “clear sight,” we are motivated to launch a deep investigation into the subject matter to figure out “what is the case.” “Blurs” can be uttered swiftly and carelessly, and they ultimately “blur our sense of reality”; to regain a sense of “stable footing” (to, in a way, feel like “God isn’t dead,” to allude to Nietzsche), we have to take a “Pynchon Risk.” Without the language of “blurs,” I fear that the existential uncertainty of “blurs” will cause us to take “Pynchon Risks” that we shouldn’t, and then it will easily be too late for us. We’ll be gone, down into the abyss, like all those who have been lost to cults, Pizzagate, QAnon, and worse.³

The more seriously we take “Pynchon Risks,” the clearer it becomes that we need “new heuristics” for operating in the world today beyond “the dichotomy of true and false.” That dichotomy can easily be used against us to trap, existentially overwhelm, and manipulate us, and the hope is that the term “blur” will help us avoid such a fate. At the very least, as recognizing “the authority circle” could help conversations avoid unintentionally becoming “authoritarian,” hopefully recognizing “blurs” could similarly help us avoid stumbling into “Pynchon Risks” with every other step. If that recognition were to come to pass, I can’t help but think the spread of conspiracies would finally begin to slow.

Conspiracies, “Pynchon Risks,” “the problem of internally consistent systems” — all these unveil the inadequacy of traditional epistemology and logic. We need to evolve intellectually, and a good step would be to expand our thinking from “true and false” to “true, false, and blurring.” Without that third category, we are vulnerable to endless manipulation and existential anxiety, which will place us in a state in which all we feel we can do is appeal to power for liberation from the anxiety. And that may be exactly what those who uttered the “blurs” wanted all along. And then what will happen?

.

.

.

Notes

¹Christian, Brian and Tom Griffiths. Algorithms to Live By. New York, NY: Picador, 2016: 226.

²Worse yet, “authorities” might use “blurs” to their advantage, realizing that we need authorities to escape the manipulation of authorities (hence “the authority circle”), and thus that we are ultimately helpless and vulnerable. Can some people have this kind of powerful “authority” and not use it to their advantage? Let us hope so, but if power corrupts (and is more likely to corrupt the greater the power grows), then we are in trouble and need a desperate “shrinkage” of system size that might no longer be possible.

³Perhaps one day a computer with enough processing power will be invented that can instantly investigate all the claims we encounter? We in our finitude cannot investigate all “Pynchon Risks,” but perhaps supercomputers could, and then “the dichotomy of true and false” could be redeemed and not constantly make us vulnerable to manipulation. But that day is not here now, and believing in a “future salvation” could play into making us stay in “the dichotomy of true and false” today, always believing redemption is nigh.

.

.

.

For more, please visit O.G. Rose.com. Also, please subscribe to our YouTube channel and follow us on Instagram and Facebook.
