An essay featured in The Map Is Indestructible by O.G. Rose

On Conspiracies and Pandora’s Rationality

O.G. Rose
32 min read · Oct 20, 2021

If truth is theoretically knowable but not practically, the black hole must be filled, and nothing will stop us from trying.

“Pandora’s Box,” by Charles Edward Perugini. Credit: Wikipedia/Public Domain. (Pandora is secularism, and inside the box is “unbound rationality.” Can we hope that truth will emerge from the bottom?)

This piece was inspired by a discussion of the same name between O.G. Rose and Lorenzo Barberis Canonico. Notes to the discussion can be found here, and below are also some relevant points:

Episode #36: Lorenzo Barberis Canonico on Conspiracies and Pandora’s Rationality

1. As discussed with Samuel Barnes on “Truth Organizes Values,” the dream of the Enlightenment was generally that there was only “one internally consistent system,” meaning that the “coherence” of a worldview would necessarily correlate with its “correspondence” to reality. Unfortunately, it turns out that a system can theoretically be utterly “coherent” and non-contradictory and yet not “correspond” with reality at all. This being the case, there is no necessary reason why there cannot be an infinite multiplication of conspiracy theories, nor why “information warfare” cannot convince us of practically anything. If there were only “one internally consistent system which corresponded with reality,” then both conspiratorial thinking and propaganda would be much more “bound” and stoppable by rationality; instead, it turns out that rationality is mainly in the business of “coherence” versus “correspondence,” and thus rationality can make the problem worse.

Audio Summary
Discussion with Samuel Barnes on “Truth Organizes Values,”

2. Charles Taylor in The Secular Age discusses how secularism paradoxically caused a radical multiplication of religious beliefs, even though most people assumed secularism would eliminate religion from the world (James K.A. Smith also discusses this point). Indeed, there has been a “nova effect” as Dr. Taylor describes, but I believe religion is just part of what has really multiplied radically, and that would be “internally consistent systems” in general. Religions are indeed “internally consistent systems,” but so are ideologies, philosophies, and conspiracies. Secularism hoped “coherence” and “correspondence” correlated, but when secularism rose to power, it found itself only unleashing “coherence.”

3. There are numerous ways a system can be “internally consistent” and yet not correspond with reality: for example, I could arrange a hundred true facts in a way that ultimately suggests something false, and yet each fact unto itself will be true, and the facts won’t necessarily contradict one another. Also, if “the truth” of a situation entails a thousand facts, I could just discuss eight hundred of them, and thus discuss “all truth” without discussing “the truth.” In this way, it’s possible to never speak anything false and yet still not “correspond” with reality. Also, most religions entail no contradictions in their core tenets, and I would submit that conspiracies like QAnon also don’t ascribe to any premises which must be false. Yes, QAnon claimed that Trump would win reelection, and that hurt QAnon’s credibility with its followers, but notice how QAnon was instantly able to “spin the data” and provide a “reasonable explanation” for why Trump didn’t win. So it goes with all “internally consistent systems”: they can improvise and recover (especially if they avoid falsifiable claims).

Photo by Ehimetalor Akhere Unuabona

4. Systems which are “coherent” are systems we have reason to entertain precisely because they are “coherent,” but then, problematically, that “coherence” filters our experience in favor of a system which might ultimately be false. But what can we do? To live in a world without coherence would drive us insane (as discussed throughout “Belonging Again” by O.G. Rose), and that means we must ascribe to a (true or false) “system” that helps the world “feel together.” We’re caught taking a risk that, if it doesn’t work out in our favor, we may not have the capacity to tell it has worked against us, our experience so filtered by “coherence.”

5. “The death of God” unleashed countless “internally consistent systems” upon the world: it seems rationality turned out to be a Pandora’s Box that “God” kept closed.

Photo by Wesley Tingey

6. Rationality could never save us from “The Problem of Internally Consistent Systems,” though we didn’t mind “killing God” believing it would eliminate all ideologies except “the truth.” As it turns out, far from solving the problem, rationality made the problem possible, for rationality is what gives us the capacity to establish “coherence” not “correspondence.” “Correspondence” seems possible thanks to experience which we can then “translate” into rationality, but rationality divided from experience is a force that greatly worsens the problem. And in a world where today we are all “heads on sticks” — disembodied, always on screens — experience is weak. Unfortunately, as Max Horkheimer and Theodor W. Adorno discuss, this plays into the hands of totalitarians, for all they need to do is create an “internally consistent ideology” in their favor, and there will never be any hope that rationality could “correspond” with a truth that undermines what they “consistently” establish.

When “God was alive,” per se, rationality was always trying to conform to its belief in God, but now rationality finds itself trying to conform to something and yet doesn’t know what to conform to. Worse yet, rationality doesn’t really have the ability to determine what it should conform to; it can only find coherence relative to “whatever it is directed toward.” But this is very problematic, because we can easily make something false “coherent,” and so “if the true isn’t the rational” and rationality is incapable of determining “what is truth” (and relative to what coherence ought to be established), then rationality is “unbound” to make everything “coherent.” When “God died,” rationality was “unbound,” and “coherence was loosened upon the world.”

When we can determine truth (or think we can, at least), we have a standard to which to conform rationality and make it correspond and cohere accordingly. For rationality not to be a force of terror (as described by thinkers from Hume to Adorno), we need “the three c’s”: conformity, correspondence, and coherence. Since rationality can never entirely conform to God (since God is transcendent), it is not the case that “conformity” and “correspondence” are necessarily identical, but rationality must still try to “conform” to God as best it can, increasing “correspondence” along the way. So it goes with any “ultimate truth claim”: “the map can never be the territory,” but rationality can still try to “conform” the map to the territory as best it can, increasing correspondence and coherence accordingly. At “absolute conformity,” “conformity” and “correspondence” practically become identical, but arguably we never reach that point, and so “conformity” and “correspondence” must always maintain distinction in our minds.

With “the death of God” went the possibility of a meaningful distinction between “conformity” and “correspondence”: now, the two are synonyms, for even if there are Ultimate Truths, we cannot “know them” (with social support) or provide ourselves reason to think we know them. Thus, we must try to correspond with a reality that we struggle to access because we are locked in our subjectivity, and so we can’t be sure if we ever “correspond,” and faced with the resulting existential anxiety, we’ve all naturally put our focus on “coherence,” but that means we’ve all contributed to “unbinding rationality,” and so we find ourselves overwhelmed by conspiracies and madness. When we believed in the Bible, for example, we believed that there was a God who tried to communicate with us “from outside subjectivity” (objectivity), and even if we still had to interpret what God said, we still believed in the possibility of making our minds correspond with reality. No, we could never “totally conform” with God (for we ourselves aren’t God), but we could at least try and always hold our “hopefully corresponding rationalities” with an “open hand” (seeing as it’s always possible for “God to do a new thing,” etc.). In this way, we had to work to make our rationalities and theologies conform to God’s Will, correspond with God’s Design, and maintain coherence with God’s Word, and yet at the same time we always had to keep in mind that God could “break through” and change everything at any moment. In this way, even if we did fall into a “problematic coherence,” it was always possible for that coherence to be broken apart. But not now.

When “God died,” rationality was unleashed to create as many “models of coherence” as it could, and so we have indeed undergone Taylor’s “nova effect,” which has meant religions have multiplied, but so too have all “internally consistent systems.” And rationality shows no signs of stopping. As argued in “The Problem of Internally Consistent Systems,” it is possible though that “internally consistent systems” are undergoing “inflation” and losing their “stickiness,” their ability to compel us — is that our only hope? Perhaps so.

“The Three Cs” (conformity, correspondence, and coherence) keep rationality from becoming a force of terror and existential anxiety. Two out of three won’t cut it: if all three persons of The Trinity aren’t present, God is an idol. As described in “Deconstructing Common Life” by O.G. Rose, here we can see why Hume was right to stress the need for rationality to “respect common life,” for that provides rationality something to which to “conform” (and by extension “correspond”), binding it healthily. Where rationality cannot “conform” and “correspond,” though, Hume was right to warn that philosophy becomes “bad philosophy,” a force of destruction.

Can we believe in rationality anymore? If there is no truth, what else can rationality do but find “coherence?” And after the trauma of “God dying,” perhaps “overcompensating” in the direction of “coherence” was a natural response for rationality. When rationality was told the truth it always ascribed to was never really there, rationality forsook entirely the effort to find “correspondence” and “conformity,” and instead went hard in the direction of “coherence” while telling itself that “coherence” was “correspondence” (that “the true was the rational”). That way, rationality could never be hurt again…

David Hume also warned about what horrified Max Horkheimer and Theodor W. Adorno

7. Problematically, coherence feels like correspondence, and so it’s easy to believe we have correspondence when all we have is coherence. In this way, feelings work against us as we descend into an abyss, a “Pynchon Risk” (as described throughout The Conflict of Mind by O.G. Rose).

8. How do we evaluate values then according to other values that themselves must be evaluated? Ultimately, to determine which values we “ought” to ascribe to, we will need to determine “the truth,” for otherwise we cannot assess which values “align with truth.” “Truth organizes values,” as argued in The Conflict of Mind, which means if truth is indeterminable, values are indeterminable too. But that also means values are unstoppable, and if values would have us enter a conspiracy or fall into a “Pynchon Risk,” what could stop us? Truth?

9. If we “kill God” and abandon the possibility of “conformity,” we will still seek conformity, but we will instead do it with groups. It’s not by chance, then, that conspiracies have really taken off now that the internet has made it possible for people to easily find others who think like them. Generally, in the past, conspiracies tended to be believed by one person “all alone, standing for the truth,” and it was just a matter of probability that the person wouldn’t encounter anyone else in “the village” who shared the same “fringe views.” Now, if 0.1% of Americans believe in QAnon and are randomly distributed across the States, that’s around 300,000 people who can find one another and thus feel like 300,000 people. There is “epistemic strength” in numbers, and though before the internet that strength would have been missing because the 300,000 people would be scattered randomly across the country with little hope of ever meeting (they would have naturally fizzled out), now it’s unlikely the 300,000 won’t meet. Worse yet, conspiratorial groups can get both the feeling of being “outcasts” and the feeling of “inclusion,” which, combined, immensely emboldens them and helps them endure.
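To make the arithmetic behind this “epistemic strength in numbers” point concrete, here is a minimal sketch in Python. The population figure, belief share, and village size are illustrative assumptions for the sketch, not figures taken from the essay:

```python
# A minimal sketch (illustrative numbers only) of why scattered believers
# rarely met before the internet but can all find one another now.

population = 330_000_000   # assumed US population
share = 0.001              # 0.1% holding a fringe belief
believers = int(population * share)

village_size = 1_000       # assumed size of a pre-internet "village"
# Expected number of *other* believers a given believer encounters locally,
# if believers are randomly distributed across such villages.
expected_local_peers = (village_size - 1) * share

print(f"Believers nationwide: {believers:,}")                                    # ~330,000
print(f"Expected fellow believers in one village: {expected_local_peers:.2f}")   # ~1
# Offline, each believer expects roughly one local peer (easy to "fizzle out");
# online, each can in principle find all the others, which is the essay's point.
```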

Where “God is dead,” the only “conformity” we can hope for is “conformity with other people” (who are also locked in subjectivity from the world), and thus “group inclusion” is what our existential anxiety gradually starts to seek (perhaps when we begin to recognize “The Problem of Internally Consistent Systems,” as well as other epistemic problems “realized in” Neoliberalism, such as the existentially disturbing “Authority Circle”). If other people believe x, though x, y, and z are all equally “internally consistent,” there’s more reason to believe in x than y or z precisely because more people support x. The more people who believe something, the more we feel like we are “conforming to reality” in starting to think like them, because the feeling of conforming to reality and the feeling of conforming to other people are “practically identical.” Today, since we can’t “conform to reality,” we no longer know what it feels like to so conform, and thus it’s easy to think “the feeling of conforming to others” is “the feeling of conforming to reality.” And as long as everyone in the group claims they are “thinking for themselves,” we also don’t think we are falling victim to “groupthink,” especially if everyone in the group is rejected (as often happens with conspiracies). Do note that the more we suffer for believing something, the harder it becomes to ever admit it was wrong, and so the more we are rejected for believing in x, the more likely we are to keep believing in x (a truth those who want to stop conspiracies should note).

Where the idea that it’s possible to “conform to reality” is lost (regardless of whether it actually is possible or not), the number of people who “conform to groups” will rise. Since we don’t want to “groupthink,” though, the groups we are likely to join are ones that “stand out as opposing groupthink,” and those groups are likely to be conspiratorial. And so our world today is made.

Photo by Rob Curran

10. To elaborate further on point nine, if “God is dead” along with absolute truth, then there is only “horizontal conformity,” per se, no “vertical conformity.” I can “conform” with other people, with society, and cultural practices that are “at my level” horizontally, but I cannot “conform” with reality or “the world itself” that “holds me up from underneath,” per se. Thus, if I’m going to “humbly conform” with the world, my only standard according to which I can say that I do so is by “conforming with people”: there ceases to be a meaningful difference between “conforming with people” and “conformity with reality.” People almost become reality, and hence if I’m going to “be real,” I have to “be like others,” suggesting that, after “God’s death,” our very epistemic drive to “conform with reality” contributes to increasing “conformity with groups,” which could lend itself in the direction of “groupthink” and “totalitarianism.”

When our beliefs don’t feel like they “conform” and/or “correspond” with reality, then our beliefs feel like things that we created, and in that way, they can feel arbitrary and empty. This is existentially unnerving, and arguably none of us can stand feeling like our beliefs are “just beliefs,” suggesting a desperate need to feel like we are “conforming to something.” But in a world where “God is dead,” all we can conform to is one another. Thus, we are primed to be swept up into problematic groups, and in groups we can feel bolder than we should (as described in “How Freud Unites Inception, Hannah Arendt, and QAnon” by O.G. Rose).

“Belonging Again” by O.G. Rose is an extensive reflection on how we all want to “belong” but struggle to do so today with “the loss of givens.” We are always caught between wanting to be free and wanting the world to make sense, and that requires us to make tradeoffs between individual autonomy and accepting a “background” and/or “social structure” that brings coherence to life but at the cost of various degrees of liberty. Where there is “background” or “givens,” we don’t have to think so much, but that also means we are susceptible to dangerous “thoughtlessness” (which totalitarians love to exploit). How is this balance struck? Not easily, but it is a balance every society must figure out how to strike.

To capture the paradoxical balance, we all “want to be regulars but not regular”: we all want places we can go and “be one of the regulars” while we ourselves are not seen as “just a regular person.” We want to be familiar yet unique, someone who is part of the family but also still an individual. But today that is notably difficult, because Robert Putnam in Bowling Alone is right: our civic resources have collapsed. There aren’t bowling leagues or Lions Clubs we can join like there used to be, and that leaves us alone to “make community for ourselves.”

If civic communities collapsed and the internet was available to us but “God wasn’t dead,” that would be one thing, for we could still “fill our need to conform” by working to “match” our worldviews and minds with reality, which in turn would “bind us” healthily. The internet could then be used to determine reality and “absolute truth,” but now all the internet can generally be used for is to increase and intensify “horizontal conformity,” which means joining groups, possibly contributing to problematic and dangerous “groupthink.”

Rightly or wrongly, “absolute truth” once forced us to “check and balance” our desire for “belonging” against reality, so even if we wanted to join a group, if reality was x and the group stood for y, our knowledge of reality/x would keep us from joining the group. But now groups don’t have to worry about people refraining from joining them because of reality, which means groups are incredibly powerful (especially considering that they can also use “horizontal conformity” to their advantage). Will this power be used for good?

11. Even if “the death of God” doesn’t mean there is “no truth out there” to which we can conform, the world is increasingly complex by the day, and so whatever truth we might be able to find will be truth we cannot easily conform to, or truth that if we did somehow conform to, we couldn’t know we had accomplished this for sure (the complexity is simply too great). Under Globalization and Neoliberalism, systems are radically growing in complexity, worsening the dilemma of how our “epistemic responsibility” feels in constant conflict with “epistemic possibility.” (For more, please see The Conflict of Mind by O.G. Rose, notably the conclusion.)

Photo by Shahadat Rahman

12. If we could find “belonging” at a bowling alley or in our community, we might not have to turn to the internet to scratch that itch, which is to say we would not have to join “the collective consciousness” which I believe today is “made in the image and likeness” of someone like John Nash. This point will be expanded on shortly, but it’s basically to say that the internet is where we are simultaneously brilliant and schizophrenic (Deleuzian) for the same reasons. This poses a tremendous challenge, and suggests that as our collective brilliance skyrockets, so too will our collective neurosis and susceptibility to “mass psychosis” intensify. In our modern world, it’s easy to imagine more and intensifying “dancing plagues,” “witch trials,” “War of the Worlds broadcasts,” “laughter epidemics,” and “June bug epidemics,” just to name a few possibilities. To some, “mass psychosis”-events are evidence of “panpsychism,” but regardless it’s clear that the internet is “a fundamental and ubiquitous feature of social reality today,” which suggests that there is indeed a “collective consciousness” underlying our lives. For this reason, we are likely susceptible to “mass psychosis”-events in ways we can’t even possibly imagine. Though we tend to optimistically assume that the coming “Singularity” (“evolved collective consciousness”) will be a positive event by which we achieve god-like abilities, perhaps it will ultimately prove to just be the birth of “a psychotic god.” Hard to say.

The more civic resources efface and fade, the more we will have to turn to the internet to find “belonging,” and this will make the resulting collective consciousness all the stronger. I fear many examples of “brilliance” that mark human history highlight individuals who struggled with madness, and we have tended to understand this madness as accidental to genius, not essential. If madness is instead part of the very fabric of genius, then growth in collective genius will likely also be accompanied by growth in collective madness.

As argued throughout O.G. Rose, we have told ourselves that “autonomous rationality” was possible, which would mean “madness was accidental to genius,” that rationality entailed no “essential limits,” and believing this there was no reason to worry that growth in genius would entail problematic and unintended consequences. But we have numerous examples throughout history where genius and madness have accompanied one another, and so we should not be quick to assume that a “Singularity” will not similarly unveil an extremely mixed bag. Perhaps the resulting good will outweigh the resulting bad, but I’m not so sure.

The fact that conspiracies are spreading so rapidly today suggests that the internet is making our “collective consciousness” more like John Nash than a perfect supercomputer incapable of madness. Does the good outweigh the bad? Hard to say.

13. “Hysterical contagion” can feel like being part of something, and if “social supports” are gone, what else do we have? We’re already mad anyway.

Photo by Ahmad Odeh

14. If rationality cannot save us from “The Problem of Internally Consistent Systems” and only God can, then with the loss of God, we are all now vulnerable to Iagos finding and manipulating us, and it’s only a matter of time before some Iago appears. Then, after we’re softened up, that Iago will make a “handkerchief” appear, and then what would “unbound rationality” have us do?

15. If an Iago tells us that our neighbor is sexually abusing children, what kind of monster must we be not to at least walk over and check? After all, how do we know our neighbor isn’t abusing children? Are we so epistemically immoral? And surely someone wouldn’t make such a claim unless there was something to it, right? (And, walking next door, that’s when we might find a handkerchief…)

16. “The Grand Technology” by O.G. Rose explores the problem where we are all “Sisyphus Karamazov” now, having to live like Ivan and tortured by the suffering of humanity around us, but unable to stop it. It is “practically inevitable” that there is suffering in the world every day, and even if suffering only happens to .0001% of the world (with a population around 7.753 billion), that’s around 7,753 instances of suffering that the media could report on and, through screens and images, force us to personally encounter and feel. No doubt, ten stories would be enough to fill a whole hour of news, and I think it’s easy to imagine that the news could find ten horrible stories each day to fill our lives with a sense of dread, existentially and mentally overwhelming us.

The feeling that the world is full of evil worsens the problem with conspiracies, because it feels like someone is behind all the terrible things happening (or so we may want to believe). Also, we hate feeling like there is nothing we can do to fix the problem. Wanting to fix the world, existentially overwhelmed by a feeling of helplessness, we leap at the opportunity when conspiracies come along and suggest that we can finally do something. Additionally, if someone tells us that the government is secretly funding a ring of child sex abuse, are we really going to sit around and do nothing? Are we not at least as moral as Ivan Karamazov?

The media and “probability” assure that most of us will see and learn about incredible evils like child abuse, and it’s only a matter of time before a conspiracy comes around suggesting that people in power are trying to carry out a similar evil right now. After personally seeing the details of this evil, are we really going to sit around and do nothing? We know better.

Audio Summary

17. A “stand alone complex” is an extremely useful “mental model” for understanding conspiracies, and we have Ghost in the Shell to thank for it (the Japanese seem to have been aware of “the problem of internally consistent systems” long before us Westerners). To explain, consider the following image:

The image above is what I will call a “Traditional System With a Center.” The center and surrounding circle share the same color, ultimately all part of the same thing, but there is still a center that can be designated apart from everything else. Theoretically, if the center was removed, the whole system would fall apart; if the center was corrupted, there would be reason to think the whole system was likewise corrupt.

Now consider a “stand alone complex”:

There is no center, and there isn’t even “no center,” per se, but instead just a “negative space” that results from what encircles the center. But the “encircling” necessarily makes it “appear like” there is a center, as the FedEx logo makes it “appear like” there is an arrow (and in a sense, there is, just not “actually”). Within “the stand alone complex,” rationality seems forced to conclude there is a center, but since there isn’t, as rationality searches for that center, those outside the center can continue about their business, unhindered. In this way, a “stand alone complex” is incredibly difficult to stop, for whereas to stop traditional systems we simply need to deconstruct the center to take out the whole, here we can only deconstruct the whole, which can always reconstitute itself, because there is no center we can erase to make that reconstitution impossible. In this way, “stand alone complexes” may even be invincible: our only hope is that those “encircling” eventually just lose interest or something (“stickiness” wanes, as discussed in light of Charles Taylor’s “nova effect”), but how is that possible? Is totalitarianism necessary to “stomp it out?” How is that an improvement?
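To make the structural difference between the two diagrams concrete, here is a minimal sketch in Python (the graphs are hypothetical stand-ins, not anything from Ghost in the Shell): a hub-centered “star” falls apart when its center is removed, while a centerless “ring” stays connected no matter which single node is taken out.

```python
# A minimal sketch (hypothetical structures, purely illustrative) of why a
# "stand alone complex" is hard to dismantle: a centered system dies with its
# hub, while a centerless ring survives the removal of any single node.

def connected(nodes, edges):
    """Return True if the undirected graph (nodes, edges) is connected."""
    if not nodes:
        return True
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    seen, stack = set(), [next(iter(nodes))]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(adj[n] - seen)
    return seen == set(nodes)

def remove(node, nodes, edges):
    """Return the graph with one node (and its edges) deleted."""
    return nodes - {node}, [(a, b) for a, b in edges if node not in (a, b)]

# "Traditional System With a Center": a star graph, everything runs through node 0.
star_nodes = set(range(6))
star_edges = [(0, i) for i in range(1, 6)]

# "Stand alone complex": a ring with no center, only mutual "encircling."
ring_nodes = set(range(6))
ring_edges = [(i, (i + 1) % 6) for i in range(6)]

print(connected(*remove(0, star_nodes, star_edges)))  # False: remove the center and the star shatters
print(connected(*remove(0, ring_nodes, ring_edges)))  # True: remove any node and the ring stays whole
```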

Trailer: For more on SAC

(The work of Satoshi Kon is also useful for understanding SAC-like events, mainly Paranoia Agent.)

18. Describing John Nash, the great mathematician who struggled with schizophrenia, Sylvia Nasar stresses on numerous occasions in her biography, A Beautiful Mind, that the mental illness Nash suffered was not a result of a diminishing of rationality but instead resulted from his rationality becoming unbound. In other words, his rationality became too powerful, per se, and ceased to exist in a balance with other mental faculties. This is discussed in “Why Do Madness and Genius Like to Tango?” by O.G. Rose, but to quote Louis A. Sass, Nash underwent ‘a heightening rather than a dimming of conscious awareness, and an alienation not from reason but from emotion, instincts and the will.’¹ ‘Rather than cloudiness, confusion, and meaninglessness, [in schizophrenia] there is hyper-awareness, over-acuity, and an uncanny wakefulness. Urgent preoccupations, elaborate rationales, and ingenious theories dominate.’²

Why is this important to get straight? Because the internet is generally making us more informed, more likely to be thinking, and seems to also increase how often we need to approach the world “rationally.” If mental illness can result from too much rationality, then this all means the internet is making us mentally ill because of its virtues, not just its vices. We tend to praise the internet for helping us be informed, giving us access to mass quantities of data, and the like, while at the same time acknowledging the dangers of TikTok, social media, and other sources of addictions, self-hatred, etc. But it’s not so simple: what we consider “good” about the internet could ultimately be just as much a problem as what we consider bad.

It seems to me that the internet is making our overall “collective consciousness” something resembling the mind of John Nash. We are simultaneously hyper-brilliant and struggling with severe mental illness because of that hyper-brilliance: our collective schizophrenia is “practically” our collective brilliance. After all, what is genius but a new voice?

19. “Truths, Falsities, and Blurs” by O.G. Rose defined a “blur” as a premise which is “theoretically knowable but not practically,” and as the world becomes more complex, the number of “blurs” out there radically increases. Most conspiracies consist of “truths and blurs,” not “falsities” — a common mistake — and there is something about “blurs” which tends to drive us mad. “Blurs” are when we know something is knowable, and yet we lack the power to know it — a hard reality to live with and accept.

20. Inspired by Eric Jobe, “How Freud Unites Inception, Hannah Arendt, and QAnon” by O.G. Rose argued that “group psychology” is identical with “dream psychology,” and “dream psychology” is identical with “movie psychology.” For Freud, dreams are when our subconscious desires manifest and are “released,” which is to say dreams are when we can release ourselves. If this is the case, it is in groups that we feel “released” and “like we can do anything” (after all, we feel like we’re in a movie). Is this good or bad? Well, it depends.

When civic resources were vaster, we could find community in “groups” that were strongly tied to the communities in which they were situated, and those communities “bound” the groups in healthy ways. Now though, groups are online and unattached to any specific communities; as a result, the feeling that members can “do anything they want” is far more unleashed. And if the members think there is a global conspiracy trying to take down America, then they will by extension feel like they can stop it and moreover that they should. As already mentioned, in the past it tended to be isolated individuals who ascribed to conspiracies, so even if they thought there was a global conspiracy working against America, they wouldn’t necessarily feel like they could do anything about it. Now though, conspiracy theorists are finding groups in which they can “feel like they are in a movie” and capable of anything. And believing the world hangs in the balance, what shouldn’t they try to do (especially if their group is a “stand alone complex” and “practically invincible”)?

21. “The Authority Circle,” the dilemma that we need authorities we have reason not to trust in order to determine what to think, is an inescapable dilemma that can forever fuel “conspiratorial thinking.” But if institutions had not at least undergone “the legitimation crisis” (as described by Habermas), perhaps “conspiratorial thinking” would be bound and contained? But how can any institution maintain “legitimacy” in a world with the internet?

22. Instead of 1984 or Brave New World, it is Kafka’s The Trial.

Brian Artese — Tube o’ Theory: Amazing Lecture Series

23. The work of Dr. Zach Stein, an expert on “information warfare” and propaganda (both of which contribute to conspiratorial thinking), is extremely relevant here. According to Dr. Stein, we are entering an age where “the weapons of mass information warfare” are so strong that they are the equivalent of nukes: if nations don’t agree to simply not use them, we will destroy ourselves. Dr. Stein warns that we are causing mass and widespread insanity, which might sound strange until we understand that “insanity is the inability to identify reality,” and suddenly it’s clear the label applies. Stein warns that “information warfare” is basically always happening, but today the impacts are so constant and powerful that it’s nearly impossible to remember what life was like before constant information warfare. Stein also makes the point that our leaders are just as susceptible as we are to “information warfare,” so it is not the case that we can trust that our leaders “have the situation under control” and will handle this technology so as to avoid its negative influences. If information warfare is driving us insane, it’s also driving our leaders insane. Yes, perhaps nukes drive our leaders crazy with pressure, but that kind of crazy is very different from an insanity that robs from our leaders the ability to tell truth from falsity.

Dr. Stein suggests that telling the difference between “education” and “propaganda” is extremely difficult, and in his discussion with Jim Rutt, Dr. Stein traces out some insightful characteristics of an “educational program” compared to a “propagandist program.” For me, the discussion suggested that perhaps the only education we can rely on is a “philosophical education” (mainly learning how to think), because it almost seems like all other topics would inevitably fall back onto “relying on authority” (thus the dilemma of “The Authority Circle”) and thus make us susceptible to propaganda. The work of O.G. Rose indeed suggests that philosophical thinking is primary in a good education — the ability to think “between subjects” versus “focus on a subject,” per se — and I think this is an implication of Dr. Stein’s work.

The difference between “education” and “propaganda” for Dr. Stein is a matter of structure, not truth (a key point). We often think that x is “propaganda” if it is false (and people are claiming we should believe it), but then that means “propaganda” is always something that other people do, because by definition we must think we believe in the truth (otherwise, we wouldn’t think as we do). By extension, this also makes us (in our own eyes) immune to believing in propaganda or spreading it. And so everyone goes around spreading and participating in propaganda all while they think they are fighting propaganda with truth: it never occurs to them that we can spread propaganda by spreading truth if the structure of the information is wrong (as Dr. Stein describes with Jim Rutt). Some examples of structural problems: there’s a power hierarchy involved that the teacher wants to maintain over the student versus bringing the student up to the teacher’s level; the facts are arranged in a way that is “entirely factual” and yet misleading; and so on. Stein illuminates how propaganda wants to maintain the “asymmetrical information gap” between people versus erase it (as education seeks to do).

It would seem to me that conspiracies require “asymmetrical gaps,” for people explore and investigate conspiracies precisely because they feel like there is a “gap” that is being maintained between them and “the truth.” If this is the case and our age of “information warfare” is one where “asymmetrical gaps” are created, multiplied, and spread, then this is an age that will likely continue to be an age troubled by conspiracies.

24. If we believe there is a God who is active in history and ultimately in control, the incentive to worry about conspiracies might be lower. If “God is dead” though, how can the world run itself?

Photo by NASA

25. Generally, our world believes in rationality but not truth, and yet rationality is impossible without truth. And that just means we end up operating according to a truth we don’t realize is present, “in the background,” and that truth just happens to reflect the socioeconomic order, institutional structures, etc. of the day. In America, that generally means our rationality is shaped and “organized as itself” by Neoliberalism.

Philosophy is supposed to train us to realize that “the true isn’t the rational” and that what we believe is true determines what is rational; unfortunately, that means we can “act rationally” and still “be wrong.” Thus, philosophers are always trying to “think beyond their rationality” and the coherence their rationality presents them with, to see “the world past their worldview,” per se. Now, we can never escape a worldview, and even if we “saw the world,” we couldn’t know for sure that we did (considering Gödel), but the philosopher still knows we must try. There is a nobility in the doomed effort, and democracy is certainly doomed when citizens don’t participate in that attempt.

If we have been trained to believe “in rationality but not truth,” and if it is the case that philosophical skill is not prioritized, then we have been made susceptible to conspiratorial thinking. We are equipped with rationality and rationality alone, and if the premises of a conspiracy reach us, we have no way to keep our rationality from “pulling us into” those premises (other than, say, luck or emotion). To stress the point, if a “Pynchon Risk” comes our way, our rationality will easily pull us into it, and once we cross the “event horizon” of that black hole, per se, what can we do? Tragically, a world that doesn’t make a meaningful distinction between “the true” and “the rational” is a world whose efforts for “coherence” cannot be corrected and balanced by “correspondence.”

26. Every rationality is a rationality (relative to a truth), and that means every rationality is prone to tribalism, suggesting the desperate need for an ascent to “(greater) truth” (as distinct from rationality). Rationality, in being primarily in the business of coherence, is in the business of “walling us into a system” that we never look outside of or beyond. Once we achieve an “internally consistent system” (or “fixed belief,” as C.S. Peirce discusses), rationality just wants to “keep us being as we are now.”

The Fixation of Belief — Charles Peirce

Why? Four reasons immediately come to mind:

A. Once we achieve the existential stability of a belief we can believe in (which feels “given”), all that venturing outside our “system” into difference, new ideas, diversity, and the like can do is ruin our “existential stability.” Learning “ultimate truth” seems impossible for finite beings, so when we learn, we are mostly in the business of gaining “sufficient reason” to believe something, not “absolute certainty.” Once an “internally consistent system” provides us with “(a sense of) sufficient reason,” considering how difficult finding an “internally consistent system” can be, our brains will fight tooth and claw to keep it.

B. We are alive right now, so if the brain can convince us to stay in our current state, there’s a good chance we’ll keep being alive. On this point, the brain favors “being” versus “becoming” in general, because if we can keep “being” as we currently are, then we’ll keep “being” alive. The brain does not like risk: if we are alive now, why not just maintain our current state? What can “becoming” give us?

C. An “internally consistent system” is a fantastic foundation for a communal identity, and where there is a community that cares for us, there is protection, access to resources, and acceptance. To question the communal foundation is to not only risk losing all this, but to transform friends into enemies who use their resources against us. In general, we risk numerous relationships when we seek what is beyond the walls of our systems, and if René Girard is right about how a massive percent of our identities is a result of “mimetics,” then to lose our relationships would be for us to lose our “role models,” which is to say our “models for determining our roles and how we should act,” setting us up for great confusion and existential anxiety. We need a “truth” to organize our values (as discussed in The Conflict of Mind), but whenever we believe something by ourselves, our strength in that belief wanes profoundly, similar to how our liking of a movie falls if everyone around us dislikes it (points explored profoundly in “The Origin of Envy & Narcissism” by Psych Reviews). Considering this, to “look beyond” our system is to not only risk our standard for organizing our lives, but it is also to set ourselves up to have to believe in a truth that we’ll receive little if any social support to maintain and strengthen. Finally, if René Girard is right about the critical role of the scapegoat in the formation of society, to question our system might be for us to set ourselves up to be used as a “scapegoat.”

D. Thinking about “the truth” uses up a lot of energy, and the brain likes saving energy as much as possible. Furthermore, why use up so much energy for such a big risk to achieve a goal we cannot even know we achieved if we did (because certainty is impossible)?

Reviewing these points, perhaps we can’t be upset with our brains for “naturally” seeking to preserve and maintain our “internally consistent system” (it’s not crazy). However, this perhaps wise and practical tendency of the brain also makes us all vulnerable to falling into conspiracies, cults, and the like. Indeed, the brain is a frenemy.

The Origin of Envy & Narcissism — René Girard

27. The most popular shows over the last decade have all had “game” in the title (Hunger Games, Game of Thrones, Squid Game): what does this say about us? Do we resonate with stories that suggest all of life is a game (and, worse yet, a game we are losing)? After “the death of God,” perhaps it’s natural to think of everything as “ultimately arbitrary” and thus little better than a “game?” If this is the case, then we’re already primed to treat life like “a game,” and so when a conspiracy comes along that takes “the game of life” to the next level, gamifying everything, we’re ready. After all, most of us grew up on videogames: we know what to do.


28. The incredible piece by Reed Berkowitz called “A Game Designer’s Analysis Of QAnon: Playing with reality” is extremely useful for understanding how conspiracies form, work, and sustain themselves. I will not recite all the arguments here (please go read the masterpiece for yourself), but I will highlight a section. The writer has designed a game, and the game is being tested. It is a puzzle game, and players are trying to escape a room. They notice something on the ground; the author writes:

Apophenia is: “the tendency to perceive a connection or meaningful pattern between unrelated or random things (such as objects or ideas)”

As the participants started searching for the hidden object, on the dirt floor, were little random scraps of wood.

How could that be a problem!?

It was a problem because three of the pieces made the shape of a perfect arrow pointing right at a blank wall. It was uncanny. It had to be a clue. The investigators stopped and stared at the wall and were determined to figure out what the clue meant and they were not going one step further until they did. The whole game was derailed. Then, it got worse. Since there obviously was no clue there, the group decided the clue they were looking for was IN the wall. The collection of ordinary tools they found conveniently laying around seemed to enforce their conclusion that this was the correct direction. The arrow was pointing to the clue and the tools were how they would get to it. How obvious could it be?

I stared in horror because it all fit so well. It was better and more obvious than the clue I had hidden. I could see it. It was all random chance but I could see the connections that had been made were all completely logical. I had a crude backup plan and I used it quickly before these well-meaning players started tearing apart the basement wall with crowbars looking for clues that did not exist.

The article really is brilliant, and it is required reading on the topic of conspiracies, as is the paper titled “Conspiracy Theories” by Cass Sunstein and Adrian Vermeule.

To add to the point, “the videogame problem” described above makes it clear that “rationality always operates relative to what it is given.” The players of the videogame chose to enter into the game, and in so doing, agreed to have their rationality organized and shaped by the game. If they didn’t, there would be no way they could play the game: there would be a radical mismatch between how the game worked and what the players believed they could do. “The truth of the game” and “the rationality of the players” had to align, for otherwise there would be a resulting “disjoint” that made operation and “playing” impossible. But do you see the joke here? What would have saved the players from searching a wall needlessly would have been a breakdown of their rationality, and why in the world would the players ever believe they needed that? It seems impossible, outside the scope of the thinkable, and that means they were trapped (like Made in Abyss).

Yea…

Only a “change in the truth” beneath their rationality could have saved them, and that would have required the game designer to change something, and he could not. “God was dead,” I guess we could say. Alternatively, the players could have been saved by an act of “nonrationality,” but they had no reason to think they needed “nonrationality.” After all, the arrow on the ground was clearly an arrow.

Episode #10: Lorenzo Barberis Canonico on Neurodiversity, Collective Intelligence, and Game Theory

(For further evidence of how rationality operates “within” frameworks, please see “FFX Was Finally Broken After 20 Years By Speedrunners” and “The History Of Chrono Trigger’s Most Broken Glitches — Speedrunning Explained.” What you witness in these videos is brilliance: don’t let biases in favor of “academic subjects” cloud your judgment.)

Perhaps this “videogame scenario” makes the “possible disjoint” between truth and rationality clear, but the same point applies to ideologies and worldviews just as easily. The issue is that we don’t usually realize we are in a “game,” per se (though perhaps that is changing, as this work has suggested), which is to say we are “in a worldview.” Like the gamers who chose to play the puzzle game, we make a similar choice when we step into and/or “absorb” an ideology: we agree to have our rationality founded and organized by that “system.” And guess what? That means we are always at risk of being trapped. ‘Try to stay awake.’³

29. To be right about a conspiracy is to have our epistemological and rational capacities forever compromised, Lorenzo points out: the dopamine release is just too great. To enter into a conspiracy and prove victorious is to never escape conspiratorial thinking: if taking a Pynchon Risk proves wise, the Pynchon Risk proves fatal. But perhaps this is exactly what those in power know, and they use this truth to make themselves unstoppable. How can we be so sure they don’t?

30. Lorenzo Barberis Canonico brilliantly suggests that “skin in the game” is a way we stop conspiracies, which is to say that people won’t be so quick to join conspiracies if they have to “put money on the line” and bet they are right. When the rest of the population sees how willing/unwilling people are to put their money on their beliefs, that will dramatically impact how readily the beliefs can spread. “Loss-aversion is the only thing stronger than self-deception,” he points out. I agree, which is why his project gives me hope.
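To give Lorenzo’s “loss-aversion” point a bit of shape, here is a minimal sketch using the standard prospect-theory value function; the loss-aversion coefficient and the even-odds bet are assumptions of the sketch, not details from the episode:

```python
# A minimal sketch (illustrative, not from the discussion) of why "skin in the
# game" dampens belief-spreading: with loss aversion, an even-money bet on a
# belief only feels worth taking if you are quite confident you are right.

LAMBDA = 2.25  # assumed loss-aversion coefficient (Kahneman & Tversky's classic estimate)

def felt_value(outcome):
    """Prospect-theory-style value: losses loom larger than equivalent gains."""
    return outcome if outcome >= 0 else LAMBDA * outcome

def felt_expected_value(p_right, stake=100):
    """Subjective value of staking `stake` at even odds on a belief one
    honestly holds with probability p_right of being correct."""
    return p_right * felt_value(stake) + (1 - p_right) * felt_value(-stake)

for p in (0.50, 0.69, 0.70, 0.90):
    print(f"p(right) = {p:.2f} -> felt value of the bet = {felt_expected_value(p):+.1f}")

# The bet only "feels" positive above roughly p = LAMBDA / (1 + LAMBDA) ≈ 0.69,
# so beliefs held at merely "coherent" levels of confidence tend not to get staked.
```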

Episode #20: Lorenzo Barberis Canonico on Financial Epistemology

.

.

.

Notes

¹Allusion to Louis A. Sass, as found in A Beautiful Mind by Sylvia Nasar. New York, NY: Simon & Schuster, 1998: 18.

²Allusion to Louis A. Sass, as found in A Beautiful Mind by Sylvia Nasar. New York, NY: Simon & Schuster, 1998: 324.

³Allusion to “Up, Simba” by David Foster Wallace.

.

.

.

For more, please visit O.G. Rose.com. Also, please subscribe to our YouTube channel and follow us on Instagram and Facebook.

Written by O.G. Rose

Iowa. Broken Pencil. Allegory. Write Launch. Ponder. Pidgeonholes. W&M. Poydras. Toho. ellipsis. O:JA&L. West Trade. UNO. Pushcart. https://linktr.ee/ogrose
