Ross Douthat recently published a piece titled “Decadence and the Intellectuals,” which immediately got my wheels spinning (I will put direct quotes in single apostrophes). Well, okay, the publication wasn’t all that recent at this point, because I’ve been thinking about it for a month. Oh, how the days pass…
Anyway, Douthat suggested that a “great novel” is one that ‘[makes] novels seem essential to an educated person’s understanding of her country,’ and that ‘Toni Morrison might be the last ‘Great American Novelist.’ ’ I like this point, and it made me wonder if we could define “the great artists” as “those whom we cannot consider ourselves educated without experiencing.” This creates a simple test: when trying to decide if an artist is great, we should ask ourselves:
“Would I be educated if I never experienced x?”
“Would I understand the world and myself without y?”
Personally, as much as I enjoy many of the books in literary fiction today, I wouldn’t be that different a person had I never read them. But Dostoevsky? Well, that’s a different matter (I mean, who would I be without Dostoevsky?). And the worst part about it is that I wouldn’t even know what I was missing if I never read The Brothers Karamazov. Kierkegaard once said that people in despair don’t even know that they are in despair. With this in mind, perhaps there is another question we could ask to determine “the greats” in every field:
“Whom would it terrify me to think I might never have encountered, never even knowing whom I missed?”
Ross Douthat alludes to Tanner Greer and Oswald Spengler, and Spengler, as of 1914, was already declaring Tolstoy (d. 1910) and Marx (d. 1883) to be of ‘world-historical importance’:
‘Tolstoy was only four years dead when Spengler started his book; Marx was only 30 years deceased. But Spengler could state, with the full expectation that his audience would not question him, that these men belonged in global pantheon of humanity’s greatest figures.’¹
‘Is there anyone who died in the last decade you could make that sort of claim for?’² The question is left hanging, and I agree that Cormac McCarthy, though still living, will be in the discussion (along with the poets John Ashbery and Louise Glück, if we’re just sticking to America), but the point is still clear: something at least feels like it’s in decline. This could be entirely wrong, and Douthat actually goes on to argue why he is now ‘more bullish on the novel’ — I encourage you to read his piece to learn why. Personally, I want to focus on the next topic Douthat brings up.³ He writes:
‘And then there are the intellectuals, where I think Greer’s analysis is all-but-inarguable. I can’t imagine anyone making a confident claim about contemporary philosophers and religious thinkers and would-be scientists of human nature that ranks them with Nietzsche or Marx or even Freud, with William James or W.E.B. DuBois, with Kierkegaard and Newman.’⁴
Can we think of any thinkers who died over the last decade who deserve to be read for centuries? Yes, I’m sure Foucault, John Rawls, and even Richard Rorty will be, but we’re talking here about thinkers who have died since 2010. Who should be read alongside Kant in 2200? Any nominations? Perhaps Quentin Meillassoux one day, but that will depend on how he completes his still-developing project. No doubt there are still great thinkers out there — the next Hegel could be next door, after all — but it does feel like something has happened at the institutional and social level that makes it harder for us to produce great thinkers, or at least to find and recognize them. Not because people today are dumber or don’t have what it takes, but because there seem to be barriers in their way.
Douthat proceeds to argue that he doesn’t think Jordan Peterson, Slavoj Žižek, Yuval Noah Harari, etc. will be around for the long haul (‘Will many of them adorn a Great Books curriculum in 2075, if such an antiquated thing exists? I’m doubtful.’), and whether Douthat’s right or wrong is not a debate I feel equipped to entertain.⁵ What I would like to discuss instead is why there might indeed be a decline in the quality of intellectuals. If this decline doesn’t actually exist, please ignore this work, and also please don’t mistake me as saying there can’t be great minds out there of whom I’m ignorant. There easily could be, but I still think an examination of modern systems and our current zeitgeist could shed light on why an intellectual decline might be happening.
That said, Douthat argues, basically, that thinking wanes when society stops being dialectical enough, and I think there’s a lot of truth to his point. He writes:
‘Which brings us back to the question of traditionalism and dynamism, and their potential interaction: If you’ve had a cultural revolution that cleared too much ground, razed too many bastions and led to a kind of cultural debasement and forgetting, you probably need to go backward, or least turn that way for recollection, before you can hope to go forward once again.’⁶
When tradition is too strong, new ideas will be found in those opposing tradition; when progressive forces are too strong, new ideas will be found in defending tradition (please see his piece for details).⁷ This strikes me as accurate, though below I offer four more factors that might be contributing to intellectual stagnation.
I. Derrida generally halted if not ended the ancient quest for “the great system.”
Why do we consider Marx and Hegel great thinkers? Well, part of it is that they created elaborate systems, and even if we think those systems are wrong, they still built them, and the mental power it takes to produce such systems is hard to question. Consequently, we’ve found ourselves with massive networks of interlocking concepts and ideas to explore, read, and reread for centuries to come. After Derrida’s deconstruction, though, we can’t even try to build “a great system” without being laughed out of the room. But how many thinkers in the Great Books Series only wrote essays? This isn’t to say essays are bad, but there’s something to be said about the fact that a “great thinker” must be someone who can be constantly reread and interpreted anew. This seems to require length and scope, and it’s simply much easier to exhaust essays than entire books (even if those books are imperfect).
Additionally, if all a thinker does is present readings on past thinkers, what’s there to reread and reinterpret? The thinker has already done that “rereading”-work for us: all we’re supposed to do is just take in what they’ve discovered. Of course, we can disagree, but disagreeing with an interpreter is not the same as exploring a temple — which might collapse at any second (with one false premise, one false conflation, etc.). We perhaps must find a “thrill” in thinkers if they are to be great and continually explored.
Classical philosophers and theologians constructed systems, while scientists and psychologists generally write essays and are none the worse for it; the prestige of a philosopher who only writes essays, however, can be negatively impacted. This is because much of the scientific work can be done external to scientists “in the field,” and that external work can “fill the gap” missing from the scope of their written work, making their overall work still feel vast. But this doesn’t seem true for abstract thinkers: to be great and reread, philosophers and theologians seemingly must make systems, and systems are now off-limits. As a result, it will be hard to find thinkers today who can be “great” and infinitely reread — unless, that is, we want to start allowing system-creation again, or perhaps come up with a “new way” to build great systems that nevertheless takes Derrida seriously.⁸
If a thinker constructed a big system before Derrida (as Hegel did), he seems to get a pass, but afterwards? There’s no excuse for it; yet if “great systems” are the only way to be a “great thinker,” then becoming a great thinker is no longer permitted. Similarly, I think the effort for “great novels” has also been hurt by Derrida (not to say he intended such), and there’s a section in 2666 by Roberto Bolaño that captures the sentiment well. After the protagonist asked a pharmacist ‘what books he liked and what book he was reading,’ the pharmacist answered with a list of short works.⁹ This depresses the protagonist, and he thinks:
‘Now even bookish pharmacists are afraid to take on the great, imperfect, torrential works, books that blaze paths into the unknown. They choose the perfect exercises of the great masters. Or what amounts to the same thing: they watch the great masters spar, but they have no interest in real combat, when the great masters struggle against that something, that something that terrifies us all, that something that cows us and spurs us on, amid blood and mortal wounds and stench.’¹⁰
Today, market forces also work against great novels and great systems of thought, and thanks to Derrida, we can even moralize the effort not to try. Perhaps in the past market forces didn’t hurt “the great novel” so much, because people didn’t feel they had to split their time between Netflix, music, and social media in order to “keep up with culture.” Additionally, people perhaps wanted a book “they could get lost in” for a long time (since books were harder to get), whereas today we can look to movies and long shows to scratch that itch. Lastly, “great books” tend to incorporate “big ideas,” and we simply don’t have the same appetite for “big ideas” today (perhaps because we’ve gotten to a place where we feel like most “big questions” don’t have answers, and that the ones that do should be left up to the scientists). This isn’t an entirely bad change, for writers now tend to focus on “socially relevant topics” (which were overlooked in the past), but it does mean “great books” will be harder to write.
Am I conflating “great books” with “big books”? It can sound that way, and though I don’t think “size” and “greatness” necessarily correlate, I think they tend to (that it’s not mere privileging), precisely for the same reason great thinkers ostensibly must create systems versus write essays: the scope of possible interpretation, the fuel for rereading, must be vast for “greatness,” and there is a correlation (though not a causation) between size and interpretive scope.¹¹
That all said, perhaps we’re better off without “big systems” and “big novels” (it may help us avoid what I call “monotheorism”). That’s certainly possible, and it’s not an argument I’ll explore here, but if it’s true, then we’re better off for not having “great thinkers” and “great writers” like we did in the past. The concerns of Ross Douthat shouldn’t be concerns at all, and the issue is only that we must learn to live with our “loss of greatness,” realizing it’s for the best.
II. To be considered educated and permitted by institutions to pursue projects, what we think and create often must be verifiable, which is difficult and limits the range of topics that can be explored.
“Austin Farrer and the Problem of Verifiable Education” is a paper (with an audio summary) in which I argue that ‘college students today are often asked to […] refine their reactions to thinkers, to ‘iron out’ or explain the theology of Barth, for example, as opposed to explain Barth [in order] to ‘point to’ what he, Dante, Aquinas, Murdoch, and others climbed toward, what they were ‘on.’ ’¹² What I mean by this is that graduate students and professors are incentivized to become experts in the thinking of some great mind (whether Adam Smith, Karl Marx, David Hume, etc.), and though there’s certainly value in being mentored (even Aquinas started out by working diligently on the Sentences of Peter Lombard), that’s not all Aquinas did.
To be absolutely clear: I am not saying that a “Hume scholar” cannot be the next great philosopher (we cannot say that “because that person is a scholar of x, that person cannot be a great thinker”). In fact, there’s a strong argument to be made that if we don’t study the masters, we cannot become masters ourselves. The point I am making is that social and institutional structures today pressure us to stay “scholars of” forever; we are generally not empowered to transition into our own work (and become “thinkers for”).
To expand on the point, here is Section II (lifted verbatim) from “Austin Farrer and the Problem of Verifiable Education” by O.G. Rose:
When Karl Barth wrote, there was no “Barth system” against which his writing could be “verified” and approved of: his ideas had to be read on their own terms and considered in light of their own intelligibility and depth, not to the degree his writings could be judged and “verified” as accurately representing and understanding “Karl Barth.” For Karl Barth to write was for Karl Barth to “stumble ahead” with only a “vague” sense of where his “fingertips” lead him and no possibility of verifying that he headed in any constructive direction at all. If graduate students are held to this standard, they will have no choice but to only pursue that which could be verified against a preexisting system, which isn’t to say this is a bad practice, but to suggest it shouldn’t be the only game in town. If it is, there’s no telling how many “Barths” we might miss out on.
Today, it’s as if “the great minds” of the past wrote something “vague” (without anything concrete to compare it against) in order to become “concrete standards” against which graduate students today could produce work that was verifiable and thus meaningful — a strange paradox and irony. Barth was “pulled ahead” by a desire to articulate and understand something vague, and consequently he changed the world. If graduate schools will not let students be “pulled ahead” by something they cannot define ahead of time or that cannot be “verified” against a preexisting system (because otherwise it’s “meaningless”), then intellectual progress in graduate school proves circular and arguably empty. Yes, if we don’t learn the thoughts of great minds, we’ll struggle to come up with great ideas ourselves and possibly repeat old ideas, thinking they are new, so certainly we want to be well versed in great works. But if we are not allowed to climb “the ladder” of great works, only collect the rungs as opposed to study the rungs so that we can determine how best to climb them, intellectual progress will likely be stifled.
Also problematic: in order to be sure that my dissertation can be “verified” against the “Barth system,” I must know that system extremely well, and so spend an incredible amount of time studying Barth (to “know what Barth thought” as opposed to “determine the truth,” which can of course entail Barthian thought). But even if I perfectly knew the thought of Karl Barth, I could never be certain that I knew it perfectly, and so I would potentially always live in a state of anxiety, reading and rereading Barth for years. Also, if someone disagreed with my interpretation of the “Barth system,” I couldn’t be entirely sure that the person who disagreed was wrong, potentially causing anxiety. I can never entirely “verify” that my interpretation of the “Barth system” is the correct one, but if I can only get my dissertation accepted to the degree it is “verified” against the “Barth system,” then I have to attempt the impossible, and there’s no telling how many years this attempt could take. Meanwhile, the great thinkers and minds probably don’t bother, and instead just focus on “stumbling ahead” toward “something more” — they likely use their time far better.
There’s a big difference between reading Karl Barth to “verify what he really thought” (which we should realize is ultimately impossible after David Hume and Karl Popper) and reading Karl Barth in hopes of using his work to help us understand “the truth” of the world around us. Of course, if we practically don’t believe in truth anymore, then what else can we do with Barth other than memorize him, try to determine what he really thought, and so become a “Barth scholar?” Today, it’s as if we are all stuck becoming like the person who collects ladder rungs in Schopenhauer:
‘However, for the man who studies to gain insight, books and studies are merely rungs of the ladder on which he climbs to the summit of knowledge. As soon as a rung has raised him up one step, he leaves it behind. On the other hand, the many who study in order to fill their memory do not use the rungs of the ladder for climbing, but take them off and load themselves with them to take away, rejoicing at the increasing weight of the burden. They remain below forever, because they bear what should have borne them.’¹³
To be fair, I believe no graduate school program intends to make these mistakes, but they’re an inevitable consequence of an educational system organized around the impossible goal of verification. Additionally, if we aren’t considered “educated” until we can verify ourselves as educated, then we will probably pour a lot of time into that effort, which means we will have to shape our thinking in light of something that already exists (a system of thought, a book, etc.), versus shape our own thinking. After spending all this effort on the former versus the latter, by the time the system finally permits us to “do our own work,” it’ll be too late: we’ll be too old or, habituated to years of illuminating the works of others, we’ll tragically have lost the capacity to take on our own labors.
How can anyone be “a great and original thinker” if practically the only way to get our thinking validated today is against preexisting systems? The “great minds” of the past created more than verified, and if the system makes it incredibly difficult to receive tenure, to get published, to receive financial support, etc., for primarily creating, then indeed, this will be an age lacking “great minds.” It’s just basic incentives.¹⁴
III. “Big ideas” can be seen as cementing systems of power and so are avoided.
After Foucault and the Frankfurt School, academics have spent decades suggesting that especially “big ideas” can be sources of oppression in establishing “epistemes” and/or “normalities” that exclude. I agree that this can happen and is a problem that needs to be critiqued and avoided, but the unintentional consequence of this can be an existential anxiety in thinkers that can make them hesitant to pursue “big ideas” versus “smaller ideas” and/or critiques of ideas already in existence. Considering this, it shouldn’t surprise us that intellectuals now mostly focus on teaching “the history of ideas” or “reading old thinkers anew,” not wanting to risk contributing to systems of oppression. There’s safety in this, as there is safety in only positing “thought experiments” that don’t make any definite conclusions but just stimulate thinking. To be a “great thinker” though, it’s not enough to rehash the past, even if it is the case that many great thinkers were close readers of other great minds (take all the books Heidegger wrote on Kant, Nietzsche, etc.). Unfortunately, if institutions are also overly concerned about the possibility of “big ideas” causing oppression, such thinkers may face bureaucratic scrutiny, as they may also face resistance from the larger society in general.
IV. Technology is often viewed as having won the debate over ideas on what makes the world a better place.
Philosophy has focused on the question “How do we live the good life?” for centuries, and though it can offer many answers, today we tend instead to ask technology how we should live. We just don’t think that ideas can help us live better lives (in fact, to echo point three, we think they oppress), and so we don’t invest in them: instead, all the great minds of our generation focus on becoming the next tech entrepreneur. And perhaps that’s wise: when in graduate school they discover the problems and pressures of “verifiable education,” they can begin looking for a field where their own ideas will be appreciated and even purchased.
‘[…] the modern world was made not by material causes, such as coal or thrift or capital or exports or exploitation or imperialism or good property rights or even good science, all of which have been widespread in other cultures and other times. It was made by ideas from and about the bourgeoisie — by an explosion after 1800 in technical ideas and a few institutional concepts, backed by a massive ideological shift toward market-tested betterment, on a large scale at first peculiar to northwestern Europe.’¹⁵
If this is true, then we today focus on technology over ideas to make the world a better place, when it’s ideas that led to the eruption of technological progress — it’s like a good joke (in a Kafkaesque way). Ideas grow “the artifex,” and the artifex “grows the pie,” but what I mean by this is expanded on in “The Creative Concord” by O.G. Rose.
Well, even if there’s evidence to believe ideas have made the modern world, that doesn’t seem widely believed, and so the social and cultural incentives to become “a great thinker” are low. Again, why not just be a great inventor? We can philosophize all we like after our first Fortune 500 company…
Closing Remarks on Aesthetics
Though it is all speculation, if a new “great thinker” emerges who is worthy of being in a “Great Books Series” come 2075, I believe it will be a thinker from the realm of aesthetics. Why? Because philosophers have mostly given up on the field. Elaine Scarry does great work, and I like Arthur Pontynen, but now that Roger Scruton is no longer with us, the field is bare. Why could this neglect make the field a source of greatness? To start, institutional interest and support seem weaker, which means that if we try to do work in aesthetics, we’re likely aware we won’t receive institutional glory. Something else must be calling us, something perhaps like what called “the great minds” of the past.
I don’t think we’ve really begun to wrestle with Balthasar, and so even a student who was pressured by an institution to “verify Balthasar” would still have a good chance of becoming an original thinker. Equipped with unused tools and ideas, when that student did finally do his or her own work, that work would easily prove new. Additionally, the number of modern thinkers in aesthetics is limited, so studying aesthetics will require us to look to the past, which means we will likely cultivate within ourselves the dialectic of “tradition versus progress” that Douthat, as noted at the start, believes assists progress.
The lack of institutional interest in aesthetics causes a kind of “self-selection” where those who enter the field are probably unique, and in that uniqueness more likely to seek “big ideas” and forgo institutional rewards. Aesthetics also seems uniquely “tied” to big questions which feed into ever-bigger questions: from asking “What is beauty?” we quickly get into questions of truth and the character needed to experience beauty, which sends us into the realm of ethics. Ethics leads to politics, and politics into just about everything.
Though I can imagine asking “What is true?” without also asking “What is good?” and “What is beautiful?” I find it harder to imagine asking “What is good?” without also asking “What is true?” But I find it most difficult of all to imagine asking “What is beautiful?” without also asking “What is true?” and “What is good?” — the aesthetic question strikes me as more readily entailing a vast multitude of philosophical questions, which necessarily creates dialectical tension, cross-pollination, and other conditions which increase the likelihood of creativity and original thought. This is because it is difficult to think of something beautiful that is ultimately false (as opposed to just “pretty”), as it is difficult to imagine experiencing beauty as undesirable (elaborations on this can be found in “On Beauty” by O.G. Rose). I’m not saying it’s impossible for something beautiful to be evil or false (though I admittedly doubt it), but I am saying that the topic of beauty more naturally leads into questions of truth and goodness than questions of truth and goodness lead to the topic of beauty. The topic of beauty just seems fuller.
If the number of thinkers who pursue aesthetics is more limited, then those who do so will be more forced to “think for themselves” than their contemporaries. They won’t have as many models to mimic, which, though it could be problematic and result in people getting lost down dead ends, could also be beneficial in that it forces people to think up new ideas. Where there is a lack of institutional support, either madness or conviction tends to push a person through.
Lastly, I think deconstruction may have run its course: as David Foster Wallace believed irony did the work it needed to do and that a time had come for a “new sincerity,” so I think we’ve done the needed work of “putting objectivity back in its proper place” but now risk going too far and destroying it entirely. This could make us susceptible to what Kierkegaard called “the infinite absolute negativity” (borrowing from Hegel); as explored in “The Trance of Believability” by O.G. Rose, what I mean by this is outlined here:
If I understand Kierkegaard correctly, I associate the IAN with “eternal regression” and the idea that irony can always ironize irony, ironize ironizing irony, etc., as cynicism can be cynical about cynicism, cynical about cynical cynicism, etc., as anti-politicians can be against anti-politicians, against anti-politicians who are against anti-politicians, etc. and so on. Like Nietzsche’s concern that we had ‘unchained the earth from its sun,’ Kierkegaard warns that if irony is “unchained” from constructive criticism and becomes nihilistic and deconstructive, there is no stopping it…
Now, to give him credit, Derrida himself was adamant that deconstruction shouldn’t become nihilistic and should stay “affirmative,” but I fear the followers of Derrida have not been able to keep the master’s wishes. And unfortunately, I’m of the opinion that the world is often more shaped by the followers of thinkers than by the original thinkers themselves.
If it is true that deconstruction (and even Postmodernity in general) has reached the end of its road, then there will be a search for a new philosophical movement to take up the mantle. Perhaps that could be the “speculative realism” of Quentin Meillassoux, but perhaps the next movement is yet to arrive. When people go to search for it, I think they’ll find aesthetics a field in need of tilling. Additionally, I think beauty could be uniquely positioned to put deconstruction “in its rightful place.”
Why? Well, beauty feels higher than subjectivity: it makes us feel subjective not by saying that there is no such thing as objectivity and that thus we must be subjective, but by suggesting that objectivity is “above us,” and that thus we must be subjective because we in ourselves lack something that is needed to perceive the world in its fullness. There’s reason to think we can’t “lack something” that doesn’t exist, and so beauty transforms the feeling “objectivity is not here” from “thus, objectivity is nothing” to “I can try to go to wherever objectivity is” (the conflation of “lack” and “nothing” has been extremely consequential, as discussed in “Lacks Are Not Nothing” by O.G. Rose). Even if I don’t reach the destination or the destination proves unreachable, the way beauty changes my orientation to life stands in stark contrast to the influences of deconstruction, which is a feeling for which I think people are looking. Seeing as I think aesthetics can uniquely provide that experience, I think people will harvest aesthetics passionately. Where there is passion, I think there also tends to be creativity.
At the same time, as discussed in “Beauty Saves” by O.G. Rose, “the sense of objectivity” which beauty provides is not the cold, hard, and oppressive absolutism begotten by the Enlightenment (which Derrida was right to deconstruct); instead, it is an “open objectivity” that makes space for diversity and that resists being “lowered down” into our terms. Today, after Derrida, just any “sense of objectivity” will not do, for we do not have an appetite for exclusivity (and for good reason): we need an objectivity that feels more like “solid ground” than “solid walls.” This is where beauty is particularly advantageous, for it is primarily an experience versus an idea, and that means the full meaning of it cannot be fully captured by thought. Our “sense of objectivity” comes from our experience, and everything after that is subjective interpretation, which can play a role, but not with the same “divisive” authority as “Enlightenment objectivity” tends to entail (as described by Horkheimer and Adorno in Dialectic of Enlightenment). That doesn’t mean we can’t debate our aesthetic ideas, but it does mean no one will be sent to eternal torment if their ideas are wrong, and furthermore no one will have a “moral right” to force others to share their beliefs. At the same time, the very experience of beauty gives us reason to be humble. In this way, beauty can grant us a “reason to believe” in objectivity without the dogma.
Beauty makes “an objective absolute” seem plausible, as it also makes it plausible to believe that the absolute is good (after all, it’s beautiful). No, we can’t be certain that what we experience in beauty is a sign of “something higher” versus some kind of trick of our brains (certainty is mostly impossible, after all), but we can be confident. After years of Derrida, I think this will be a welcome relief, and in 2075, we will gladly consider “great” the thinker who helps us reach that oasis.
¹See “Decadence and the Intellectuals” by Ross Douthat.
²Ibid.
³Ibid.
⁴Ibid.
⁵Ibid.
⁶Ibid.
⁷But this requires sociological “givens” to meaningfully define “tradition,” and if it is true (as argued in “Belonging Again” by O.G. Rose) that sociological “givens” no longer function very well, then tradition will prove weak and possibly too weak to operate dialectically against progress. As a result, if Douthat is correct, intellectual quality will drop.
⁸After Derrida, I think the thinker must make systems through essays that, in the end, suddenly transform into chapters of a bigger arc, “as if” they were always chapters. Derrida was right to be skeptical of extensive chains of deductions, but essays can help contain “the chains,” only bringing them together if something “emerges” from out of the internally consistent points of the essays. If not, the essays can still offer value: if they never “flip” into chapters, it is not the case that all is lost.
⁹Bolaño, Roberto. 2666. Translated by Natasha Wimmer. New York, NY: Picador, 2008: 227.
¹⁰Bolaño, Roberto. 2666. Translated by Natasha Wimmer. New York, NY: Picador, 2008: 227.
¹¹I’ve actually come to think that “the scope of possible interpretation” may grow disproportionately, not just proportionally, with book/system size. In other words, if the range of interpretation is 3 at 100 pages, then at 200 pages the range might be 8, at 300 pages 13, and so on, versus 3 at 100 pages, 6 at 200, 9 at 300, etc.
¹²See “Austin Farrer and the Problem of Verifiable Education” by O.G. Rose.
¹³Schopenhauer, Arthur. The World as Will and Representation, Volume II. Translated by E.F.J. Payne. New York, NY: Dover Publications, 1966: 80.
¹⁴As thinkers like Hegel and Heidegger were “grandfathered in” before Derrida made “big systems” laughable, so too older thinkers like Charles Taylor, Judith Butler, Slavoj Žižek, and Jürgen Habermas may have gotten “grandfathered in” and allowed to primarily “do their own work” before the system of “verifiable education” spread. Frankly, I find it hard to imagine many more thinkers like them emerging in the world today. And even these thinkers haven’t constructed systems like Hegel did; in fact, most of Žižek’s work is just a reclaiming of Hegel in terms of Lacan (not that I dislike Žižek).
¹⁵Allusion to “The Great Enrichment Was Built on Ideas, Not Capital” by Deirdre McCloskey.