In his famous essay “The Ethics of Belief,” W.K. Clifford argued that if a shipowner allowed others to sail on a ship that the owner knew was unseaworthy, even if the passengers arrived at their destination successfully and unharmed, the owner of the vessel would still be guilty of immorality. When we know something is true and disregard it, or when we believe in something without sufficient evidence, according to Clifford, we act immorally. For Clifford, all of us have a burden of “epistemic responsibility” that we must bear well; otherwise, we fail to live the moral life.
Clifford’s overarching idea is immensely important, and I would like to expand on it here to claim that a failure to “think well” is an example of epistemic irresponsibility and immorality. If I avoid evidence that could counter my worldview, if I fail to try to understand fully those I disagree with (and instead stereotype or misrepresent them), if I only read books I agree with, and so on, I am epistemically immoral and irresponsible (and a threat to Pluralism and the Habermasian project, as discussed throughout the works of O.G. Rose). In such circumstances, I fail to “think well,” for a good thinker wouldn’t avoid ideas that could threaten his or her worldview, wouldn’t misrepresent disagreeable ideas he or she didn’t like, and so on. In such circumstances, I would be guilty of the trespasses Clifford wrote to stop.
That said, we don’t always realize when we misrepresent those we disagree with, overlook ideas that could prove us wrong, and so on — the ways we “ideology preserve” are sometimes subconscious, unintentional, and subtle. However, epistemic ethics demands that we try to “figure ourselves out ever-better,” per se, and we have Clifford to thank for that imperative. In a world where the temptation to be epistemically irresponsible seems greater (thanks mostly to our new technologies), even though we will always be “epistemically (im)moral” to some degree, if we fail to always work to “figure ourselves out ever-better,” our Pluralistic Age could be one of misery and grave consequence.
Thinking entails responsibilities, but it isn’t always clear what constitutes those responsibilities. Clifford personally used his critique to attack religion, believing religion was epistemologically irresponsible by definition and hence innately immoral. William James famously took issue with this critique, but the point I would like to emphasize here is that what Clifford considered intellectually immoral isn’t shared by all. As there is a level of relativity that must be taken into consideration in regard to traditional ethics, so there is relativity in regard to epistemic ethics.
In “(Im)morality” by O.G. Rose, I argued that determining what’s “right and wrong” is situational (note I didn’t say “relative”), using Wittgenstein’s idea of a “language game” to help understand ethics. The paper also made clear my great skepticism of Ethics classes, but that shouldn’t be conflated with a disdain for all ethics (though I’m admittedly skeptical of general ethics). Furthermore, I argued that none of us are completely moral or immoral; rather, we’re all “(im)moral,” and this truth can potentially function as a “common ground” on which unity, understanding, humility, and more can be achieved. Similarly, when it comes to epistemic ethics, we’re all “(im)moral.”
If I am epistemically responsible in regard to gender, it doesn’t necessarily follow that I’m epistemically responsible relative to American history or politics; if I’m epistemically responsible relative to how my mother feels, it doesn’t necessarily follow that I’m epistemically moral relative to my father. It is impossible for a person to know everything, and hence it isn’t possible for anyone to be entirely responsible and epistemically moral. We all fall short. Furthermore, because of “the phenomenology of (true) ignorance,” the “indestructibility of the map,” the difficulty of “learning to speak,” and other dimensions of epistemic life that are described throughout the works of O.G. Rose, it isn’t possible for any human being to be perfect all the time. Again, we all fall short: we’re all “epistemically (im)moral.”
That said, as there is a moral imperative for us to work to be increasingly moral though we will always be “(im)moral,” so there is a moral imperative for us to become increasingly responsible and epistemically moral. I do not believe this means we should all become positivists (as Clifford seems to have thought), but rather that we must all work to ever-improve the life of our minds. This doesn’t simply mean we need to become smarter, if by “smarter” one means “memorize more information.” A lack of data isn’t the problem, but rather a lack of “tools” to sort through, understand, and be critical about data. Part of the problem is precisely that we encounter too much data, thanks to our modern technologies, while lacking the epistemic tools to handle it. Yet ironically, as we absorb more data, we feel more equipped to handle that information, and so open ourselves up to receive even more.
If we exist, we will exist around others; hence, ethics aren’t optional for us; likewise, if we live in a society, epistemic ethics are also necessary. However, as argued in “(Im)morality,” since what constitutes being ethical is relative to the situation and its corresponding “language game,” per se, there isn’t a specific right answer for what in particular we should think for every situation, every time. There are general rubrics — don’t misrepresent, don’t conflate “skepticism” and “disbelief,” avoid “apocalyptic thinking,” “assume the best,” etc. — but that’s all, and considering this, philosophy is necessary for epistemic morality, for it is through philosophy that we learn “the art of thinking” (though I don’t mean to imply that philosophy is the only way to incubate “abstract reasoning”). Knowing how to think, there is a better chance we will know what to do in the situations we find ourselves in and with the data we absorb; lacking training, there is a much higher likelihood for problems.
“Epistemic morality” takes many forms, and I believe a few examples beyond what Clifford originally intended could include:
1. Representing ideas we disagree with accurately.
2. Communicating an idea as clearly as possible.
3. Marketing and presenting ideas in a way that neither turns people off from those ideas nor makes people zealous about them.
4. Learning to listen.
5. Learning self-skepticism.
6. Accepting that we can live happily without fully understanding much of what we believe.
7. Being open to change.
8. Not rejecting an idea because it’s hard to hear.
9. “Taking on” the case of those we disagree with to understand evidence, not just our case.
10. Developing empathy: the capacity to think/feel about the world through the mind/heart of another.
This is a very short list intended only to highlight ways in which we can cultivate epistemic responsibility. Ultimately, what we must all strive to achieve is a kind of “epistemic character,” coming to act epistemically responsibly without thinking about it as a matter of duty. According to James K.A. Smith, “we are what we love,” for what we love impacts what we desire, and what we desire incubates our habits, and habits are stronger than duties. Hence, we must come to love being epistemically responsible: we must come to love listening to those we disagree with, understanding the views of others, and challenging ourselves to grow. If we don’t — if we only do it out of a sense of Kantian duty, for example — we will only be changed in appearance, not in heart. And if our hearts don’t change, ultimately, our minds won’t change either.
The idea that there is an ethical imperative to cultivate the life of the mind is valuable, and we have Clifford to thank for it. We generally spend a good deal of effort making sure we don’t accidentally hit someone with our car, checking that we don’t miss an item in the self-checkout line at the store, avoiding offending others, and so on. Generally, we all try to be moral, but there being no widely accepted category of epistemic responsibility or “thought ethics,” we don’t spend nearly as much effort making sure we avoid confirmation bias, that we don’t make “strawmen” of people we disagree with, that we don’t only look for evidence that we are right, and so on. Where there is a lack of epistemic responsibility, there is no pressure to improve our intellectual abilities, to avoid logical fallacies, and to train our minds in general. Ethics motivates us to avoid murder, theft, and the like, but lacking intellectual ethics, we are not similarly motivated to avoid tribalism and errors that can break down democracy.
In a world where I believe our technologies tempt us to be increasingly irresponsible and epistemically immoral (and where bad thinking can spread like wildfire), Clifford’s thinking can help us regulate ourselves in the same way that the thought that lying is immoral can contribute to our refraining from it. It isn’t always obvious how this is accomplished (often being situational), and if we think it is, that notion in and of itself is an example of epistemic irresponsibility and arrogance. Yet regardless of how hard we try, we will always be to some degree “epistemically (im)moral,” and considering this, we should not only be kind to ourselves about our failures, but also humble toward others, especially those whose “epistemic immorality” is obvious to us. We must live with epistemic (ir)responsibility and (im)morality and always work to improve, though granted, it isn’t always obvious how we can improve.
If our primary concern is correctness rather than epistemic responsibility, then there will always be reason to believe that basically 50% of the country is in the right (given the breakdown of political affiliations), so though things might be bad, they’re never that bad. However, if epistemic responsibility is a concern of ours, and we realize that just because someone happens to be correct, it doesn’t follow that the person is necessarily epistemically responsible about being correct, we can start seeing America very differently. Suddenly, America’s situation can be remarkably dire, for it would be miraculous if even 10% of the country was epistemically responsible. If it is true (as I believe) that epistemic responsibility begets humility, a willingness to listen and talk with people we don’t agree with, a defense against conspiracies, and the like, then a society that is only 10% epistemically responsible is a society in which 90% will lack humility, a willingness to listen, etc. But has the majority ever been epistemically responsible? Probably not, which suggests why history so often seems to repeat itself.
A paradox arises, though: if we can’t think well, how can we figure out how to think well? We lack the means to accomplish the end. Again, a hope of my essays is to help provide some tools, but one could argue that those who struggle with thinking will struggle to understand how to use those tools. Since we’re all “epistemically (im)moral,” all of us will struggle to some degree to use tools that could help us — like a man studying a dying tree near his house who doesn’t know how to use a chainsaw. And indeed, all of us will struggle, and perhaps nothing will come of it. But on the other hand, perhaps something will.