Rationality is the master intellectual virtue, the one that subsumes all the others. (So if you are rational, you are intellectually virtuous, and vice versa.) But there is another intellectual virtue that is also extremely important, so much so that it also deserves a section of its own in this chapter. The virtue is objectivity.
Objectivity, like all other intellectual virtues, is part of rationality. The character trait of objectivity is a disposition to resist bias, and hence to base one's beliefs on the objective facts. The main failures of objectivity are cases where your beliefs are overly influenced by your personal interests, emotions, or desires, or by how the phenomenon in the world is related to you, as opposed to how the external world is independent of you.
For instance, when people hear about (alleged) bad behavior by politicians, their reactions are strongly influenced by which party the politician involved belongs to. When a politician from one's own party is accused of lying, of being inconsistent, or of having sexual indiscretions, we tend to minimize the accusations.
We might insist on a very high standard of proof, or try to think of excuses for the politician, or say that sexual indiscretions are not relevant to the job. But when a politician from another party is accused of similar bad behavior, we tend to just accept the accusations at face value, and perhaps even trumpet them as proof of how bad the other party is. That is a failure of objectivity: The way we evaluate the cases is determined, not by the relevant facts of the case (what the person did, what the evidence shows), but by whether we think of the politician involved as "on our side" or not.
When we are being biased (non-objective), we usually do not notice that we are doing this, nor do we actively decide to do it. It just happens automatically — e.g., we automatically, without even trying, find ourselves thinking of reasons why the person who is "on our side" shouldn't be blamed for what he did. On the other hand, when a person "on the other side" is accused of wrongdoing, no such excuses occur to us. This is to say that bias is usually unconscious (or only semi-conscious), and unintentional (or only semi-intentional). That is why it requires deliberate monitoring and effort to attain objectivity. You have to stop once in a while to ask yourself how you might be biased. If you don't, the bias will automatically happen.
…
And those emotions affect your evaluation of the argument. They make you psychologically averse to thinking about the argument from your "opponent's" point of view. (I am putting quotes around "opponent" because we often think of those we are arguing with as opponents, but we really shouldn't do so. We should think of them as fellow truth-seekers.) You do not want to see the other person's perspective, so you don't.
What do you do instead? You might misinterpret the argument: in particular, you might interpret your "opponent" as saying the stupidest thing that they could possibly be interpreted as saying, and then respond to that. You might impose an impossibly high standard of proof for every premise used in the argument, using any doubt about any premise as an excuse for completely disregarding the argument. You might devote all your effort to thinking of ways your "opponent" could be wrong, while devoting none to thinking of ways that you yourself could be wrong.
Again, these are failures of objectivity: You let your treatment of ideas and arguments be determined by your personal feelings, rather than strictly by the rational merits of the ideas and arguments.
That is an excerpt from chapter 3 of Michael Huemer’s book, Knowledge, Reality, and Value: A Mostly Common Sense Guide to Philosophy. Reading it has helped me understand similarities and differences between objectivity, bias, neutrality, open-mindedness, and dogmatism.
Later in the chapter he goes into more detail on these topics. Here’s an outline:
3.2.2. Objectivity vs. Neutrality
Objectivity is not to be confused with neutrality. Similarly, being partisan is not to be confused with being biased.
"Neutrality" is a matter of…
3.2.3. The Importance of Objectivity
Why is objectivity important? Because failures of objectivity are…
3.2.4. Attacks on Objectivity
…
3.2.5. How to Be Objective
How can we work to be more objective? There are three main steps that I recommend…
3.2.6. Open-mindedness vs. Dogmatism
…
Having read this chapter, I now think a bit differently than I did before. For example, here are some thoughts I had the other day while making dinner:
Are you biased in any way? Perhaps because you're less familiar with the past, you believe the state of the world is really bad at present? Or, perhaps because you have suffered in certain ways, you see the world through jaded eyes? Maybe you have an aversion to things you cannot control, and hence a DIY bias?
Occasionally you might get glimpses of these biases, but you soon forget about them.
If you do have biases, what can you do about them? Should you do anything about them? Perhaps your bias is beneficial to you.
We all have biases. Understanding those biases helps us think better.
I love and endorse all the principles expressed here. Thanks for sharing. Being blind to biases can be a major problem. Just a minor quibble: I believe bias can in some cases (probably not most cases) be part of a rational thought process. Huemer gives the example of how we react to a politician being accused of a scandal. But what if, in another hypothetical, the person being accused is my father, a man I know from experience to have high moral integrity? I would be biased in favor of my father, and I would have a bias against believing the accusation. Given my knowledge of the man, it would (rightly) require a high burden of proof before I would believe the accusation. I am clearly biased in favor of my father, but it would be a very rational bias. Do I have a valid critique of Huemer's comments, or am I just using words differently?