I love and endorse all the principles expressed here. Thanks for sharing. Being blind to biases can be a major problem. Just a minor quibble: I believe bias can in some cases (probably not most cases) be part of a rational thought process. Huemer gives the example of how we react to a politician being accused of a scandal. But what if, in another hypothetical, the person being accused is my father, a man I know from experience to be of high moral integrity? I would be biased in favor of my father, and I would have a bias against believing the accusation. Given my knowledge of the man, it would (rightly) require a high burden of proof before I would believe the accusation. I am clearly biased in favor of my father, but it would be a very rational bias. Do I have a valid critique of Huemer's comments, or am I just using words differently?
Yes, I agree with your example, but it really depends on the evidence. If it were just an accusation, sure, you're going to side with your father; but if the evidence begins to mount against him, your bias will come to light. Others will start to side with the evidence, but you will stick with him until the evidence is beyond a reasonable doubt for you. That's really what we're talking about here. There has to be more than just an accusation to outweigh the bias. There has to be evidence.
Are there cases where we might deliberately choose to be biased, knowing that we're doing so, even in the face of significant counter-evidence? Can you think of any examples?
Notice that this is a bit different from what Huemer is trying to teach us about, i.e., basic bias. For example, let's assume that Huemer is an expert on bias and that he has a special power to be aware of all of his biases. But one day, he decides that he wants to indulge a particular bias in order to obtain a certain outcome in his life. We might call this purposeful bias or enlightened bias. Why might he do this?
I think what I am describing is the Bayesian reasoning model. We start with a prior and update as we encounter new information. If we build up enough information confirming a certain point of view, then we can say we have a “strong prior” (for example, that my father is a moral man of high integrity). If there is a strong prior, then it would require very strong contradictory evidence before a Bayesian reasoner would update to the new point of view. This is identical to saying the Bayesian reasoner has a bias toward the former view. My point is just that such a bias is not irrational or un-objective. In fact, bias can (and should!) be part of a rational process of knowledge-seeking, so long as we understand our biases and respond to new information in a way that deliberately takes our existing biases (priors) into account.
(I could add that it is also important that we don’t build up our priors based on pure emotion or other logically fallacious sources, but I think that goes without saying.)
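To make the point concrete, here is a minimal sketch in Python of Bayes' rule in odds form (the specific numbers are hypothetical, chosen purely for illustration, not anything from Huemer): with a weak prior, an accusation alone moves us a lot; with a strong prior built from years of experience, the same evidence barely moves us at all.

```python
def posterior(prior_guilty, likelihood_ratio):
    """Bayes' rule in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior_guilty / (1 - prior_guilty)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# Likelihood ratio of the evidence: P(accusation | guilty) / P(accusation | innocent).
# Suppose (hypothetically) an accusation is 5x more likely to arise if it is true.
lr = 5.0

# Weak prior (a stranger: 50/50 before the accusation):
print(posterior(0.50, lr))   # ~0.83 -- the accusation alone is fairly persuasive

# Strong prior (my father: experience says 1-in-1000 he would do this):
print(posterior(0.001, lr))  # ~0.005 -- barely moved; much stronger evidence needed
```

The "bias" is just the prior showing up in the arithmetic: the reasoner with the strong prior is responding rationally to the very same evidence, only from a different starting point.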