Today’s post on ethics is written by Alonzo Fyfe of Atheist Ethicist. (Keep in mind that questions of applied ethics are complicated and I do not necessarily agree with Fyfe’s moral calculations.)
One of the facts of life that always disturbs me is how little I know, no matter how much work I put into changing that.
Often, I see this recognition of ignorance and fallibility as a handicap.
I constantly hear and read people who pretend to a level of certainty they cannot possibly justify. However, they are certain they are right, so they can step in front of a camera and assert their beliefs with certainty. People listen to them and absorb that certainty. “I am certain that I am right because Glenn Beck said it and he is certain he is right.”
On the other side of the equation, you have people who think about certain claims and adopt opinions on those issues, but who are constantly questioning themselves. They ask, “What if I am wrong? What if I am leading people down the wrong track? What else do I need to know that I haven’t already studied?”
For example, what does it take to render an informed opinion on the deficit? The war in Libya? The safety of nuclear power? Global warming?
The scientist (and any morally responsible speaker) says, “The evidence so far suggests to me that ‘P’, but, of course, I could be wrong.”
He then stands against the person who confidently asserts, “I am absolutely certain, beyond any possibility of error that ‘not-P’.”
The public then weighs the first person’s hesitancy in asserting P against the second person’s certainty of not-P. Many conclude that certainty trumps uncertainty. Therefore, they adopt not-P. At which point many also adopt the proponent’s arrogant contempt for anybody who would dare assert the obviously false claim, P.
I have written before about how we simply do not have the time and resources to hold all of our beliefs up to the light of reason. By necessity, we have to take shortcuts that are fallible but less resource-intensive. This tendency to treat degree of certainty as indicative of truth may well be one of those “shortcuts.”
Yet the possibility that this type of explanation exists doesn’t eliminate the problems it causes. It doesn’t change the fact that we have a culture that adopts a lot of fictions arrogantly asserted as fact, while failing to adopt facts put forward under the qualification that they are the best-supported positions given the evidence available to date.
But what is the remedy to this? Is it for the people in the second group to pretend to a certainty they know is unfounded?
Besides being dishonest, it requires shouldering a tremendous amount of responsibility. Pretending to have certainty as a way of convincing others does not simply erase the moral responsibilities that come from the possibility of error.
This touches on a discussion I often hear that blames scientists for scientific illiteracy in certain countries and among certain populations. The accusation is that scientists are poor communicators, and that they should become better at that craft.
However, it seems that this accusation should come with evidence of what it takes to communicate effectively. What if successful communication, all else being equal, means adopting a pretense of certainty, name-calling and making derogatory overgeneralizations against those who disagree, and promoting an ‘us’ (scientist) versus ‘them’ (faithist) tribalism?
What if failure to do these things, all else being equal, means your claims will be adopted only by a small percentage of the population?
These are, themselves, scientific questions. Yet they are questions about which a lot of people boldly assert that they know the answer without any actual investigation into the subject matter.
Recently, I have been trying to write posts that pretend to a level of certainty I cannot justify. I have tried to do so out of the sense that admitting uncertainty is a social flag that says, “Don’t listen to him. Listen to the critic who arrogantly insists that he cannot be wrong.” However, I frequently come to a point where I cannot justify a claim made with certainty, and I have trouble getting past that point and going ahead with the claim.
This is where I think, for many people, faith enters the picture. Faith is a psychological tool that gets an agent past that point – allowing him to remain blind to such questions and make his claims with absolute certainty. It also allows him to remain blind to the potential harms that come from being wrong, and to the moral responsibilities that those harms give rise to.
Here, faith and arrogant certainty become almost indistinguishable. ‘Faith’ is the psychological tool that makes arrogant certainty possible, and arrogant certainty is the psychological tool that tempts people to see the ‘faith’ response as useful.
Well, if there is any truth in this, it would suggest that the route to take is not to berate scientists for being poor communicators. It is, instead, to berate those people for whom responsible scientific communication is ineffective. Failing to be persuaded by scientific arguments, to the degree of certainty (and uncertainty) that science allows, is indicative of a morally irresponsible, arrogant certainty for which “faith” is often used as an excuse.
Maybe it is not such a bad thing to recognize our human fallibility and be a bit hesitant at times in the conclusions we assert. Maybe the fault lies, instead, in not being such a person.
- Alonzo Fyfe