We have observed how one can use irrefutable facts to reach precisely the wrong conclusion.  It happens when we assume away real personal biases, become emotionally attached to models and narratives, or are blinded by a delusional sense of moral superiority.

But can the opposite be true? Can one use fuzzy ideas, even if filled with rational and emotional bias, to reach correct conclusions or solutions that work remarkably well?

I suppose the answer is yes. The difference lies in the distinction between reasoning and rationalization.  The first case is often an act of rationalization, in which irrefutable facts are selected to reach the obvious conclusion, which is sometimes wrong.  This is most apparent when we encounter an overwhelming consensus on a problem with a large number of complex variables. Being rational can be the opposite of rationalizing.

In our obsession with numerical data to support policy decisions, we ignore or obscure fuzzy ideas that are yielding superior results. It may pay more to follow a successful solution we do not understand than to insist on a data-driven prescription we delude ourselves into accepting as an irrefutable outcome.

Philosophies sometimes form after the fact to explain how and why the successful ideas worked.

In my experience, bad decisions have common starting points.  The first is an arbitrary deadline.  In the modern age, business moves at incredible speed and delays can be deadly.  But not every decision is a life-or-death decision, and this drive to make every decision existential derives from a delusional sense of urgency, often to serve a leader’s fragile ego. In politics, ruling parties fear the ever-shifting winds of election cycles that will close the window of opportunity to pass important regulations and laws.  It is worth noting how many small problems seem to solve themselves if given time.

Easily reversible decisions should be made relatively quickly, but irreversible decisions should be well thought out, and can benefit from diverse input, or from the single perspective of a true visionary leader.  Such rare perspective usually comes from a period of experience and study that does not necessarily correlate with age.  There is a distinct difference between ten years of experience and one year of experience repeated ten times.

Fear and greed can incite speed and recklessness.

Another enabler of bad decisions is moral supremacy.  When we think we are on a moral mission, we find it easy to dismiss dissenting voices.  Worse, when we demonize an opposition, especially in light of their recent failure, we often fail to learn the lessons of that failure and stand to repeat it. It is much easier to discredit and criticize others as incompetent or evil than to recognize and understand the thought process and views that led to the moment.

Moral supremacy may lead to immoral outcomes. To quote C.S. Lewis,

“Of all tyrannies, a tyranny sincerely exercised for the good of its victims may be the most oppressive. It would be better to live under robber barons than under omnipotent moral busybodies. The robber baron’s cruelty may sometimes sleep, his cupidity may at some point be satiated; but those who torment us for our own good will torment us without end for they do so with the approval of their own conscience.”

There is evil in the world, and it usually persists until a greater force overwhelms it.  But we excessively demonize others among us who share many of the same values.  Russell Jacoby, in Bloodlust, described this as the narcissism of minor differences.  Examples are the Protestant/Catholic conflicts in Northern Ireland, the Sunni/Shiite conflicts in the Middle East, and the partisan conflicts in this country.

We may be unable to lose the human frailties that affect our conclusions and decisions.  We can only try to overcome them with the virtues we also possess, starting with humility.
