“What are you doing here? The birthday party is next Saturday. That is not to say you are not welcome, but…” This is how one of our oldest friends welcomed my wife and me last Saturday. We had appeared at the right doorstep, but at the wrong time. Embarrassed and amused, I looked at my wife, who had assured me of the timing. We had both received the invitation email, but I had ignored it after I saw my wife put the appointment in our family calendar. When my wife does so, who am I to doubt the accuracy?
Trying to reconstruct the logic of failure, I later wondered why we gave our old friends a surprise party last Saturday. Obviously, my wife and I both made mistakes. My wife mixed up the timing – these things happen. My mistake, however, was more interesting. I believe I suffered from “authority bias,” which I will discuss in a little more detail below.
In my book, The Strategic Analysis Cycle [handbook], I describe the various ways in which humans generate knowledge. Commonly we distinguish knowledge generation by its various sources:
– authority
– habit of thought
– rationalism
– empiricism
Human knowledge is generated through authority when the source of the data that underpins the knowledge is trusted by default. When at a social function you are introduced to someone who shakes your hand and mentions their name, you take it for granted that you now know the person by name. The fact that a person introduces themselves with a name does not normally make you doubt the correctness of that data point. Upon the authority of that person, you have generated the knowledge of that person’s name.
What is true for a name may also hold true for an appointment in your family calendar, especially when your reliable and accurate wife has made the appointment. Upon her authority, I knew the birthday party’s alleged timing.
Upon reflection, I also recalled the moment when a young scientist colleague of mine had to review abstracts submitted for an upcoming scientific conference. One abstract had been submitted by a big name in chemical engineering. His abstract, however, didn’t make scientific sense. My colleague was in doubt. Who was he to think the expert could be wrong and he could be right? Confused, he approached our faculty’s own distinguished professor and shared his dilemma. The professor just smiled.
“So you think he’s wrong, right? Well, that’s correct. It is not the first time he has been off. He’s such a big name now, he thinks he can get away with it. I’m glad you don’t see it that way. Reject the abstract.”
These episodes taught me two lessons:
– trust in authority is good, but thinking for yourself is always better;
– even my wife occasionally makes mistakes.
Erik’s books, The Strategic Analysis Cycle Handbook & Toolbook, are available to buy on LID’s website.