Most software engineers, architects, and developers consider themselves rational, data-driven decision makers. But what if cognitive biases, hidden in your subconscious thought processes, were influencing your decisions in ways that hurt the project you’re working on, the success of your team, or your career?
Well, that’s a topic Ian Varley, Principal Architect at Salesforce, explores in his Tech Talk, How to Overcome Cognitive Bias in Your Software Architecture. Ian heads an architecture strategy team that aims to clarify and simplify the software architecture of Salesforce.
Before we go any further, it’s important to define what we mean by the term “bias.” We’re NOT talking about a prejudice for or against something, although that’s a common usage. A cognitive bias is a systematic error in thinking that affects the decisions and judgments people make. And over the years, research in psychology has uncovered hundreds of these biases.
Heuristic Versus Algorithmic Thinking
As brilliant as humans can be, our brains don’t operate like computers. Computers use algorithms to solve problems. Now, humans can program algorithms, but we don’t generally use algorithmic thinking. Instead, we use heuristic thinking. A heuristic is a mental shortcut that allows people to solve problems and make judgments quickly and efficiently. But, because they’re really “rules of thumb” rather than complete solutions, heuristics all have “blind spots” — situations where they don’t actually yield the right (or most rational) answer. Cognitive biases are examples of heuristics gone off the rails.
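To make that distinction concrete, here’s a minimal sketch in Python (our illustration, not from Ian’s talk) using the classic coin-change problem. The greedy “largest coin first” rule is a heuristic: fast and usually right, but with a blind spot for certain coin systems. The dynamic-programming version is an algorithm: it does more work, but it’s guaranteed correct.

```python
def change_greedy(amount, coins):
    """Heuristic: always grab the largest coin that still fits."""
    picked = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            picked.append(coin)
    return picked

def change_exact(amount, coins):
    """Algorithm: dynamic programming; guaranteed minimal coin count."""
    best = {0: []}  # best[a] = shortest list of coins summing to a
    for a in range(1, amount + 1):
        options = [best[a - c] + [c] for c in coins if a - c in best]
        if options:
            best[a] = min(options, key=len)
    return best.get(amount)

coins = [1, 3, 4]
print(change_greedy(6, coins))  # [4, 1, 1] -- three coins: the heuristic's blind spot
print(change_exact(6, coins))   # [3, 3]   -- two coins: the right answer
```

For everyday US currency, the greedy shortcut happens to give the optimal answer every time, which is exactly why it feels trustworthy; the blind spot only shows up when the inputs change under you.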
From the hundreds of known cognitive biases, Ian has chosen 10 that can negatively affect software development and grouped them in three categories:
- Getting Stuff Done
- Disagreeing
- Learning Lessons
We’ll examine one from each category.
Getting Stuff Done: The Sunk Cost Fallacy
The Sunk Cost Fallacy says that when you’ve invested time and energy in something, you’re more likely to value it highly than something you haven’t invested in. It’s why we tend to keep some projects going long after we might logically have decided to move on.
For example, let’s say you’ve spent the last year creating an amazing scalable data storage system from scratch. When someone else suggests that perhaps you should stop working on it, and instead maybe contribute to an existing open source project like Apache HBase, you’re likely to (subconsciously) sabotage that possibility — say, by coming up with lots of reasons why your project is really unique and different, or why other projects are of low quality. But if the same project belonged to someone else, it’s likely that you’d be able to dispassionately judge the relative quality of the projects, and decide in favor of the collective project rather than the one that a single person dumped a lot of time into.
Disagreeing: The Common Belief Fallacy
The Common Belief Fallacy is the assumption that because everyone agrees on something, it’s more likely to be true. It isn’t, necessarily. So, for example, if everybody on the team agrees that a microservices architecture is the right one for a project, does that mean it’s definitely the right way to go? Maybe. Maybe not. That’s why the answer to nearly every question in software development is, “It depends.”
Of course, sometimes the majority opinion is absolutely reliable. If you have many experienced contributors who have thoughtfully and seriously considered an idea, the majority opinion is probably the way to go. But remember that, especially in technology, there’s an “avalanche” effect, where individuals say to themselves, “I haven’t really given this much thought, but everyone else thinks it’s good, so I’ll pile on.”
Learning Lessons: Hindsight Bias
Hindsight bias, also known as the “I knew it all along” effect, says that after something has happened, you’re much more likely to think you know why it happened. Even more insidiously, you’re likely to think you knew it would happen. Most of us are guilty of this: we look at past events as having been far more predictable than they actually were.
A good example: Common Object Request Broker Architecture (CORBA). It’s easy to look back from today’s vantage point and say it was obvious that CORBA wouldn’t work at internet scale, because its distributed transactions rested on the fragile assumption that every participating system would be highly available. It’s obvious now, in hindsight. But was it really obvious back then? No, because back then we didn’t know what we know now about how systems behave at internet scale.
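To see why that assumption was fragile, here’s a back-of-the-envelope sketch (our arithmetic, not from the talk). A distributed transaction that needs every participant up at commit time is, assuming independent failures, only as available as the product of its participants’ availabilities:

```python
def combined_availability(availabilities):
    """Probability that ALL participants are up at once,
    assuming their failures are independent."""
    total = 1.0
    for a in availabilities:
        total *= a
    return total

# Five services, each a respectable 99% available on its own:
print(combined_availability([0.99] * 5))   # ~0.951: about 1 in 20 transactions finds a participant down
# Twenty services at 99%:
print(combined_availability([0.99] * 20))  # ~0.818: nearly 1 in 5 transactions fails
```

Each service looks solid in isolation; it’s the multiplication across participants that quietly breaks at internet scale. And that’s precisely the kind of lesson that only looks obvious after the fact.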