Decision Making: how to make better decisions and predictions
December 29, 2019
This is the second of two posts on Decision Making and Team Organisation and Behaviours.
Here I'm interested in one-off, complex decisions like: should we invest in building product X? Or: is it a good idea to merge team A with team B? So I've been reading evidence-based research on the tools and techniques that lead to better predictions and decision making. I found that it is possible to get better; here's how:
General Rules from research literature:
Routine decisions should make use of quantitative models and/or checklists. If you're going to make a repeat judgement every month (or every hour), you should have an automated framework for it. For everything else... read on:
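As an illustration of what such a framework could look like, here is a toy checklist in Python for a recurring triage decision. The questions, weights, and threshold are invented for this sketch and are not taken from the research:

```python
# Toy sketch of a checklist-style model for a routine, repeated judgement.
# The questions, weights, and escalation threshold are invented for illustration.

ROUTINE_CHECKLIST = [
    ("Is a production system down?", 5),
    ("Is the requester an existing customer?", 2),
    ("Can it wait until the next planning cycle?", -3),
]

def score_request(answers: dict) -> int:
    """Sum the weights of every checklist question answered 'yes'."""
    return sum(weight for question, weight in ROUTINE_CHECKLIST if answers.get(question))

def should_escalate(answers: dict, threshold: int = 4) -> bool:
    """Apply the same quantitative rule every time, instead of re-judging from scratch."""
    return score_request(answers) >= threshold

print(should_escalate({"Is a production system down?": True}))  # True
```

The value isn't in these particular numbers; it's that the same explicit rule is applied every time instead of a fresh gut call.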
Spend time to clarify the problem and make sure you continuously gather information and adjust your views. Keep iterating on the framework and criteria for the decision. Both deliberation time and willingness to change one's mind correlate positively with more accurate judgements.
Decompose the decision and make your criteria explicit. Take the example of choosing between two jobs: don't just ponder the pros and cons of each. Write a list of what matters in a job (e.g. compensation, commute time, fulfilment, chance of success, etc.). Only after you know what you care about can you decide.
If comparing alternatives, use a baseline. Again with the job example: it's hard to score "commute time" in a job from 1 to 5, but it's easier to start with what you have now and ask whether it would get better or worse. So give the baseline a score of "0" and rate the others relative to it (positive or negative).
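To make the last two points concrete, here is a minimal sketch of the job example in Python. The criteria, weights, and relative scores are invented; the point is only the mechanics of scoring each alternative against the current job as a baseline and ranking the results:

```python
# Explicit criteria with weights, scored relative to the current job
# (the baseline, which scores 0 on everything).
# Criteria, weights, and scores are invented for illustration.

CRITERIA_WEIGHTS = {
    "compensation": 3,
    "commute_time": 2,
    "fulfilment": 3,
    "chance_of_success": 2,
}

ALTERNATIVES = {
    "current job (baseline)": {c: 0 for c in CRITERIA_WEIGHTS},
    "job A": {"compensation": +2, "commute_time": -1, "fulfilment": +1, "chance_of_success": 0},
    "job B": {"compensation": +1, "commute_time": +2, "fulfilment": -1, "chance_of_success": +1},
}

def weighted_score(scores: dict) -> int:
    # Positive scores mean "better than what I have now", negative mean worse.
    return sum(CRITERIA_WEIGHTS[criterion] * value for criterion, value in scores.items())

# Rank the alternatives by total weighted score.
for name, scores in sorted(ALTERNATIVES.items(), key=lambda item: -weighted_score(item[1])):
    print(f"{name}: {weighted_score(scores):+d}")
```

Even if you never fully trust the final number, writing the table down forces the criteria and the trade-offs into the open.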
Answer the following questions: "What don't I know?", "To make this decision, what would I like to know?" and "What are our assumptions?". This widens the scope of information gathering and calibrates our confidence.
Explicitly consider multiple alternatives and outcomes. Rank order alternatives. Research shows that being explicit about alternatives is a strong predictor of good decisions. In my professional and personal life I see many people making decisions on the basis of "Is X good?" or "Should I do Y?". The question, of course, is always: "compared to what?!". Always be explicit about alternatives and think of opportunity costs.
Use Probabilistic Reasoning, put a number on things. For three reasons: (1) It can be done and it works: a lot of research has shown that putting probabilities on events leads to more accurate judgements. (2) It avoids ambiguity: "could/might/may" mean different things to different people. (3) It makes it easier to aggregate judgements from multiple people, compared to aggregating words (what's 2 maybes + 1 perhaps + 1 probably?).
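As a small illustration of point (3): numeric estimates are trivial to aggregate, hedge words are not. The names and numbers below are made up:

```python
# Four probability estimates are easy to combine; four hedge words are not.
# Names and numbers are invented for illustration.
from statistics import mean, median

estimates = {
    "Alice": 0.70,  # she said "probably"
    "Bob":   0.55,  # he said "maybe"
    "Carol": 0.50,  # she said "perhaps"
    "Dan":   0.80,  # he said "likely"
}

print(f"mean estimate:   {mean(estimates.values()):.2f}")
print(f"median estimate: {median(estimates.values()):.2f}")
```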
Compare your situation with similar examples - use Reference Classes. Here's an example: you know that building a football stadium costs 500M on average and takes a year to complete; then your city's mayor says they'll build one for 80M in six months. Does this seem likely? Different example: you hear a CEO saying they'll quintuple the company's revenue in 3 years - what should you ask? Before you ask to see their numbers, you're better off asking: "is there any example of a similar company doing the same?"
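A rough sketch of how a reference-class check might look, with entirely invented figures for past stadium projects:

```python
# Reference-class check: place the mayor's proposed figure against a set of
# past, comparable stadium projects. All numbers here are invented.
past_stadium_costs_m = [480, 520, 610, 450, 700, 530, 490, 560]  # final costs, in millions
proposed_cost_m = 80

at_or_below = sum(cost <= proposed_cost_m for cost in past_stadium_costs_m)
fraction = at_or_below / len(past_stadium_costs_m)
print(f"{fraction:.0%} of comparable stadiums were built for {proposed_cost_m}M or less")
# If nothing in the reference class comes close, the claim needs very strong evidence.
```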
Have multiple decision criteria. This brings out hidden assumptions, clarifies what's important, and invites people with different perspectives. Profit may be a criterion, but what about the impact on retention or customer perception? This also factors in the concerns of stakeholders whose support may be important.
Consider a wide range of stakeholders and deal with the "politics" of the situation. Decisions can have implications for others that you don't foresee. Getting their support and buy-in matters. Further, there will be groups and individuals whose interests and motivations may be in tension with your own. This is normal and ignoring it will lead to worse outcomes.
Crucially, all the research on individual training that I've seen is clear about two things: (1) people who test higher for cognitive ability and open-mindedness make better predictive judgements, and (2) everyone can successfully learn and train in the points above to make better decisions and more accurate forecasts.
Recommendations from Intelligence Analysts
The US intelligence community has put a lot of money into research on improving the processes for aggregating information and making decisions. I summarise some of their main points:
List key assumptions and scrutinise how confident you and the team are about each of them.
Search for new data and evaluate data quality.
Be explicit about alternative outcomes and about alternative explanations for an observation.
Have a Devil's Advocate and explicitly assign the role. If the issue is important enough, have separate teams pursue different approaches.
Map out several potential outcomes for each potential choice. Engage in scenario planning.
Decision making should be transparent - this invites accountability and incentivises better judgement.
This is similar to the academic research. Two points of different emphasis are the "Devil's Advocate" role and scenario planning, perhaps because of the higher stakes.
In Conclusion:
We can improve the quality of our decision making if we use the techniques and tools above and train ourselves. Think probabilistically, be explicit about alternatives, question assumptions, aggregate information from multiple people and sources.
As I wrote before: it's hard to generalise in this field, but the alternative is to do things for which there is no evidence at all. So in my teams and organisations I'll advocate for the points made above.
References:
The US government's Intelligence Advanced Research Projects Activity (IARPA) programmes on decision making: ACE, FUSE, CREATE, ForeST and SHARP.
https://www.sas.upenn.edu/~baron/journal/16/16511/jdm16511.pdf
https://www.aaai.org/ocs/index.php/FSS/FSS12/paper/view/5623/5867
http://journal.sjdm.org/19/190925a/jdm190925a.pdf
http://journal.sjdm.org/18/18919/jdm18919.pdf
http://journal.sjdm.org/14/14411/jdm14411.pdf
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.720.6264&rep=rep1&type=pdf
http://d1c25a6gwz7q5e.cloudfront.net/papers/download/122011_american_psychologist.pdf
https://www.pnas.org/content/pnas/112/50/15343.full.pdf
http://www.erim.eur.nl/fileadmin/erim_content/documents/CorporatePredictionMarkets-Oct1-FirmX.pdf
https://www.apa.org/pubs/journals/releases/xap-0000040.pdf
Daniel Kahneman's book "Thinking, Fast and Slow"