STARTUP SCIENCE, I.E. EVIDENCE-DRIVEN
Those of you who are regular readers of Pollenizer’s blog will recognise our belief in evidence-driven business model discovery – something we call “Startup Science”.
Startup Science uses hypotheses about customer problems to run experiments, building a base of evidence to reduce risk, identify unique value propositions, reveal underserved market niches, and – in the ideal case – discover entirely new business models. We’ve come to believe in running experiments so much that we even use them to help find balance and growth in our professional and personal lives!
Going back to first philosophical principles, we are of course using the scientific method here – famously described by Nobel Prize-winning physicist Richard Feynman (1918–1988):
“In general we look for a new law by the following process. First we guess it. Then we compute the consequences of the guess to see what would be implied if this law that we guessed is right. Then we compare the result of the computation to nature, with experiment or experience, compare it directly with observation, to see if it works. If it disagrees with experiment it is wrong. In that simple statement is the key to science.”
Paraphrasing Feynman, we:
- Hypothesize: make a guess about what we believe is true.
- Expected behaviour: state what we expect to see happen if our hypothesis is correct.
- Experiment: decide how we will observe a real situation.
- Result: record what actually happened.
- Conclusion: judge whether our hypothesis was right or wrong.
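The loop above can be sketched in a few lines of code. This is a minimal illustration only – the class and field names are hypothetical, not part of any Pollenizer tooling – but it makes the shape of the method concrete: state the guess and the prediction before you run the experiment, record the observation afterwards, and let the comparison (not your preference) decide the conclusion.

```python
# A minimal sketch of the hypothesize -> predict -> experiment ->
# result -> conclusion loop. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Experiment:
    hypothesis: str      # our guess about what we believe is true
    expected: str        # what we expect to see if the guess is right
    observed: str = ""   # what actually happened (filled in later)

    def conclude(self) -> bool:
        # Feynman's rule: if it disagrees with experiment, it is wrong.
        return self.observed == self.expected

# Declare the guess and the prediction up front...
exp = Experiment(
    hypothesis="Freelancers will pay for automated invoicing",
    expected="at least 10 of 50 interviewees say they would pay",
)
# ...then record what the real world actually showed.
exp.observed = "at least 10 of 50 interviewees say they would pay"
print(exp.conclude())  # True: the hypothesis survived this experiment
```

Writing `expected` down before collecting `observed` is the point: it stops us from quietly redefining success after we have seen the data.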
The scientific method – and its reliance on the principle of an underlying, provable, objective reality – is so simple and so self-evidently logical that you can be forgiven for thinking it’s a widely accepted way of thinking.
In fact, recent political events have provided the worrying spectacle of political leaders winning office by doing the exact opposite: literally questioning the very notion of objective reality.
“ALTERNATIVE FACTS”, I.E. COGNITIVE BIAS
When US Presidential counsellor Kellyanne Conway uttered the oxymoron “Alternative Facts” in defence of her colleague Sean Spicer’s demonstrably false claim that “…the largest audience ever to witness an inauguration” was in Washington for President Trump’s swearing-in, she set off a firestorm of controversy in the press and online. Pundits and opponents alike claimed it as proof the incoming administration would prefer to lie about and/or ignore verifiable facts that do not conform to their world view, literally ignoring objective reality.
Unconsciously denying facts because they go against your own strongly held beliefs is known as self-delusion, or Confirmation Bias, perhaps the best known of a family of cognitive & behavioural biases that – if we’re not careful – can distort our ability to objectively analyse information and use it to make decisions. (Consciously denying facts falls into another realm: “Public Relations”.)
Author and psychologist Kendra Cherry has listed some of these biases, which include the following, scarily familiar examples:
- The Anchoring Bias – being overly influenced by the first piece of information that we hear;
- The False Consensus Effect – overestimating how much other people agree with your beliefs, behaviours, attitudes, values;
- The Availability Heuristic – our tendency to estimate probabilities based on our own experiences.
Can you think of people you know who might exhibit the above biases? Welcome to the Actor-Observer Bias: as observers, we don’t believe we are personally susceptible to these biases, but we can definitely think of others who are!
How do we avoid deluding ourselves about the results of our experiments when they disagree with our innate biases?
FEYNMAN TO THE RESCUE
Our guide to the scientific method, Richard Feynman, understood cognitive and behavioural bias. He warned us about hubris:
“It does not make any difference how beautiful your guess is. It does not make any difference how smart you are, who made the guess, or what his name is – if it disagrees with experiment it is wrong. That is all there is to it.”
He warned about both the Anchoring Bias and the Availability Heuristic:
“The first principle is that you must not fool yourself – and you are the easiest person to fool.”
And – in his conclusion to the report on the Challenger Shuttle disaster – he called out Confirmation Bias, which led unqualified managers to ignore the objections of engineers who knew that the Shuttle solid-rocket booster O-rings would leak at the low temperatures forecast for the next morning’s launch:
“For a successful technology, reality must take precedence over public relations, for Nature cannot be fooled.”
BE LIKE A SCIENTIST
Dealing with biases is hard – which is why real scientific progress requires extensive – and adversarial – peer review.
As founders and intrapreneurs using Startup Science methods, we need to be mindful about how we form hypotheses, design experiments, interpret the results, and derive new hypotheses and experiments from those. We need to critically examine the data we collect, and review it with our co-founders, peers, coaches and mentors to ensure we’re seeing and interpreting results clearly.
Confirmation Bias is particularly insidious, manifesting itself as a tendency to seek information that confirms our pet beliefs.
We need to be as self-aware as we can be – equally curious and brutal – so we can mitigate our in-built cognitive and behavioural biases, making it more likely that we realise the benefits of the scientific method in our professional entrepreneurship practice.
Hack, hustle and flearn!
Partner & Startup Scientist
Tim Parsons is a Partner and Startup Scientist at Pollenizer, responsible for leading our Launch/XO Startup Accelerator/Incubator practice. An experienced tech exec who has worked around the world, Tim is now focused on deep-tech areas such as Energy, Health, Food/Agriculture, AI and Space where new, exponential-scale opportunities for entrepreneurship abound.