Despite how it may feel sometimes, most of your colleagues understand the value of high-quality data. They also value the insights data teams deliver to help them perform their roles better. When it works, they see the value. And yet many offices today are home to unloved and neglected Power BI dashboards. You built it, it looks beautiful, and then they did not come.
The reason is simply that data doesn’t solve every problem. In business, context matters, and dashboards can be terrible at delivering the why or the how. It’s visualisation without the story. Intelligent decision making doesn’t require anyone to choose between the data and their gut instinct on a topic. It requires mastering the integration of both - and by the end of this article, you’ll have a strategy for finding the optimal blend!
The Power of Pure Data: When to Trust the Numbers
There are obvious times when looking at the world mathematically just gives us another, sometimes more insightful, perspective on the situation around us. More perspectives often mean better decisions.
One of the earliest machine learning projects I was involved with looked for patterns in an online cosmetic retailer’s data. I found one! There was a large group of women who bought lipsticks for their mothers. I don’t know about you, but I would never have imagined that was a customer segment. Every day became Mother’s Day for the marketing team after that.
Before I moved into data analytics, I used to be the subject expert brought in as a consultant and I know first hand how much value the right data process can add to expertise. I worked in educational assessment, and there is a whole specialised field within statistics for assessment analysis techniques, refined over an entire century. Writing a fair exam needs extreme attention to detail, and the bigger picture can easily get lost. Those algorithms could spot exactly where a problem had occurred. Once flagged up for attention, it was easy for me to identify why it had happened and start the job of fixing it. The data analyst was a valued member of the team!
Sometimes the data changes and people can be very slow to realise. Take Dubai chocolate - the viral hit of 2025. If you were a chocolate manufacturer, keyword monitoring could have notified you of a 100x spike in searches for ‘Dubai chocolate’ after just a couple of popular videos. Without it, you would have had to wait until a video happened to reach your own feed. That early warning would give you the chance to get ahead of your competitors early in the year, producing a pistachio-based chocolate offering before public awareness peaked some time around the summer.
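The kind of keyword monitoring described above can be remarkably simple. Here is a minimal sketch: compare each day’s search volume against a trailing baseline and raise an alert when it spikes past a threshold. The daily counts and the 10x threshold are purely illustrative assumptions, not real search data.

```python
from statistics import mean

# Hypothetical daily search counts for 'Dubai chocolate':
# a quiet baseline week, then the viral spike.
daily_searches = [120, 130, 110, 125, 140, 115, 135,
                  150, 900, 4000, 13500]

BASELINE_DAYS = 7   # how many trailing days define "normal"
THRESHOLD = 10      # alert when volume exceeds 10x the baseline

def spike_alerts(series, baseline_days=BASELINE_DAYS, threshold=THRESHOLD):
    """Return indices of days whose volume exceeds threshold x the
    trailing baseline average."""
    alerts = []
    for i in range(baseline_days, len(series)):
        baseline = mean(series[i - baseline_days:i])
        if series[i] > threshold * baseline:
            alerts.append(i)
    return alerts

print(spike_alerts(daily_searches))  # → [9, 10], the two explosive days
```

Note that day 8 (900 searches) stays under the 10x threshold because the baseline is still the quiet week; a real system would tune the threshold and baseline window to the noisiness of the term being tracked.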
These examples highlight the strengths of data analytics. It handles scale, like vast customer data sets, supremely well. It can see patterns in places that defy human inspection. It doesn’t get distracted or tired, and it can apply a level of objectivity that humans struggle to sustain. It also accepts the fact that things change, sometimes rapidly. In my experience, data analytics removed personal bias from the assessment process, and that in turn created fairer, better tests.
Where Data Fails
One of the challenges with data is that it is context blind. The analytics process could say where there was a problem with my tests, and sometimes even what had happened - a question was too difficult or too easy, or too many people seemed to be guessing it. But it couldn’t tell me why the problem was occurring, and it certainly didn’t know how to fix it. That’s where my years of training in assessment were needed.
Another major issue is the quality of the data. The expression ‘Garbage In, Garbage Out’ (GIGO) is a perennial truth of data analysis. No algorithm can fix poor-quality data collection and maintenance, and relying on flawed, incomplete, or incorrectly measured data can, in fact, be worse than turning to your gut instinct: you feel encouraged and emboldened, believing your decision is backed up by ‘facts’, when it is not.
Another major problem is that it often can’t predict the future very well.
Rare events are very common!
Data can’t ever predict truly new, disruptive events or innovations. Ironically, rare events happen much more often than we think! Given a large enough population and a long enough time frame, even an event with a tiny probability is almost certain to occur eventually. This is sometimes called the law of truly large numbers (not to be confused with the Law of Large Numbers, which is about averages converging). Unexpected events happen all the time.
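The arithmetic behind this is worth seeing. For independent trials, the probability that an event with per-trial probability p occurs at least once in n trials is 1 - (1 - p)^n. The 1-in-1,000-per-day figure below is an illustrative assumption, but it shows how a ‘negligible’ daily risk becomes a near certainty over a decade.

```python
# Back-of-envelope illustration of the 'law of truly large numbers'.
p_per_day = 1 / 1000  # assumed probability the rare event happens on any given day

def prob_at_least_once(p, trials):
    """P(event occurs at least once in `trials` independent trials)."""
    return 1 - (1 - p) ** trials

for days in (30, 365, 3650):  # a month, a year, a decade
    print(f"{days} days: {prob_at_least_once(p_per_day, days):.1%}")
# 30 days:   ~3%
# 365 days:  ~31%
# 3650 days: ~97% - the 'rare' event is now close to inevitable
```

The same logic scales across a population: an event that is vanishingly unlikely for any one customer will still hit someone in a customer base of millions.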
In analytics, we’re not very good at including rare or unexpected events in our models. I could monitor a selection of, say, 1,000 rare events and model how each would impact a business. But predicting which of those 1,000 events will actually happen next year is hard. It’s a level of risk management few businesses see producing a return on investment, unless an expert has suggested that a particular risk or opportunity is growing.
Now, add in the fact that ‘machine learning’ means the model was trained on historical data, and you start to see how machine learning models can quickly become outdated. It takes time to re-train models and get them producing up-to-date information.
The Essential Role of Expert Intuition (The Qualitative/Human Argument)
Intuition is not magic. It’s pattern recognition built over years of experience. All of us are able to process information subconsciously before our thinking brain has caught up.
Speed is one of the crucial benefits of expert input. In many business operations, rapid, complex, high-stakes decisions are the norm, and there is no time for deep analysis.
For example, before any professional football match, the data analysts will advise managers and players what attacking patterns are favoured by the team they are about to face. But once the game has started, players have to rely on fast-paced processing. They need positional information about themselves, their teammates and the opposition to make a rapid analysis, anticipate the next moves, and simultaneously execute an action. Blink and you’ll miss it.
Another reason expert intuition is essential is because context matters. Before looking at any data on student learning, I will always have a list of questions about the theory of education the data team is working with, or the assessment paradigm they’re using. I can guarantee that if there is a misalignment between the assessment paradigm and the data collection process, the results will be unhelpful or obviously a bit ‘off’. You need expert reviewers to know what data to ask for in the first place, or to spot assumptions that should never be made.
Finally, a lot of things just can’t easily or accurately be measured directly, but that doesn’t stop us trying! Things like team morale, happiness and market sentiment are important, but hard to measure. Very often we have a thing we want to measure - do our customers like us? - so we measure the only thing we can see in the data: what do customers who write reviews online say about us? It may be that your best customers are the ones who wander in and out of your store, never sign up to your loyalty card programme, and leave no data trail behind. I bet your store staff will know them by name!
We’re only human after all
In all honesty, algorithms now outperform human decision making, on aggregate, in all sorts of arenas, from stock market picks to sorting out congestion in our cities. Often, we confuse genius with getting lucky. The main reason machines are pulling ahead is not their brilliance but our poor thinking: humans can be swayed by emotions, fatigue and cognitive biases. Here are the biggest culprits:
1. Confirmation bias
Confirmation bias is the natural human tendency to prioritise or seek out data that aligns with an existing viewpoint. It explains so much social media behaviour! We are really slow to accept counter-arguments that go against our way of viewing the world. Add to that the fact that many people on social media are delivering their opinions on ‘wicked problems’ - problems that are complex, and difficult to solve because they have incomplete, contradictory and evolving requirements - and you have a recipe for bias. Confirmation bias affects all our decisions, big or small.
2. Recency bias
Another particularly enduring and pervasive challenge is the recency or availability heuristic. Managers conducting performance reviews will tend to give disproportionate weight to an employee’s work in the last one or two months, rather than their entire year’s work. Candidates are more likely to get hired if they are one of the last candidates to be interviewed! Lawyers know the power of closing arguments to swing decisions in their favour. Allowing the most recent or emotionally impactful event to disproportionately influence a decision is a common cognitive shortcut.
3. The success trap
Take a moment to picture yourself achieving your dreams. I can bet one thing - the image in your head places you at the top of the mountain. We rarely invest the same amount of time creating clear images of the climb. We imagine that a past success is easy to replicate, and contradictory or surprising data is much easier to dismiss when we have a long track record of doing well. It encourages us all to rely too heavily on past methods and ignore the very things that might derail our plans. It’s why risk management has evolved to require specialised focus.
Creating the Optimal Blend
The good news is that there is no need to choose between using data or your gut instinct. When the two can work harmoniously together, you will make better decisions. Here is how to combine them.
Step 1: Get Data to Check your Gut
If your gut tells you to go a certain way, look for the minimum data required to validate or challenge that gut response. No matter how successful you have been in the past, treat your intuition as a valuable hypothesis. It might be true, but new data might show that it is not.
Step 2: Let your Gut Instinct Check the Data
When data yields a surprising or counter-intuitive result, pause. Use expertise to question the methodology, context, and assumptions behind the data set. If a large number of people in your survey reported being over the age of 100, you would stop and question if something had gone wrong with the way age was being calculated. Do the same with any number or pattern in data that looks wrong. I can assure you that errors in data analytics processes are very common!
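A basic sanity check like the one described above is easy to automate. The sketch below flags survey rows whose reported age falls outside a plausible range so an expert can review them; the survey rows and the 0-110 bounds are illustrative assumptions.

```python
# Flag implausible values for expert review before any analysis runs.
PLAUSIBLE_AGE = range(0, 111)   # generous bounds: 0 to 110 inclusive

survey = [
    {"respondent": "r1", "age": 34},
    {"respondent": "r2", "age": 134},   # likely a data-entry or calculation error
    {"respondent": "r3", "age": 28},
    {"respondent": "r4", "age": -2},    # impossible value
]

suspect = [row for row in survey if row["age"] not in PLAUSIBLE_AGE]
print(suspect)  # these rows need investigating, not silent deletion
```

The important design choice is the last comment: suspicious rows are surfaced for a human to question the methodology, not quietly dropped, because the pattern of errors is itself a clue to what went wrong upstream.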
Step 3: Solve your Problem with Gut and Data working in Tandem
At some point a decision needs to be taken. Always begin with data if possible, but then let expertise guide the interpretation and implementation of the solution that emerges. Use expert judgment to interpret complex, high-uncertainty data (e.g., market entry decisions).
Step 4: Be prepared for Rapid Course Corrections
Still treating your chosen solution as a hypothesis about what will be effective, monitor what is happening. Use your gut instinct for quick course corrections, then use data to measure the effectiveness of each correction afterwards.
Mastering the Synthesis
Many people are concerned about the future. I don’t share their fears. There is no shortage of problems in this world to solve, and the best leaders and strategists will always be in demand. They do not need to choose one side; the best actively use both in a constant feedback loop. You can apply this framework by asking ‘What does the data say?’ AND ‘What does my experience feel is missing?’ With those two working together, you don’t need to be afraid of making decisions. The optimal blend is less about minimising risk and more about maximising the informed confidence of the decision.

