Fall in Love with the Problem, Not the Technology
I’m furiously writing this post right now, as it’s Monday morning and I normally finish my Monday article by the Thursday before! Alas, my week crept up on me, and thus I’m in a bit of a race against time.
I’ve been in the working world for the last 27 years, and I’ve been through pretty much every new tech: ERP, CRM, the data warehouse, big data, visualisation tools, Hadoop, cloud data warehouses, data lakes, lakehouses, data mesh, data fabric, etc. etc. etc.
WOW! The technology world is filled with distractions. Every day a new tool, platform, concept, or algorithm emerges, promising to revolutionise your organisation, as we have seen with LLMs. For many data and technology professionals, it’s always tempting to jump straight to the “how”: running POCs, building systems, implementing tools, and hopefully deploying those solutions. That’s fine, right? But the challenge I have with many of these approaches is that few stop to ask themselves an important question: “why” are we doing this?
Look around you: this has no doubt happened on your shift, whether you are in a position of leadership or in a team that is feverishly going to data conferences and loitering with intent at all the vendor booths, only to be shepherded through the sausage machine! What comes out the other side of those conversations is an output, not an outcome.
The shiny new toy is heralded just as Simba was anointed Mufasa’s heir!
The sad news is that many companies fall into this trap far too often, and generally, my belief is that if you are focused on the shiny new toy, you’re solving for irrelevance. Technology is just a tool, a means to an end, not the end. I’m sure you have heard many say that before, and I’m saying it again!
But what should you be focusing on then?
Segue to the title: it’s the problem that should captivate your attention. Falling in love with the problem, not the technology, is the key to creating real, sustainable value.
Is There a Problem with the “How”?
Yes, there is, and it’s this: many organisations prioritise implementation over understanding, and it’s quite common to see teams fall into this trap. This is how it normally progresses:
Person 1: “Hey, I just went to said tech/research firm’s conference, and everyone is talking about X new tech”
Person 2: “Oh really, no way!”
Person 1: “Yes way, let’s roll out this new tool, because I’ve heard everyone’s using it!”
Person 2: “Okay, let’s do it”
Okay, yes, that is probably far too long a conversation! Ha!
But I’m sure you have heard this many times.
What are the consequences of this approach?
Here are some examples: unused dashboards, POCs never getting out of the sandbox, wasted resources (people, time, money), and leaders asking why their investments aren’t translating into results.
Sound familiar?
This happens because we forget the “why.” We focus on the mechanics of how something will work, whether that’s an LLM or an AI chatbot, without understanding the context of the problem we’re trying to solve. The result? Technology becomes a burden, not an enabler.
Start with the “Why”: A Value-Centric Framework
I know we talk a lot about “value”, but I’m coming at it from a slightly different angle this time, to offer some kind of framework that will limit the panic buying. In my new podcast episode, which is likely to be out this week, I spoke to Malcolm Hawker about “mindset” and how data and technology teams need to shift theirs.
Well, I’ve decided to put this framework forward as a way to shift the thinking, because it reminds folks that we need to put the problem right back where it belongs: front and centre. I know this isn’t a new framework, and I’m sure there are many out there, but I’ve attempted to simplify it as much as I can. I’ve also offered some vital questions that you and your teams can ask:
1. Clarify the Problem
What is the real challenge/issue/problem? Strip away assumptions and get to the root of it. This is not about collecting requirements; it’s about deeply understanding what’s broken or missing, and that’s the difference. I’ve seen many analysts go off searching for requirements, not problems.
Guiding Questions:
- What specific behaviour, outcome, or result is not happening that should be?
- Who is experiencing this problem, and how does it affect them?
- If we could solve only one thing in this area, what would it be, and why?
- Is this problem rooted in process, people, data, technology, or something else?
Example:
- Bad example: “We need better reporting on sales.”
- Good example: “Why are our conversion rates stalling despite higher traffic?”
2. Understand the Stakes
This part delves into why the problem matters: to whom, why we should solve it, and what happens if we don’t. Again, this is the simple way of really peeling back the onion and focusing on the value conversation rather than a tech one. If you don’t understand the stakes, you risk chasing solutions that no one values or, for that matter, cares about.
Guiding Questions:
- Who cares most about solving this problem, and why?
- What’s the tangible cost of not addressing this problem (e.g., revenue, time, morale)?
- What opportunities will this unlock if we get it right?
- How would we explain this problem, and its importance, to a customer or shareholder?
Example:
- Low stakes: “We want a flashy dashboard for team meetings.”
- High stakes: “Our customer complaints have doubled in the past year, and we don’t know why.”
3. Measure Success
I think this is one area that often gets missed, and for the life of me I don’t know why. Maybe it’s because many opt to chase the vanity metrics, which make them look good but communicate naff all about the outcome. That’s why we must define what success looks like before any technology or data is touched. Let’s get this straight: success isn’t “a working dashboard”. It’s a measurable outcome, like “a 10% improvement in customer retention.”
Guiding Questions:
- What would success look like in plain business terms, not technical outputs?
- How will we know, objectively, that this problem is solved?
- Are there secondary benefits we can measure alongside the primary outcome?
- What does failure look like, and how will we avoid it?
Example:
- Bad example: “The team will have access to a dashboard.”
- Good example: “Customer satisfaction scores will increase by 15% within six months.”
4. Challenge Solution Bias
As they say, “every team has a favourite hammer”, and they’ll see every problem as a nail. We must resist the urge to walk around banging everything we see! We need to ask whether the solution truly fits the problem, or whether it’s just the easiest or trendiest option available.
Guiding Questions:
- Are we jumping to a solution because it’s familiar or because it’s right?
- What would we do if this technology didn’t exist?
- Have we considered simpler, lower-tech approaches?
- How does this solution address the root cause, not just the symptoms, of the problem?
Example:
- Bias in action: “We’ll use AI because it’s what everyone else is doing.”
- Unbiased approach: “Does AI genuinely address this issue better than improving our existing processes?”
5. Iterate Ruthlessly
No problem is ever solved the first time you go at it. When you are solving real problems, you have to constantly iterate. Keep testing, learning, and refining to make sure your solution delivers genuine value. It’s through iteration that you learn whether or not this is the right solution. It also gives you the chance to say, “hey, it’s not working, let’s not spend any more time or money”, which sends you back to check whether you defined the problem properly in the first place.
Guiding Questions:
- What feedback loops have we built into this process to ensure we’re on track?
- Are we measuring outcomes, or just outputs?
- How often are we revisiting the original problem statement to ensure we haven’t drifted?
- If this solution doesn’t work, how quickly can we adapt or pivot?
Example:
- Static approach: “The dashboard is complete; we’re done.”
- Dynamic approach: “We’ve rolled out the first version, gathered feedback, and made adjustments to better meet user needs.”
Falling in Love with the Problem
Let me get one point straight here: falling in love with the problem doesn’t mean you end up ignoring the technology or data. As I mentioned before, it means you see them as a means to an end, not the end itself. This is where I come back to my conversation with Malcolm: it’s about being relentlessly curious, questioning everything, and staying laser-focused on outcomes. It’s about changing the way you think and adapting to new situations; in other words, changing your mindset.
A Provocative Challenge
So, here’s my challenge to data and technology professionals.
Take a hard look at your current projects. How many of them start with a clear problem and a defined value statement? How many are driven by the allure of a tool or the pressure to keep up with competitors?
If you’re honest, the answers might make you uncomfortable. But that’s a good thing. Discomfort is where growth begins, patterns break, and mindsets change.
Technology is fleeting. The ability to understand and solve meaningful problems is timeless.
Fall in love with the problem, and you’ll never go wrong.