Most data leaders obsess over cloud costs and platform subscriptions whilst the real budget killer sits right in front of them: talented engineers spending time on the wrong things.
In our recent LinkedIn Live session, we brought together three experts who’ve mastered the art of maximising return from data talent. The uncomfortable truth? Your biggest expense isn’t Snowflake or that new analytics platform. It’s paying senior engineers to firefight instead of innovate.
This article breaks down insights from our LinkedIn Live featuring Jack Colsey (Analytics Manager at Incident.io), Harry Gollop (Co-Founder at Cognify Search), and Aaron Phethean (Founder at Matatika).
Data teams are burning money in ways that don’t show up on your cloud bill.
Meanwhile, finance is asking harder questions about data spend whilst business stakeholders demand faster insights. The pressure is mounting, but most leaders are optimising the wrong things.
1. They Hire for Pattern Recognition, Not Just Technical Skills
The insight: Experience isn’t about knowing more tools – it’s about knowing what not to build.
Jack Colsey shared this from his experience scaling at Monzo: “At Monzo, we had a team of 15 data people before we hired our first data engineer. About a month in, I thought we really should have hired this person a year ago. We ended up migrating all our models to dbt, and it was a mammoth effort that would have been so much easier earlier.”
Harry Gollop sees this pattern repeatedly: “One senior engineer at £120k is far more effective than two people at £60k who can create more technical debt down the line.”
The result: Senior hires ramp faster, avoid common pitfalls, and free up the rest of your team for strategic work.
2. They Eliminate Reactive Work Before It Destroys Momentum
The insight: The biggest productivity killer isn’t complex technical challenges – it’s noisy, repetitive issues that fragment attention.
Jack described their approach at Incident.io: “We focus our weekly planning on what projects have been pushed forward. If you’re ever in a week where you’ve just been doing reactive work and firefighting, it’s a signal that something has to change.”
This connects directly to tool decisions. As Jack explained: “If you’ve got people building custom data feeds and dealing with breaking API changes every two weeks, is that worth your data engineers’ time?”
The result: Teams spend 70-80% of their time on strategic projects that drive business value.
3. They Measure Impact Through Stakeholder Success
The insight: The best data professionals multiply the effectiveness of everyone around them.
Harry shared his perspective: “How many downstream stakeholders – analysts, scientists, analytics engineers – do they have to come to you when they want new data? Have you decreased that time or made their lives easier?”
Jack emphasised the qualitative test: “You should be able to talk to any stakeholder and ask how upset would they be if you pulled this person away? That’s a really good indicator of someone who helps me think about my biggest problems versus someone who just delivers what I ask for.”
The result: Data teams become strategic partners with clear evidence of business impact.
4. They Make Tool Decisions Based on Opportunity Cost
The insight: The best teams don’t optimise for features – they optimise for freeing up human time for high-value work.
Jack outlined their decision framework: “It comes down to how much of a solved problem is it? If you’re spending manual time on something that’s well-solved elsewhere, that’s not a good use of your senior engineers’ time.”
The calculation isn’t just tool cost – it’s the human cost of the status quo. As Jack noted: “If one person every month is spending time updating five data sources and fixing things that break, that’s a huge portion of a three-person team.”
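To make that calculation concrete, here is a back-of-the-envelope sketch. All figures (hourly rate, hours per month, tool subscription) are illustrative assumptions, not numbers from the session:

```python
# Back-of-the-envelope opportunity-cost check for a "build vs buy" decision.
# All figures are illustrative assumptions, not quoted from the session.

def annual_maintenance_cost(hours_per_month: float, hourly_rate: float) -> float:
    """Loaded annual cost of the engineering time keeping a manual process alive."""
    return hours_per_month * 12 * hourly_rate

# Assumption: a senior engineer at roughly £75/hour loaded, spending
# ~20 hours/month updating data feeds and fixing breaking API changes.
human_cost = annual_maintenance_cost(hours_per_month=20, hourly_rate=75)
tool_cost = 12_000  # hypothetical annual subscription for a managed alternative

print(f"Human cost of status quo: £{human_cost:,.0f}/year")
print(f"Tool cost:                £{tool_cost:,.0f}/year")
print("Tool pays off" if tool_cost < human_cost else "Keep the manual process")
```

Even at conservative assumptions, the maintenance time alone can exceed a tool subscription – and that is before counting the strategic work the engineer isn’t doing.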
The result: Tool decisions become strategic investments that demonstrably free up engineering capacity.
5. They Use AI to Amplify Senior Talent
The insight: AI tools are force multipliers for experienced engineers who can validate outputs and spot problems.
Jack shared their experience: “We’re using Claude Code and Copilot. AI is great at doing those repeatable debugging processes and coming to logical conclusions. It’s like having someone that can help you debug before you even get there.”
Harry added the crucial caveat: “There’s a trap where junior engineers might take AI outputs at face value. Senior engineers can spot when something’s wrong much more easily.”
The result: Senior engineers become significantly more productive at routine tasks.
Jack described how this approach transforms team dynamics: “Senior people are able to ramp up quickly. They know what patterns work and don’t work. With our current team, I can point them in an area and leave them to deliver.”
The impact extends beyond individual productivity. Harry noted: “If you can make everyone who touches the data stack 30% better at their job, that’s huge value. It’s not just how good you are individually – it’s how much better you’re making everyone else.”
This multiplier effect separates high-performing data teams from those that just keep the lights on.
At Matatika, we see this in action when teams switch from row-based pricing models that punish growth to performance-based alternatives that align costs with actual business value. Teams immediately redirect budget from vendor fees to strategic hiring.
How do I know if my team is stuck in reactive mode? Track weekly planning: if you’re consistently firefighting instead of pushing strategic projects forward, you need structural changes. Aim for 70-80% strategic work.
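One lightweight way to track this is to tag each week’s work items and compute the strategic share. The tags and hours below are hypothetical – adapt them to whatever your planning tool exports:

```python
# Sketch: compute the strategic vs reactive split from tagged weekly tasks.
# Task names, tags, and hours are hypothetical examples.

tasks = [
    {"name": "dbt model refactor",      "hours": 12, "tag": "strategic"},
    {"name": "broken API hotfix",       "hours": 4,  "tag": "reactive"},
    {"name": "stakeholder dashboard",   "hours": 8,  "tag": "strategic"},
    {"name": "pipeline firefight",      "hours": 6,  "tag": "reactive"},
]

total_hours = sum(t["hours"] for t in tasks)
strategic_hours = sum(t["hours"] for t in tasks if t["tag"] == "strategic")
ratio = strategic_hours / total_hours

print(f"Strategic share this week: {ratio:.0%}")
if ratio < 0.7:  # the 70-80% target suggested above
    print("Below target – a signal that something structural has to change")
```

A consistently low ratio over several weeks is the signal Jack describes: reactive work is fragmenting the team’s attention.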
What’s the real cost of hiring the wrong person? Beyond salary, consider six months end-to-end (opening role, notice periods, ramp time) where that business value isn’t being delivered. For a senior role, that’s often £100k+ in opportunity cost.
How do I measure data team ROI beyond technical metrics? Focus on stakeholder satisfaction and time-to-insight. The best test: how upset would business users be if you pulled a specific person away from supporting them?
When should I replace tools versus optimise what I have? Calculate the human cost of manual processes. If engineers spend more than 10% of their time on repetitive tasks that are well-solved elsewhere, tool changes usually pay off.
How do AI tools change hiring strategy? AI amplifies the productivity gap between senior and junior engineers. Senior people can validate outputs and catch hallucinations. This makes the case for fewer, more experienced hires even stronger.
The teams getting ahead aren’t buying more tools or hiring more people. They’re optimising the intersection of talent, process, and technology to eliminate waste and amplify impact.
Whether you’re hiring your first data engineer or scaling an existing team, the principles remain: hire for pattern recognition, eliminate productivity killers, measure business impact, and use technology to multiply human potential.
Book Your ETL Renewal Planning Session
Ready to tackle your data costs strategically? Our conversation with these experts confirmed what we see daily: visibility comes first, optimisation second, strategic switching third.
We’ll review your current setup, identify optimisation opportunities, and create a roadmap for sustainable cost management. If switching makes sense, we’ll show you how Mirror Mode eliminates the traditional risks.