Most data leaders know their costs are rising. What they don’t know is where to cut without breaking critical operations. In a recent LinkedIn Live session, three experts from across the data stack shared practical insights on managing costs whilst maintaining delivery speed.
This article summarises insights from our recent LinkedIn Live event, where we brought together Ian Whitestone (Co-Founder, Select.dev), David Jayatillake (VP of AI, Cube), and our founder Aaron Phethean to tackle the real challenges behind rising data costs.
Ian Whitestone brings a data practitioner’s perspective from his time at Shopify and now helps teams optimise Snowflake costs at Select.dev. His insight: “90% of teams should focus on using what they have better before jumping to new platforms.”
David Jayatillake tackles BI cost challenges at Cube, where he helps teams build efficient semantic layers. As he observes: “Teams often spend four times more on people than tools, yet scrutinise a £400 monthly tool purchase more than hiring decisions.”
Together with Aaron Phethean, who guides teams through zero-risk ETL migrations at Matatika, these experts shared battle-tested strategies for controlling costs without sacrificing performance.
Here’s what most data leaders get wrong: they assume rising costs mean they need better tools. Our discussion revealed that the problem is often poor utilisation of existing infrastructure.
As Ian pointed out: “Data teams have literally never had to have cost management and control as one of the things under their scope. So everyone’s doing a terrible job of this.” The result? Teams burning through budgets on unused pipelines, oversized warehouses, and inefficient query patterns whilst leadership demands cost cuts.
The pressure is mounting. Finance teams see data costs growing faster than business value, but switching platforms feels too risky. Meanwhile, data teams spend more time firefighting than innovating.
The insight: Your people are your biggest expense, so optimise their time first.
David highlighted this counterintuitive truth: “Even if teams are spending hundreds of thousands on tools, they’re spending four times as much on people.” Yet procurement processes scrutinise tool purchases far more than hiring decisions.
How we see this play out: Teams regularly discover forgotten development environments, unused pipelines, and inefficient sync schedules that collectively drain 20-30% of their data budget.
Expected outcome: Teams become more strategic, spending less time on operational overhead and more on business-critical analysis.
The insight: Most cost problems come from waste, not platform choice.
Ian shared a powerful example: “The first time customers log into our product, they’ll be like, ‘Oh, there’s a dynamic table or DAG that someone left in dev that’s wasting £30k a year.’ They just turn it off.”
This is why we’ve built Matatika around performance-based pricing rather than arbitrary row counts. You pay for infrastructure usage that directly correlates with business value, not technical overhead.
How to implement:
Expected outcome: Teams typically save 20-30% on infrastructure costs without touching their core data stack.
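The kind of housekeeping Ian describes can be mechanised. Below is a minimal sketch of an “idle resource” scan: given an inventory of pipelines and warehouses with their monthly cost and last-used date, it flags anything still billing but untouched for 60 days. The resource names, costs, and the `decommission_candidates` helper are illustrative; in practice the inventory would come from your warehouse’s usage views or your ETL tool’s API.

```python
from datetime import date, timedelta

# Hypothetical inventory records; in practice these would come from your
# warehouse's usage views or your ETL tool's API.
resources = [
    {"name": "dev_dynamic_table", "monthly_cost_gbp": 2500, "last_queried": date(2024, 9, 1)},
    {"name": "prod_orders_pipeline", "monthly_cost_gbp": 1800, "last_queried": date(2025, 1, 10)},
    {"name": "old_poc_warehouse", "monthly_cost_gbp": 400, "last_queried": date(2024, 6, 15)},
]

def decommission_candidates(resources, today, idle_days=60):
    """Return resources that still cost money but haven't been used recently."""
    cutoff = today - timedelta(days=idle_days)
    return [r for r in resources if r["last_queried"] < cutoff and r["monthly_cost_gbp"] > 0]

today = date(2025, 1, 15)
candidates = decommission_candidates(resources, today)
annual_saving = sum(r["monthly_cost_gbp"] for r in candidates) * 12

# The forgotten dev table and the PoC warehouse surface as candidates.
print([r["name"] for r in candidates])
print(f"Potential annual saving: £{annual_saving:,}")
```

Even this naive version reproduces the £30k-a-year discovery Ian mentions: two forgotten resources at £2,900 a month add up fast.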
The insight: You can’t manage what you don’t measure.
David’s approach at his previous company involved using Terraform to create separate Snowflake warehouses for each team: “We gave each engineering team multiple Snowflake warehouses so we could understand who was spending what.” This created accountability and prevented cost surprises.
How to implement:
Expected outcome: Teams develop cost awareness and self-regulate usage, preventing budget overruns.
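The per-team accountability David describes boils down to a join between metering data and an ownership map. A minimal sketch, assuming Snowflake-style metering rows (warehouse name plus credits used, as exposed by `SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY`); the warehouse names, owners, and credit price here are all hypothetical.

```python
from collections import defaultdict

# Hypothetical metering rows; in Snowflake these would come from
# SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY.
metering = [
    {"warehouse": "ANALYTICS_WH", "credits_used": 120.0},
    {"warehouse": "MARKETING_WH", "credits_used": 45.5},
    {"warehouse": "ANALYTICS_WH", "credits_used": 80.0},
]

# One warehouse per team makes attribution trivial -- the accountability
# David describes.
warehouse_owner = {"ANALYTICS_WH": "analytics", "MARKETING_WH": "marketing"}
CREDIT_PRICE_GBP = 2.50  # illustrative rate; use your contract's actual price

def spend_by_team(metering, owners, credit_price):
    """Aggregate credit spend into pounds per owning team."""
    totals = defaultdict(float)
    for row in metering:
        team = owners.get(row["warehouse"], "unallocated")
        totals[team] += row["credits_used"] * credit_price
    return dict(totals)

team_spend = spend_by_team(metering, warehouse_owner, CREDIT_PRICE_GBP)
print(team_spend)
```

Anything landing in the “unallocated” bucket is itself a useful signal: it means a warehouse exists that no team has claimed ownership of.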
The insight: Not all usage-based pricing reflects actual value delivery.
As Ian noted: “Consumption-based pricing—everyone looks at Snowflake’s success and thinks they need this model. But does reloading a couple of rows that updated really reflect the value you’re providing?”
How to implement:
Expected outcome: More predictable costs that scale with business value, not arbitrary technical metrics.
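Ian’s point about reloading a few updated rows can be made concrete with some back-of-the-envelope arithmetic. The sketch below compares a row-count pricing model with a compute-time model for a daily incremental sync; both rates are invented for illustration and reflect no specific vendor’s published pricing.

```python
# Hypothetical rates -- neither reflects any specific vendor's pricing.
PRICE_PER_MILLION_ROWS_GBP = 10.0    # row-count ("active rows") model
PRICE_PER_COMPUTE_MINUTE_GBP = 0.05  # performance/usage model

def row_model_cost(rows_synced):
    return rows_synced / 1_000_000 * PRICE_PER_MILLION_ROWS_GBP

def compute_model_cost(minutes):
    return minutes * PRICE_PER_COMPUTE_MINUTE_GBP

# A wide table where a few columns update often: each daily sync re-counts
# 5M rows, but the incremental work itself takes only ~2 minutes of compute.
daily_rows, daily_minutes = 5_000_000, 2
monthly_row_cost = 30 * row_model_cost(daily_rows)
monthly_compute_cost = 30 * compute_model_cost(daily_minutes)
print(monthly_row_cost, monthly_compute_cost)
```

Under these assumed rates the same workload costs £1,500 a month when billed by rows touched but about £3 when billed by work done, which is exactly the gap between a technical metric and the value delivered.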
The insight: When vendor costs spiral beyond value delivered, switching becomes necessary—but it doesn’t have to be risky.
The experts agreed that for teams spending under £1 million annually, optimisation usually beats migration. But when ETL renewal brings significant price increases without added value, a strategic switch can deliver substantial savings.
This is where Matatika’s Mirror Mode approach eliminates traditional migration risks. Rather than a disruptive “rip and replace,” Mirror Mode runs your new ETL system in parallel with the existing one, validating performance and data quality before any cutover. As Aaron explained: “You don’t make decisions based on promises—you make them based on proof.”
How we implement this:
Expected outcome: Significant cost reductions (often 30-50%) without migration risk or operational disruption.
Our discussion revealed a clear hierarchy: optimise people productivity, eliminate waste, add visibility, question pricing models, then consider platform changes. Most teams skip straight to the last step when the first four would solve their problems.
What we’ve learned from guiding teams through this process: start with impact analysis. Identify your three highest-cost processes and map them to business value. If you can’t explain why something costs £10k monthly, it’s probably a good candidate for optimisation.
Sequence your efforts by focusing on quick wins first—decommissioning unused resources, optimising sync schedules, and adding cost alerts. These changes require minimal technical effort but deliver immediate savings.
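Of the quick wins above, cost alerts are the easiest to stand up. A minimal sketch, assuming you can pull a daily spend series from your billing export: flag any day that exceeds the trailing average by a set percentage. The threshold and spend figures are illustrative.

```python
def cost_alert(daily_spend_gbp, threshold_pct=25):
    """Alert when the latest day exceeds the trailing average by threshold_pct."""
    *history, latest = daily_spend_gbp
    baseline = sum(history) / len(history)
    overspend_pct = (latest - baseline) / baseline * 100
    if overspend_pct > threshold_pct:
        return f"ALERT: spend £{latest:.0f} is {overspend_pct:.0f}% above baseline £{baseline:.0f}"
    return None

# A runaway day triggers the alert; a normal day stays quiet.
spike = cost_alert([400, 420, 390, 410, 650])
quiet = cost_alert([400, 420, 390, 410, 430])
print(spike)
print(quiet)
```

In production you would route the alert string to Slack or email rather than printing it, but the detection logic is no more complicated than this.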
Track success through cost per insight delivered, not just absolute spend reduction. The goal is efficient growth, not arbitrary cost cutting.
David shared how his team achieved dramatic improvements through simple changes: “We had a dashboard in the finance team that was costing hundreds of thousands of pounds a year. I went and parsed the query logs, figured out how to join queries back to dashboards, and aggregated the cost. It wasn’t obvious until we measured it.”
After implementing proper cost allocation and monitoring, they could demonstrate clear ROI for their data infrastructure whilst identifying significant waste. The key was making costs visible and actionable for stakeholders.
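The join David describes, from query logs back to dashboards, is usually possible because many BI tools embed a dashboard identifier in a query comment. A minimal sketch under that assumption; the tag format, query rows, and costs are hypothetical.

```python
from collections import defaultdict
import re

# Hypothetical query-log rows; the dashboard tag in the SQL comment is an
# assumed convention, not a universal standard.
query_log = [
    {"query_text": "SELECT ... /* dashboard:finance_kpis */", "cost_gbp": 12.40},
    {"query_text": "SELECT ... /* dashboard:finance_kpis */", "cost_gbp": 9.10},
    {"query_text": "SELECT ... /* dashboard:sales_pipeline */", "cost_gbp": 3.25},
    {"query_text": "SELECT 1", "cost_gbp": 0.10},  # untagged ad-hoc query
]

TAG = re.compile(r"/\* dashboard:(\w+) \*/")

def cost_by_dashboard(query_log):
    """Aggregate query cost per dashboard, bucketing untagged queries together."""
    totals = defaultdict(float)
    for row in query_log:
        match = TAG.search(row["query_text"])
        key = match.group(1) if match else "untagged"
        totals[key] += row["cost_gbp"]
    return dict(totals)

dashboard_cost = cost_by_dashboard(query_log)
print(dashboard_cost)
```

Once costs roll up per dashboard, the expensive finance dashboard in David’s story stops hiding inside an undifferentiated warehouse bill.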
From our experience helping teams through similar optimisations: the most telling insight came from Ian’s customer experience—teams regularly discover £30k annual savings just from turning off forgotten development environments. These aren’t complex optimisations; they’re basic housekeeping tasks that deliver immediate impact.
How do I know if my data stack is bloated or just expensive?
Track your cost per active user or cost per insight delivered. If costs are growing faster than business value, you likely have efficiency problems before vendor problems.
Should I optimise first or start evaluating alternatives?
Always optimise first, unless you’re facing immediate renewal pressure. Most teams can achieve 20-40% savings through better utilisation of existing tools.
What’s the biggest red flag that indicates cost problems?
When finance complains about data spend but no one on the data team can explain where the money goes. Lack of visibility always precedes runaway costs.
How do I make the business case for cost optimisation work?
Calculate the opportunity cost of engineer time spent on manual processes. Frame optimisation as “buying back” strategic capacity, not just cutting costs.
Ready to tackle your data costs strategically? Our conversation with these experts confirmed what we see daily: visibility comes first, optimisation second, strategic switching third.
We’ll review your current setup, identify optimisation opportunities, and create a roadmap for sustainable cost management. If switching makes sense, we’ll show you how Mirror Mode eliminates the traditional risks.
Book Your ETL Renewal Planning Session