
BI Has the Worst ROI in the Modern Data Stack – How to Escape the Service Trap and Drive Real Decisions

Business intelligence is broken. Too many dashboards, not enough decisions. Learn from Count CEO Ollie Hughes how to escape the BI service trap, rebuild trust, and drive real impact through operational clarity and prioritisation.

Introduction

Business intelligence (BI) was meant to be the jewel in the crown of the data stack, the place where numbers become insights and insights become decisions. Yet if you ask most data leaders today, BI delivers the worst ROI in the stack. Teams are drowning in dashboards, executives don’t trust the numbers, and tools that haven’t evolved in 20 years are still eating up budgets.

Ollie Hughes, CEO of Count, argues that the industry has got it wrong. More dashboards aren’t the answer. AI won’t fix reporting chaos. And data teams need to stop behaving like service desks.

This article is based on Ollie’s appearance on the Data Matas podcast and translates his perspective into actionable lessons you can apply right now.

What you’ll learn:

  • Why BI tools are stuck in the past and what that means for ROI
  • How to spot the “service trap” in your team before it kills your value
  • Why accuracy alone doesn’t build trust in data
  • How ruthless prioritisation changes the credibility of a data team
  • Practical steps to create operational clarity instead of dashboard noise

Meet Ollie Hughes

Ollie Hughes is the co-founder and CEO of Count, a canvas-based BI platform built around collaboration, not dashboards. His mission is to reframe how organisations use data: not as a firehose of metrics, but as a tool for genuine decision-making.

He has spent years inside the industry, from building data teams to leading a new wave of BI innovation, and has become one of its sharpest critics. His critique isn’t abstract. It’s grounded in the frustrations most practitioners feel every day: wasted reports, confused stakeholders, and tools that don’t fit how people actually work.

“BI tools are still the same ones we were using 15, 20 years ago. They’re expensive, and all you’re really paying for is to see sales numbers sent around the company. It’s not driving decisions.”

That willingness to say the quiet part out loud, and back it with solutions, makes Ollie an important voice for data leaders rethinking their approach.


Why This Challenge Matters Now

Cloud data platforms, pipelines, and governance tools have all seen dramatic innovation. BI hasn’t. It’s still read-only dashboards that require endless interpretation elsewhere. Meanwhile, every other tool we use, from marketing platforms to document editors, has become collaborative, flexible, and iterative.

The result? Data teams are under pressure to deliver value but stuck with outdated paradigms. Many end up in what Ollie calls the “report factory”: churning out dashboards that confuse more than they clarify.

A common misconception is that more speed solves the problem. Leaders throw AI at reporting in the hope that faster answers mean better decisions. In reality, Ollie argues, the bottleneck isn't producing numbers; it's helping humans interpret them and agree on what action to take.

This is why now is the moment to rethink BI. As companies adopt AI at pace, trust, clarity, and prioritisation become the real levers of success.


How to Escape the Service Trap and Drive Real Decisions

1. Recognise That BI Hasn’t Evolved

Most of the modern stack has innovated; BI hasn't. Dashboards may be prettier, and integrations with tools like dbt smoother, but the core interaction hasn't changed.

“You read a dashboard, you discuss it somewhere else, and you try to work out what’s going on. That paradigm is still the same as 2005.”

Implementation guidance:

  • What to do first: Audit your BI output. How many dashboards exist? How many are actively used?
  • Tools/structures: Usage analytics inside your BI tool can show adoption and engagement.
  • Watch-outs: Don’t assume integration features equal innovation; the form factor is what matters.
  • Expected benefit: Clear view of which reports genuinely support decision-making.
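The audit step above can be sketched in a few lines. This assumes a hypothetical export of view data from your BI tool's usage analytics; the field names and the 90-day staleness cutoff are illustrative, not a standard:

```python
from datetime import date, timedelta

# Hypothetical export from a BI tool's usage analytics:
# one record per dashboard with last-viewed date and 30-day view count.
dashboards = [
    {"name": "Sales overview", "last_viewed": date(2024, 6, 1), "views_30d": 240},
    {"name": "Churn deep-dive", "last_viewed": date(2023, 11, 3), "views_30d": 0},
    {"name": "Ops daily", "last_viewed": date(2024, 5, 28), "views_30d": 12},
]

def audit(dashboards, today, stale_after_days=90):
    """Split dashboards into actively used vs stale."""
    cutoff = today - timedelta(days=stale_after_days)
    active = [d for d in dashboards
              if d["last_viewed"] >= cutoff and d["views_30d"] > 0]
    stale = [d for d in dashboards if d not in active]
    return active, stale

active, stale = audit(dashboards, today=date(2024, 6, 10))
print(f"{len(active)} active, {len(stale)} stale of {len(dashboards)} total")
```

Even a rough split like this gives you the "clear view" Ollie describes: which reports are genuinely read, and which are candidates for retirement.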

2. Avoid the Service Trap

Many data teams confuse activity with impact. They answer every request, build dashboards for every stakeholder, and believe they’re adding value. In reality, they’re generating information overload.

“Just doing what the business asks of you floods the company with chaos. If the business is asking stupid questions, the data team is going to be producing stupid answers.”

Implementation guidance:

  • What to do first: Count the number of dashboards per employee. If it’s high, you’re in the trap.
  • Tools/structures: Introduce a request filter or impact sizing model.
  • Watch-outs: Saying “yes” to everything positions your team as a service desk, not a strategic partner.
  • Expected benefit: Fewer but higher-value outputs, stronger alignment with the business.
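One way to make the request filter concrete is a minimal impact-sizing score. The fields and weighting below are illustrative assumptions, not Ollie's or Count's model; the point is simply to rank requests before saying yes:

```python
def impact_score(request):
    """Rough impact sizing: reach times decision value, divided by effort.

    Fields and weighting are illustrative; tune them to your own business.
    """
    reach = request["people_affected"]         # how many stakeholders it serves
    value = request["decision_value"]          # 1-5: does it change a real decision?
    effort = max(request["effort_days"], 0.5)  # avoid dividing by near-zero effort
    return (reach * value) / effort

requests = [
    {"name": "New exec revenue view", "people_affected": 10,
     "decision_value": 5, "effort_days": 5},
    {"name": "One-off vanity chart", "people_affected": 1,
     "decision_value": 1, "effort_days": 2},
]

for r in sorted(requests, key=impact_score, reverse=True):
    print(f"{r['name']}: {impact_score(r):.1f}")
```

A filter this simple is enough to turn "we do what we're asked" into a defensible queue, and to make the "no" conversations less personal.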

3. Build Trust Beyond Accuracy

Most data leaders obsess over accuracy. But Ollie warns: accuracy alone doesn’t create trust.

Imagine telling the CEO: “Our regression says move all marketing spend from Channel A to Channel B.” Even if it’s correct, they won’t act on it unless they understand how you got there.

“Trust comes from methodology, transparency, and track record, not just from being right.”

Implementation guidance:

  • What to do first: Make “show your working” a standard practice for every analysis.
  • Tools/structures: Adopt lightweight documentation or visual lineage tools to explain methodology.
  • Watch-outs: Overcomplicating explanations can backfire; clarity beats detail.
  • Expected benefit: Executives who feel confident enough in the process to act on insights.

4. Prioritise What Really Matters

Ollie’s strongest advice: not all requests are equal. Some will change the trajectory of the business; most won’t.

“If you’ve solved the most important problem the business has today and the CEO recognises that, you’ll be remembered for it. That’s what matters.”

Implementation guidance:

  • What to do first: Track where your team’s time goes: maintenance vs problem-solving.
  • Tools/structures: Create a simple payroll allocation dashboard – % time on top 3 business priorities.
  • Watch-outs: Saying “no” is hard, but without it your team’s value will always be capped.
  • Expected benefit: Senior leaders see the team as solving the biggest problems, not just keeping the lights on.
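The "% time on top 3 priorities" figure is easy to compute from whatever time tracking you already have. A minimal sketch, assuming hypothetical timesheet entries tagged by priority:

```python
# Hypothetical timesheet entries: hours logged per piece of work,
# tagged "top3" if it serves one of the top 3 business priorities.
entries = [
    {"task": "Churn model", "priority": "top3", "hours": 30},
    {"task": "Dashboard maintenance", "priority": "other", "hours": 50},
    {"task": "Pricing analysis", "priority": "top3", "hours": 20},
]

total = sum(e["hours"] for e in entries)
top3 = sum(e["hours"] for e in entries if e["priority"] == "top3")
print(f"{top3 / total:.0%} of team time on the top 3 business priorities")
```

Tracking that single percentage over time shows whether capacity is actually shifting from maintenance to problem-solving.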

5. Create Operational Clarity, Not More Dashboards

In a world where every SaaS product spits out metrics, the role of the data team is to simplify, not add noise. Ollie calls this “operational clarity.”

“The job is to show the forest, not just the branches. Visualise the business, make it feel simple, align everyone on what matters.”

Implementation guidance:

  • What to do first: Build a single-page growth model that shows how key metrics relate.
  • Tools/structures: Collaborative BI tools (like canvas-based environments) that allow business and data teams to work together.
  • Watch-outs: Don’t replicate existing reports; focus on connections, not duplication.
  • Expected benefit: A business that understands itself better, asks better questions, and makes clearer decisions.

Putting it All Together

The path forward isn’t more dashboards or faster charts. It’s about shifting from outputs to outcomes.

A realistic sequence:

  1. Audit existing dashboards and usage
  2. Filter requests through impact sizing
  3. Make transparency part of delivery
  4. Redirect at least 50% of team capacity to the biggest problems
  5. Replace scattered reporting with a unifying model of the business

Signals of success: fewer but more impactful outputs, leaders asking sharper questions, and a measurable increase in trust in data-driven decisions.


Real-World Impact

Count’s customers have already applied this model. By shifting away from dashboard churn, they’ve reduced noise, improved decision-making speed, and redefined how business and data teams work together.

The result isn’t just cost savings. It’s a cultural shift: data teams that no longer see themselves as report writers, but as strategic partners shaping the direction of the business.


Your Next Move

The lesson from Ollie Hughes is simple: stop measuring success by the number of dashboards you ship. Measure it by the clarity and decisions you enable.

Focus your team’s time on the most important problems, show your working, and embrace operational clarity. That’s how data leaders can turn BI from the lowest ROI into one of the highest.

🎙️ Listen to the full conversation with Ollie Hughes on the Data Matas podcast for more actionable insights.



How Hypebeast Reached 97% AI Adoption Without Fear or Layoffs

At Hypebeast, 97% of staff now use AI daily, not out of fear but by choice. Director of Data & AI, Sami Rahman, reframed AI as a creative ally, not a threat. By focusing on practical wins, like speeding up research and cutting drudgery, he built trust and curiosity. Instead of pushing tools, he created demand through scarcity, measured impact rigorously, and deleted underused agents without sentiment. The result: adoption that stuck, creativity that flourished, and teams that saw AI as empowerment, not replacement. A playbook for leaders who want AI adoption that lasts, built on trust rather than hype.

AI adoption is at the top of every data leader’s agenda. Yet most attempts stall. Leaders flood their teams with new tools, staff get overwhelmed, and adoption drops. In some cases, AI is seen as a threat rather than an enabler.

At Hypebeast, it’s different. 97% of staff now use AI agents in their daily work. Not because they were forced, but because they wanted to.

In this episode of Data Matas, I spoke with Sami Rahman, Director of Data & AI at Hypebeast, about how he made AI adoption stick. His story offers grounded lessons for any data leader trying to balance hype with reality.


Meet Sami Rahman

Sami leads Data & AI at Hypebeast, the global fashion and lifestyle brand. His career spans psychology, counter-terrorism, and data science — giving him a rare perspective on how people, systems, and trust interact.

That unusual career path means he doesn’t just see AI as technology. He sees it as part of a wider human system — where behaviour, incentives, and culture matter just as much as code.

“We’re a creative company. We don’t want to replace journalists or designers. But we can use AI as a force multiplier — speeding up research, consolidating information, and helping people make decisions faster.”

That human-first but pragmatic outlook shaped every decision in Hypebeast’s adoption journey.


Why Most AI Adoption Efforts Fail

AI is no longer optional. Boards and executives expect adoption. But many teams fail to deliver value. Why?

Sami points to fear — not of the technology, but of being abandoned.

“The reason people are fearful around AI isn’t the tech. It’s because they don’t trust governments or institutions to look after them if jobs disappear.”

This distinction matters. If AI is framed as replacement, staff feel threatened. If it’s framed as empowerment, they engage.

Psychology backs this up. People resist change when they fear loss of control or status. The solution isn’t just better tech — it’s better framing. Data leaders need to talk about AI as a tool that supports their teams’ value, not one that makes them redundant.


Lessons from Hypebeast’s Adoption Journey

1. Frame AI as a Force Multiplier

At Hypebeast, AI is not a substitute for creativity. It’s an assistant. Tasks like research, summarisation, and trend monitoring were made faster and easier, while final judgement stayed human.

“We’re not trying to replace jobs. We might automate manual tasks, but we won’t remove the human side.”

This framing reassured staff that their value remained central — and made AI a welcome tool, not a competitor.

Implementation guidance:

  • What to do first: Communicate clearly that AI enhances, not replaces.
  • Tools: Introduce AI where speed and consolidation matter (e.g. research, summarisation).
  • Watch-outs: Don’t oversell — focus on real, modest gains.
  • Benefit: Higher trust and willingness to experiment.

2. Focus on “Unsexy” Use Cases

Flashy AI demos rarely translate into real value. Sami leaned into the unglamorous but high-impact tasks: scanning social feeds, packaging intelligence, automating logistics and finance.

“We leaned into use cases that aren’t super sexy but free up time.”

By cutting drudgery, staff had more time for meaningful creative work.

Implementation guidance:

  • What to do first: Audit manual processes that drain time.
  • Tools: Simple AI agents for monitoring, reporting, logistics.
  • Watch-outs: Avoid over-investing in “showcase” projects.
  • Benefit: Faster results, more trust in AI.

3. Create a Curiosity Gap

Perhaps the boldest move was delaying access. For 10 weeks, Sami drip-fed teasers: short demos showing what AI could do.

“It was 10 weeks before we gave anyone access — on purpose… By the launch, adoption went from 3% to 95% in three weeks.”

Scarcity created FOMO. Instead of pushing adoption, Hypebeast created pull.

This contrasts with most change management programmes, which over-prepare staff with slide decks, training, and handholding. Sami flipped the playbook — and in his industry, it worked.

He’s quick to note it wouldn’t fit everywhere. In banking or pharma, where regulation and compliance demand rigour, leaders may need a heavier hand. But in fast-moving creative industries, curiosity was the lever.

Implementation guidance:

  • What to do first: Hold back, and drip-feed examples.
  • Tools: Short demo videos or snippets to spark curiosity.
  • Watch-outs: Don’t launch before demand builds.
  • Benefit: Rapid, voluntary adoption.

4. Kill Zombie Agents Without Sentimentality

Hypebeast set strict benchmarks: daily or weekly agents had to hit 80% usage. If they weren’t used, they were deleted.

“If an agent isn’t used, we delete it. No sentimentality. It’s not failure — it’s iteration.”

That unsentimental approach kept adoption high and avoided wasted energy.

This is another overlooked lesson. Too many teams keep “zombie tools” alive because someone invested time or money. Sami’s product mindset — measure, test, delete — freed his team to focus only on what added value.

Implementation guidance:

  • What to do first: Define usage thresholds per agent type.
  • Tools: Usage dashboards, adoption metrics.
  • Watch-outs: Don’t cling to underused tools.
  • Benefit: Consistently high adoption, leaner portfolio.
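The 80% benchmark can be expressed as a simple retention rule. The agent names and registry shape below are hypothetical; only the threshold comes from the article:

```python
# Hypothetical agent registry: expected cadence and observed usage rate.
agents = {
    "daily-trend-digest": {"cadence": "daily", "usage_rate": 0.92},
    "weekly-logistics": {"cadence": "weekly", "usage_rate": 0.41},
}

THRESHOLD = 0.80  # daily/weekly agents must hit 80% usage (per the article)

def triage(agents, threshold=THRESHOLD):
    """Split agents into keep vs delete based on the usage threshold."""
    keep, delete = {}, {}
    for name, stats in agents.items():
        (keep if stats["usage_rate"] >= threshold else delete)[name] = stats
    return keep, delete

keep, delete = triage(agents)
print("keep:", sorted(keep))
print("delete:", sorted(delete))
```

Encoding the rule as a mechanical check is what makes the "no sentimentality" part easy: the decision is made by the threshold, not by whoever built the agent.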

Measuring Adoption: Beyond Usage

Usage was Sami’s primary metric, but adoption measurement can and should go deeper. Data leaders can track:

  • Time saved on manual tasks
  • User satisfaction with AI support
  • Error reduction in workflows
  • Frequency of repeat use across teams

By triangulating usage with impact, leaders can see not just whether AI is being used — but whether it’s making a difference.
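Triangulation can be as simple as a weighted composite of those signals. The metrics, scales, and weights below are illustrative assumptions, not a standard formula:

```python
# Illustrative adoption signals for one agent, each normalised to 0-1.
signals = {
    "usage_rate": 0.90,        # share of staff using the agent
    "time_saved_hours": 0.75,  # time saved vs a manual baseline, normalised
    "satisfaction": 0.80,      # survey score scaled to 0-1
    "error_reduction": 0.60,   # drop in workflow errors, normalised
}

# Weights are assumptions: usage matters most, but impact signals count too.
weights = {"usage_rate": 0.4, "time_saved_hours": 0.3,
           "satisfaction": 0.2, "error_reduction": 0.1}

score = sum(signals[k] * weights[k] for k in weights)
print(f"composite adoption score: {score:.2f}")
```

A single blended score like this is crude, but it answers the executive question directly: not just "is it used?" but "is it helping?"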

This level of measurement is critical for building trust with executives and avoiding the “we bought AI, but what did it achieve?” backlash.


Real-World Impact

Hypebeast reached 97% adoption within three weeks of launch. Staff now use AI agents daily across journalism, retail, logistics, and design.

The before state was one of downsizing, high pressure, and manual workloads. The after state is a team with more time for creativity, backed by systems that take care of drudgery.

Instead of creating fear or resistance, the approach built curiosity and trust. AI is now embedded in workflows, freeing staff to focus on creative and strategic tasks.


Putting It All Together

Hypebeast’s success was not built on hype or heavy-handed change management. It came from reframing AI as support, solving practical problems, creating demand through scarcity, and cutting what didn’t work.

The results speak for themselves: adoption consistently above 90%, and a team that sees AI as an enabler, not a threat.

For data leaders, the playbook is clear:

  • Start with framing — AI enhances, it doesn’t replace
  • Automate boring tasks first
  • Create demand with curiosity, not forced adoption
  • Be ruthless with underperforming tools

Your Next Move: A Leader’s Checklist

Before your next AI rollout, ask yourself:

  1. Have I made it clear AI is here to support, not replace?
  2. Am I solving everyday pain points first, not chasing flashy demos?
  3. Can I create demand by showing value before rolling out access?
  4. Do I have clear benchmarks for adoption and usage?
  5. Am I willing to cut what doesn’t deliver?

Tick those five boxes, and you’ll be far closer to adoption that actually sticks.


Dig Deeper

This article is based on Data Matas Episode [X] with Sami Rahman, Director of Data & AI at Hypebeast.

📺 Watch the full conversation: https://www.youtube.com/@matatika
🎙️ Listen to the podcast: https://www.matatika.com/podcasts/