30 March, 2026

Why Your Analytics Stack is Failing You (And What to Build Instead)


Most ecommerce brands running £2-20M in revenue are spending between £500 and £2,000 a month on analytics tools. Triple Whale, Polar Analytics, Lifetimely, Supermetrics, Northbeam, sometimes two or three of these at once. And yet when you ask a founder what their LTV is by acquisition channel, or what their true contribution margin was last month, they open a spreadsheet.


That's not a coincidence. It's a structural problem. And it's one that no amount of dashboard subscriptions will fix.


Over the past six months, we've spoken to dozens of DTC founders and Heads of Growth in the £2-20M GMV range. The conversations vary in detail, but the pattern is remarkably consistent: fragmented data, tools that don't quite work and a growing feeling that the analytics setup should be better than it is. This article breaks down what we've observed, why the standard approach keeps failing, and what the alternative looks like.

Five tools, five different numbers


Here's a scenario that will sound familiar to most operators reading this.


You check Shopify for yesterday's revenue. You check GA4 for traffic and conversion rate. You check Meta Ads Manager for spend and ROAS. You open Triple Whale (or Polar, or whatever you're using) for the "unified" view. And then you open a Google Sheet where you've been manually tracking your actual numbers because none of the tools quite agree with each other.


Revenue in Shopify doesn't match revenue in Triple Whale. GA4 is showing 40% of your traffic as "unassigned." Your attribution tool says Meta drove 200 orders but Shopify says 180. And your finance lead is pulling a completely separate set of numbers from Xero that don't reconcile with any of the marketing data.


This isn't an edge case. Of the 20-plus operators we spoke to, 89% described some version of this exact problem. The specific tools vary, but the outcome is the same: nobody fully trusts any single number and the team spends its time reconciling data rather than acting on it.


One brand we spoke to was paying £2,500 a month for Triple Whale and Polar Analytics combined, and the Head of Growth was still updating a manual Google Sheet every morning, the same one he'd been using for three and a half years. Another had a £150,000 discrepancy between Shopify and their aggregation tool that nobody could explain. A third was paying £1,000 a month for an analytics platform that couldn't correctly pull through their variable costs.


The cost of this isn't just the subscription fees. It's the 30-60 minutes a day the founder or Head of Growth spends pulling data instead of acting on it. It's the retention issue that builds for two weeks before anyone notices because the data to spot it doesn't exist in one place. It's the budget decision made on numbers that turn out to be wrong.


Why the tools aren't the problem


Here's the part that most people don't want to hear: the tools aren't bad. Triple Whale is a good product. Polar has useful features. Lifetimely does one thing well. They keep failing not because they're poorly built, but because they're trying to do two fundamentally different jobs at once, and doing neither properly as a result.


Every one of these platforms attempts to ingest your data, model it, and present it, all within a single, rigid product. They pull from the Shopify API, the Meta API, the Google Ads API, and they apply a standardised set of transformations to turn that raw data into something you can look at on a dashboard.


The problem is that standardised transformations don't work for non-standard businesses. And every business is non-standard.


Your shipping costs aren't a flat percentage, they vary by basket contents and destination. Your COGS aren't uniform, they depend on the product mix and your supplier terms. Your customer segments don't map cleanly to the categories the tool provides. You have a weekly sale that massively skews your daily metrics. You sell across Shopify and Amazon and the tool only handles one properly.
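To make "non-standard" concrete, here's a sketch of the kind of business-specific cost logic no templated tool could guess. Every SKU, rate, and rule below is invented for illustration, not taken from any real brand:

```python
# Hypothetical per-SKU costs and weights -- the kind of detail a
# standardised transformation flattens into a single percentage.
COGS = {"tee": 4.50, "hoodie": 12.00}      # £ per unit
WEIGHTS = {"tee": 0.2, "hoodie": 0.7}      # kg per unit

def shipping_cost(items, destination):
    """Shipping depends on basket weight AND destination, not a flat rate."""
    weight = sum(WEIGHTS[sku] * qty for sku, qty in items)
    base = 3.50 if destination == "UK" else 9.00
    surcharge = 1.50 if weight > 1.0 else 0.0  # heavy-parcel surcharge
    return base + surcharge

def contribution_margin(items, destination, revenue):
    cogs = sum(COGS[sku] * qty for sku, qty in items)
    return revenue - cogs - shipping_cost(items, destination)

order = [("tee", 2), ("hoodie", 1)]  # 1.1 kg, so the surcharge applies
print(contribution_margin(order, "UK", 55.0))  # 55 - 21 COGS - 5 shipping = 29.0
```

A flat "shipping = 6% of revenue" assumption would price this order's shipping at £3.30 instead of £5.00; multiply that error across thousands of orders and the margin number on the dashboard quietly drifts away from reality.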


When the reality of your business doesn't fit the template the tool provides, the data breaks. When the data breaks, you stop trusting it, and when you stop trusting it, you go back to the spreadsheet.


We've seen this cycle play out with almost every tool in the category. One founder told us he cancelled both Lifetimely and Polar after a few months because he "wasn't using them enough to justify the cost." Another described their analytics platform's output as data he "just didn't trust." These aren't bad operators, they're operators whose businesses outgrew the rigid template their tools imposed.


The missing layer


The reason this keeps happening is that these tools are presentation layers without a proper data foundation underneath.


Think about it this way. A dashboard is only as good as the data feeding it. If that data is pulled from five different APIs, transformed by a proprietary black-box system you can't inspect, and joined using logic that doesn't match your actual business, then the dashboard is just showing you wrong numbers in a prettier format than your spreadsheet.


What's missing is the data layer itself: a single, unified, tested, transparent foundation where all your data lives, joined and modelled to reflect how your specific business actually works.


This isn't a new concept in the broader data engineering world. It's how every serious data team at a larger company operates: raw data is ingested into a warehouse, transformed by modelling tools, tested for accuracy, and then served to whatever visualisation or analysis layer sits on top. The modelling layer is the product. The dashboard is just one way to consume it.
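That ingest-model-test-serve pattern can be sketched in a few lines. The schema and figures below are ours, purely for illustration, with an in-memory SQLite database standing in for the warehouse:

```python
import sqlite3

# Ingest: raw order data lands in the warehouse untouched.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE raw_orders (order_id TEXT, day TEXT, total_pence INTEGER, status TEXT)")
db.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?, ?)",
    [("o1", "2026-03-29", 4500, "paid"),
     ("o2", "2026-03-29", 3000, "refunded"),
     ("o3", "2026-03-30", 6000, "paid")],
)

# Model: revenue is defined ONCE (paid orders only), so every
# downstream dashboard or query sees the same number.
db.execute("""
    CREATE TABLE daily_revenue AS
    SELECT day, SUM(total_pence) AS revenue_pence
    FROM raw_orders
    WHERE status = 'paid'
    GROUP BY day
""")

# Test: the model must reconcile with the raw data before anything serves it.
modelled = db.execute("SELECT SUM(revenue_pence) FROM daily_revenue").fetchone()[0]
raw = db.execute("SELECT SUM(total_pence) FROM raw_orders WHERE status = 'paid'").fetchone()[0]
assert modelled == raw
print(modelled)  # 10500 -- one number, traceable back to its source rows
```

The point isn't the tooling (real teams use a warehouse like BigQuery or Snowflake and a modelling tool like dbt); it's the separation of concerns: the definition of revenue lives in one inspectable place, not inside five competing dashboards.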


In the ecommerce analytics space, almost nobody does this. The tools skip straight to the dashboard and hope the data sorts itself out underneath. When it doesn't (and it reliably doesn't for any business with even moderate complexity), the whole thing falls apart.


What it looks like when you build the data layer first


When the data layer is built properly, three things change immediately.


First, you get one number. Revenue is revenue. It's calculated once, from one source of truth (Shopify), with documented logic you can inspect. There's no more checking Shopify against GA4 against Triple Whale. The data has been ingested, joined, and reconciled in a warehouse. If a number looks wrong, you can trace exactly how it was calculated and find out why.


Second, the morning triage compresses from an hour to minutes. When data is pre-modelled and pre-tested, you don't need to pull it from five platforms. You open a dashboard that already has yesterday's performance, with statistical health indicators telling you whether a metric change is worth investigating or just normal day-to-day variance. The daily routine shifts from "gather the data" to "act on the data."
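One simple way such a health indicator can work (a sketch of the general idea, not a claim about any particular product) is to flag a daily metric only when it falls outside its recent normal range:

```python
from statistics import mean, stdev

def needs_attention(history, today, threshold=2.0):
    """Flag today's value only if it sits more than `threshold` standard
    deviations from the trailing mean -- likely a real change rather
    than ordinary day-to-day noise."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return today != mu
    return abs(today - mu) / sigma > threshold

# Illustrative trailing two weeks of daily revenue (£)
revenue_last_14_days = [980, 1020, 1010, 995, 1005, 990, 1015,
                        1000, 985, 1025, 1010, 995, 1005, 1000]

print(needs_attention(revenue_last_14_days, 1012))  # False: normal variance
print(needs_attention(revenue_last_14_days, 750))   # True: worth investigating
```

Even a crude check like this changes the morning routine: instead of eyeballing five dashboards for anything that "looks off," you scan a short list of metrics that have actually moved.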


Third, you can answer questions that were previously impossible. What's our LTV by acquisition channel? Which product categories have the strongest repeat purchase rate? What's our true contribution margin after COGS, fulfilment, and payment processing? These questions require joining data from multiple sources at the customer and order level. No single platform provides it out of the box, because the answer depends on your specific cost structure, channel mix, and business model. A properly modelled data layer makes these questions trivial.
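To show why these questions need customer-level joins, here's a minimal LTV-by-channel calculation. The customers, channels, and order values are invented; in practice this runs as a query over the modelled warehouse tables:

```python
from collections import defaultdict

# Illustrative modelled tables: each customer's first-touch acquisition
# channel, and every order joined back to its customer.
customers = {"c1": "meta", "c2": "google", "c3": "meta"}
orders = [("c1", 45.0), ("c1", 30.0), ("c2", 80.0), ("c3", 25.0), ("c3", 25.0)]

# Step 1: total lifetime revenue per customer.
revenue_by_customer = defaultdict(float)
for customer_id, value in orders:
    revenue_by_customer[customer_id] += value

# Step 2: average that over the customers each channel acquired.
totals, counts = defaultdict(float), defaultdict(int)
for customer_id, channel in customers.items():
    totals[channel] += revenue_by_customer[customer_id]
    counts[channel] += 1

ltv_by_channel = {ch: totals[ch] / counts[ch] for ch in totals}
print(ltv_by_channel)  # {'meta': 62.5, 'google': 80.0}
```

Neither number exists in any single platform's API: the channel lives in one system, the orders in another, and the join only happens once both sit in the same modelled layer.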


Crucially, the data layer supports multiple ways of accessing it: dashboards for daily triage, direct warehouse queries for custom analysis, data activation (pushing segments to Klaviyo, feeding LTV data into Meta), and increasingly, conversational AI agents that let anyone on the team ask a question in plain English and get an interpreted answer in seconds, without needing SQL or waiting for the one person who "knows the data."


The transparency question


One of the most overlooked aspects of the standard analytics tool approach is opacity. When Triple Whale or Polar shows you a number, can you inspect how that number was calculated? Can you see the SQL, the transformation logic, the assumptions? In almost every case, the answer is no. It's a black box. You either trust it or you don't, and as we've established, most operators end up not trusting it.


A proper data layer is fully transparent by design. Every model, every transformation, every test is visible and inspectable. If your LTV calculation doesn't look right, you can open the model and see exactly how it's derived, step by step. This transparency isn't just a nice-to-have. It's the mechanism by which trust is built. You trust the number because you can verify it, not because a SaaS dashboard told you to.


What this means in practice


We're not suggesting every ecommerce brand needs to hire a data engineer and build a warehouse from scratch. That's expensive, slow, and unnecessary for most teams.


What we are suggesting is that the data layer, the unified, tested, modelled foundation underneath your reporting, should be treated as the product, not an afterthought. Get that right, and the dashboards, the analysis, the AI tools, and whatever comes next all work properly because they're built on data you can trust.


Get it wrong, and you'll keep cycling through analytics tools every 12 months, wondering why none of them quite deliver what they promised.


The answer was never a better dashboard. It was always better data.


Ready to reclaim your time and get clear on growth?


Book a demo to see how Crux gives you complete performance visibility with health monitoring that tells you exactly what needs your attention.
