Value: why data products are worth the effort

08.02.2026 | 5 min Read
Tags: #data products, #measurable data value

Why data products are worth the effort – and how you measure the impact.

Why should you care about data products?

Because the alternative becomes costly in terms of time, friction and risk.

The pattern repeats itself: Team A delivers “some data”, Team B copies and adapts it (to be safe), Team C builds on the copy. The result is fragmentation: multiple variants of “the same truth”, more run cost, more reconciliations and lower trust.

In practice it quickly becomes a systemic problem:

  • Different definitions live side by side, with nobody knowing which one is “official”.
  • Change becomes dangerous, because nobody has an overview of who uses what.
  • Errors become silent, because discrepancies are discovered after the fact (often in a meeting, not in a pipeline).
  • Reuse does not happen, because the safest strategy becomes creating your own copy.

Data products are not a magic solution. But they give you a tool to make a few deliverables stable enough that others dare to build on them, and to make cost/benefit visible enough that you can prioritise.

Instead of many variants per team – one data product everyone can reuse.

Signs that you need a “product” and not more tables

Do you recognise several of these?

  • Answering “Where does the number come from?” takes longer than making the decision itself.
  • Two dashboards show different answers to the same question.
  • People say “we don’t dare use that table” (and create a copy).
  • Small changes upstream create large ripple effects downstream.
  • You have things you never dare to delete, because the consequences are unknown.

When costs are measurable, value should be too

Every data product has concrete costs: compute, storage, operations, governance, support and further development. So it is reasonable to be equally concrete about value, without overcomplicating things.

A practical approach is to talk about cost to serve in simple terms. The point is not to charge back every cost, but to keep products from living forever on autopilot (a rough sketch follows the list below):

  • Technical operations: execution, storage, observability, incident handling.
  • Coordination cost: reconciliation of concepts, clarifications, meetings, “can you explain this join?”.
  • Change cost: breaking changes, migrations, parallel interfaces, clean-up. Read more about what this costs in practice in chapter 8 on change and deprecation.
  • Risk cost: wrong decisions, compliance breaches, loss of trust (which causes people to stop reusing).
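
To make this tangible, here is a minimal Python sketch of how the four cost categories above could be totalled per product per month. The class name, field names and figures are hypothetical illustrations, not a prescribed model; a rough first estimate is usually enough to compare products.

```python
from dataclasses import dataclass


@dataclass
class CostToServe:
    """Rough monthly cost-to-serve estimate for one data product (all figures hypothetical)."""
    technical_ops: float   # execution, storage, observability, incident handling
    coordination: float    # reconciliation of concepts, clarifications, meetings
    change: float          # breaking changes, migrations, parallel interfaces, clean-up
    risk: float            # expected cost of wrong decisions, compliance breaches, lost trust

    def total(self) -> float:
        return self.technical_ops + self.coordination + self.change + self.risk


# Hypothetical example: coarse numbers are fine as long as they are comparable across products.
customer_360_core = CostToServe(technical_ops=4_000, coordination=2_500, change=1_500, risk=1_000)
print(f"Customer 360 – Core, estimated monthly cost to serve: {customer_360_core.total():.0f}")
```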

When you do not make value explicit, you get one of two unfortunate effects:

  1. Everything gets to live (because “maybe someone needs it”), or
  2. Everything gets measured and becomes noise (queries, number of tables, “popularity”), and you end up optimising for the wrong things.

The goal is not a perfect business case. The goal is a good enough decision basis for prioritisation and portfolio management.

Concrete value hypotheses and metrics (examples)

Below are examples of value hypotheses that are easy to explain, and metrics you can use without building a data warehouse to measure the data warehouse.

Data product | Value hypothesis | Metrics (1-3)
Forecast/features for planning | Better planning by having multiple teams build on the same features/model output over time | Planning variance; share of decisions using the product; time spent on clarifications
Orders/order lines (event logic) | Less reconciliation through one status model and one time logic | Discrepancy cases; time to period close; number of duplicate variants
Metrics layer (KPI) | Consistent KPIs across reports and teams | KPI conflicts; adoption in reports; time from change to updated consumption
Customer 360 – Core | More consistent customer view and less misuse | Time to customer lookup; discrepancy in customer figures; number of consumer environments
Consent foundation | Lower compliance risk and fewer campaign errors | Discrepancies/complaints; time from consent to effect; share of campaigns using the foundation
Product catalogue for analytics | Better assortment and pricing decisions | Coverage on key attributes; number of local mappings; time spent on data cleansing

Customer 360 – Core is the same example we used in the business canvas and MVDP. The metrics here connect directly to the value hypothesis in the canvas.
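
If you want to keep the value hypothesis and its metrics close to the product itself, one lightweight option is to store them as plain metadata next to the product definition. The structure below is only a sketch; the field names are illustrative, not a standard.

```python
# Hypothetical, minimal "value card" for one data product; field names are illustrative.
customer_360_core_value_card = {
    "product": "Customer 360 – Core",
    "value_hypothesis": "More consistent customer view and less misuse",
    "metrics": [
        "time to customer lookup",
        "discrepancy in customer figures",
        "number of consumer environments",
    ],
    "review_cadence": "monthly",
    "decision_rule": "If adoption is low over 90 days, consider downgrading to a component",
}
```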

Three types of signals are worth distinguishing:

  • Adoption (usage): who uses the product, and in which surfaces? (consumer environments, dashboards, jobs, API clients)
  • Impact (benefit): what improves when the product is used? (time saved, fewer discrepancies, higher accuracy)
  • Operating cost (run cost): what does it cost to uphold the promise? (incident rate, compute, support time)

You do not need all three from day 1. But if you never look at run cost, the portfolio quickly becomes a collection of “perpetual projects”.
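
If you do track all three, it helps to keep them apart so one strong signal cannot hide the other two. A minimal monthly snapshot per product could look like the sketch below; the field names are illustrative assumptions, not a required schema.

```python
from dataclasses import dataclass


@dataclass
class MonthlySignals:
    """One month of signals for one data product (illustrative fields, not a standard)."""
    # Adoption: who uses the product, and in which surfaces?
    consumer_environments: int
    dashboards_and_jobs: int
    # Impact: what improves when the product is used?
    hours_saved_on_clarifications: float
    discrepancy_cases: int
    # Run cost: what does it cost to uphold the promise?
    incidents: int
    compute_and_support_cost: float
```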

How to measure without overdoing it

  1. Choose one value hypothesis per product (in plain language, not consultant-speak).
  2. Choose 1-3 metrics that you can actually monitor monthly.
  3. Tie the measurement to a decision: “If X happens, we do Y.” (Example: if adoption is low over 90 days, consider downgrading the product to a component or adjusting the product surface; a sketch of such a rule follows below.)

Metrics that change slowly are fine — as long as they change in the right direction when you take action.
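
To make point 3 concrete, here is a small sketch of a measurement tied to a decision, assuming you record the number of consumers per month for each product. The function name, threshold and the 90-day rule of thumb are hypothetical illustrations.

```python
def review_action(monthly_consumers: list[int], threshold: int = 3) -> str:
    """Suggest an action based on the last three months of adoption (hypothetical rule)."""
    last_quarter = monthly_consumers[-3:]
    if len(last_quarter) == 3 and all(count < threshold for count in last_quarter):
        # Low adoption over roughly 90 days: question the product surface or the product status.
        return "Consider downgrading to a component, or adjust the product surface"
    return "Keep as a data product and keep monitoring"


# Hypothetical usage: consumer counts for the last six months.
print(review_action([5, 4, 2, 2, 1, 2]))
```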

Common traps

  • “Number of queries” as value: high query rate may mean value, or just poor modelling and a missing semantic layer.
  • “Number of tables” as progress: more engine room is not the same as more reuse.
  • Value without a link to a decision: you measure monthly, but nothing changes based on what you see. Then measurement is decoration.
  • “This is for everyone”: unclear customers give unclear priorities, which give unclear product surfaces, which give copies.

Make the value hypothesis clear enough that you can say no to things that do not help the customer, and yes to what makes the product more reusable over time.

The next question is: how do you share the product without consumers creating their own copies? See chapter 5 on interfaces and data contracts.




Magne Bakkeli

Magne Bakkeli is co-founder and senior advisor at Glitni. He has over 25 years of experience in data platforms, data governance and data architecture, and led the Data & Analytics team at PwC Consulting for 12 years. He has built and modernised data platforms across energy, FMCG, finance and media.