Over the past decade, companies have spent billions on data infrastructure. Petabyte-scale warehouses. Real-time pipelines. Machine learning (ML) platforms.
And yet, ask your operations lead why churn rose last week, and you’ll likely get three conflicting dashboards. Ask finance to reconcile performance across attribution systems, and you’ll hear, “It depends on who you ask.”
In a world drowning in dashboards, one truth keeps surfacing: Data isn’t the problem; product thinking is.
The quiet collapse of “data-as-a-service”
For years, data teams operated like internal consultancies: reactive, ticket-based, hero-driven. This “data-as-a-service” (DaaS) model was fine when data requests were small and the stakes were low. But as companies became “data-driven,” the model fractured under the weight of its own success.
Take Airbnb. Before the launch of its metrics platform, product, finance and ops teams each pulled their own versions of metrics like:
Nights booked
Active users
Available listings
Even simple KPIs varied by filters, sources and who was asking. In leadership reviews, different teams presented different numbers, resulting in arguments over whose metric was “correct” rather than what action to take.
These aren’t technology failures. They’re product failures.
The consequences
Data mistrust: Analysts are second-guessed. Dashboards are abandoned.
Human routers: Data scientists spend more time explaining discrepancies than producing insights.
Redundant pipelines: Engineers rebuild similar datasets across teams.
Decision drag: Leaders delay or ignore action because of inconsistent inputs.
Because data trust is a product problem, not a technical one
Most data leaders assume they have a data quality issue. Look closer, though, and you’ll find a data trust issue:
Your experimentation platform says a feature hurts retention, but product leaders don’t believe it.
Ops sees a dashboard that contradicts their lived experience.
Two teams use the same metric name, but different logic.
The pipelines are running. The SQL is sound. But no one trusts the outputs.
This is a product failure, not an engineering one, because the systems weren’t designed for usability, interpretability or decision-making.
Enter: The data product manager
A new role has emerged at top companies: the data product manager (DPM). Unlike generalist PMs, DPMs operate across brittle, invisible, cross-functional terrain. Their job isn’t to ship dashboards. It’s to ensure the right people have the right insight at the right time to make a decision.
But DPMs don’t stop at piping data into dashboards or curating tables. The best ones go further: They ask, “Is this actually helping someone do their job better?” They define success not in terms of outputs, but outcomes. Not “Was this shipped?” but “Did this materially improve someone’s workflow or decision quality?”
In practice, this means:
Don’t just define users; observe them. Ask how they think the product works. Sit beside them. Your job isn’t to ship a dataset; it’s to make your customer more effective. That means deeply understanding how the product fits into the real-world context of their work.
Own canonical metrics and treat them like APIs: versioned, documented, governed. Ensure they’re tied to consequential decisions like $10 million budget unlocks or go/no-go product launches.
Build internal interfaces, such as feature stores and clean room APIs, not as infrastructure but as real products with contracts, SLAs, users and feedback loops.
Say no to projects that feel sophisticated but don’t matter. A data pipeline that no team uses is technical debt, not progress.
Design for durability. Many data products fail not from bad modeling, but from brittle systems: undocumented logic, flaky pipelines, shadow ownership. Build with the assumption that your future self, or your replacement, will thank you.
Solve horizontally. Unlike domain-specific PMs, DPMs must constantly zoom out. One team’s lifetime value (LTV) logic is another team’s budget input. A seemingly minor metric update can have second-order consequences across marketing, finance and operations. Stewarding that complexity is the job.
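What "treating metrics like APIs" can look like in practice: a minimal sketch of a versioned metric registry, in Python. This is an illustrative design under stated assumptions, not any company's actual implementation; the class and metric names (MetricRegistry, nights_booked, the core-metrics team) are hypothetical.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class MetricDefinition:
    """A canonical metric treated like an API: versioned, documented, owned."""
    name: str         # stable identifier, e.g. "nights_booked"
    version: str      # bumped on any logic change, like an API version
    owner: str        # an accountable team, not an individual hero
    description: str  # what the metric means in business terms
    sql: str          # the single source-of-truth logic


class MetricRegistry:
    """One place every team resolves metrics from, so the same name
    means the same thing in product, finance and ops dashboards."""

    def __init__(self):
        self._metrics = {}

    def register(self, metric: MetricDefinition) -> None:
        key = (metric.name, metric.version)
        if key in self._metrics:
            # Published versions are immutable; changed logic needs a new version.
            raise ValueError(f"{metric.name} v{metric.version} already registered")
        self._metrics[key] = metric

    def resolve(self, name: str, version: str) -> MetricDefinition:
        # Callers pin a version explicitly, so a silent logic change
        # can never quietly alter last quarter's numbers.
        return self._metrics[(name, version)]


registry = MetricRegistry()
registry.register(MetricDefinition(
    name="nights_booked",
    version="2.0",
    owner="core-metrics",
    description="Confirmed nights across all listing types, net of cancellations.",
    sql="SELECT SUM(nights) FROM bookings WHERE status = 'confirmed'",
))

metric = registry.resolve("nights_booked", "2.0")
print(metric.owner)  # core-metrics
```

The point of the sketch is the contract, not the code: a metric has exactly one owner, one documented definition and one logic per version, and consumers pin the version they depend on.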
Inside these companies, DPMs are quietly redefining how internal data systems are built, governed and adopted. They aren’t there to clean data. They’re there to make organizations believe in it again.
Why it took so long
For years, we mistook activity for progress. Data engineers built pipelines. Scientists built models. Analysts built dashboards. But no one asked: “Will this insight actually change a business decision?” Or worse: We asked, but no one owned the answer.
Because executive decisions are now data-mediated
In today’s enterprise, nearly every major decision (budget shifts, new launches, org restructures) passes through a data layer first. But these layers are often unowned:
The metric version used last quarter has changed, but no one knows when or why.
Experimentation logic differs across teams.
Attribution models contradict one another, each with plausible logic.
DPMs don’t own the decision; they own the interface that makes the decision legible.
DPMs ensure that metrics are interpretable, assumptions are explicit and tools are aligned to real workflows. Without them, decision paralysis becomes the norm.
Why this role will accelerate in the AI era
AI won’t replace DPMs. It will make them essential:
80% of AI project effort still goes to data readiness (Forrester).
As large language models (LLMs) scale, the cost of garbage inputs compounds. AI doesn’t fix bad data; it amplifies it.
Regulatory pressure (the EU AI Act, the California Consumer Privacy Act) is pushing organizations to treat internal data systems with product rigor.
DPMs aren’t traffic coordinators. They’re the architects of trust, interpretability and responsible AI foundations.
So what now?
If you’re a CPO, CTO or head of data, ask:
Who owns the data systems that power our biggest decisions?
Are our internal APIs and metrics versioned, discoverable and governed?
Do we know which data products are adopted, and which are quietly undermining trust?
If you can’t answer clearly, you don’t need more dashboards.
You need a data product manager.
Seojoon Oh is an information product supervisor at Uber.

