Data Culture

Why "Being Data-Driven" Has Nothing to Do With BigQuery

[Image: an X over a BigQuery console]

The real reasons most companies fail to use data well — and what actually separates the ones that don't.


Here's a number that should give every executive pause: only 4% of companies fully capitalize on big data.

That stat comes from Bain's Global Head of Advanced Analytics, and it holds up when you look around. Despite billions poured into data warehouses, BI tools, and data science hires, the overwhelming majority of organizations are not actually data-driven. They are data-adjacent — they have data, they talk about data, they budget for data. But they don't use it to make meaningfully better decisions.

Meanwhile, "big data" search interest peaked around 2017 and "data science" hit its apex around 2020. Both are now on a slow decline on Google Trends. Gartner's hype cycle describes this trajectory for virtually every emerging technology: we climb a peak of inflated expectations and slide into a trough of disillusionment. Data is no different.

The question isn't whether data is valuable — it clearly is. The question is: why do so few organizations actually extract that value?

The answer has almost nothing to do with which tools they're using.


The Symptoms We All Recognize

Spend enough time in the data space and you start to notice patterns. Ninety-four percent of companies say data is essential to growth. Yet 63% can't gather insights in the time required to act on them. Nearly 40% of business domain experts — the people who are supposed to use the data — can't even define what "data-driven" means.

The data team and the strategy team operate in parallel universes, producing outputs that rarely influence each other. Dashboards get built, then ignored. Analysis gets delivered, then filed away. A data scientist spends three weeks on a model, and the decision it was meant to inform has already been made.

This is not a technology problem. These organizations often have excellent technology.


Five Reasons Data Fails (and None of Them Are About SQL)

When researchers actually ask companies why their data initiatives underperform, the answers are remarkably consistent — and remarkably human.

Culture comes first. Thirty percent of companies cite corporate culture — specifically, an organizational unwillingness to change — as a primary challenge. This isn't about data literacy. It's about whether leadership actually believes in making decisions differently than they have before. If the CEO has been running on gut instinct for 20 years and it's worked, they're not going to pivot because an analyst built a nice chart.

Training and adoption follow closely. Twenty-six percent of organizations are concerned about the quality of training and rate of adoption for their internal data products. You can build the most beautiful dashboard in the world, and if the person who's supposed to use it doesn't understand what it's measuring or trust that it's accurate, it will collect dust.

Unrealistic expectations compound everything. Twenty-five percent of companies cite time constraints from expecting too much, too fast. Data infrastructure is slow to build well. Hiring takes time. Instrumentation takes time. Trustworthy analysis takes time. When leadership expects a data team to deliver meaningful insights within 90 days of being stood up, they're setting everyone up for failure.

Business/IT misalignment is a quiet killer. Twenty-two percent point to a disconnect between business needs and technical solutions. Engineers build systems optimized for performance and reliability. Analysts need systems optimized for queryability. Operators need outputs optimized for decision-making. When these groups don't communicate deeply, the product of one rarely serves the needs of the others.

Lack of technical expertise plays a role, too. Thirty-two percent say they can't extract value because they lack the right technical skills. But notice: this is the only item on the list that's actually about technology. And yet it's rarely the root cause — it's usually a symptom of the other four.


What "Data-Driven" Actually Requires

If it's not about tools, what does being data-driven actually take? Four things — in order of importance.

People first. You need analysts who can translate data into business language and strategists who know how to ask the right questions. A warehouse full of data is inert without someone who can interpret it, contextualize it, and communicate it to a decision-maker. The best data teams are part translator, part detective.

Process second. Data-driven organizations have consistent workflows for producing, reviewing, and acting on insights. Not one-off analyses delivered in an email on a Friday afternoon. Not a Slack message with a screenshot of a chart. Structured cadences: weekly metric reviews, monthly retrospectives, quarterly planning grounded in data. Decisions don't just get made — they get made in reference to a shared body of evidence.

Executive support third. This one is non-negotiable. Leaders who model data use — who ask "what does the data say?" in meetings, who hold teams accountable to metrics, who don't override analysis with intuition whenever it's inconvenient — create organizations where data actually influences behavior. Leaders who don't do these things create organizations where the data team eventually stops trying.

Infrastructure last. Yes, tools matter. BigQuery, Redshift, Amplitude, dbt — these are excellent products that unlock real capability. But they're the last piece, not the first. A company with great culture, clear process, strong leadership support, and mediocre tooling will outperform a company with the opposite configuration every single time.


The Org Structure Question

Even organizations that get the above right face a structural challenge: where does the data team sit?

Three models dominate. In a centralized analytics team, all analysts report to a single function — often a Chief Data Officer or VP of Analytics. The advantage is consistency: metric definitions align, tools standardize, and the team develops shared expertise. The disadvantage is responsiveness — business units compete for analyst time, and analysts can become detached from the problems they're trying to solve.

In a decentralized model, analysts are embedded directly in business units — product, marketing, finance, growth. The advantage is intimacy with the domain. The disadvantage is fragmentation: you end up with five different definitions of "conversion" and three different sources of truth for revenue.

The hub-and-spoke model tries to thread the needle: a small central team sets standards and owns shared infrastructure, while dedicated analysts live inside business units. It's the most common structure at fast-growing startups and mid-size companies, and when it works, it works well. Sustaining it, though, requires ongoing investment in both coordination and communication.

The right model depends on your stage, your team size, and your strategic priorities. But the wrong model, held too rigidly, creates misalignment that no amount of tooling will fix.


The Real Benchmark

It's worth being explicit about what data-driven actually looks like — because the bar is often set in the wrong place.

An organization is data-driven when:

  • They have the talent, expertise, and willingness to create value from data

  • Their strategies are guided and measured by data

  • They have the discipline to test, learn, and adapt based on evidence

  • Data use has executive representation and active support

An organization is not data-driven just because:

  • They use BigQuery or Redshift

  • They've hired a CDO or CTO

  • They're a technology company

  • They have data scientists on the payroll

That last list describes inputs. The first list describes outcomes. Most organizations conflate them.


Start With the Mirror, Not the Tech Stack

The first question for any organization that wants to become data-driven isn't "what tool should we buy?" It's "how do we currently make decisions?"

That audit is uncomfortable. It requires honesty about how often decisions get made on instinct, politics, or inertia rather than evidence. But it's the only honest starting point, because every gap you identify is a place where better data practice could create real value.

From there, the path is straightforward even if it isn't easy: identify the decisions where data could have changed the outcome, work backward to what information you'd have needed, and build the capability — cultural, procedural, structural, technical — to produce that information routinely.

The tools will follow. But they are never, and have never been, where this starts.



Want to build the data strategy foundation that actually works? The Data Strategist course walks through every layer — from business ecosystem mapping to metrics design to organizational structure.
