Author: Anthony Cosgrove, Co-Founder
Date published: 16.12.2024

When I talk to data leaders about what’s most important to them, I get a fairly consistent set of answers: quality, speed, accessibility, security, ease of use. But the value of all of those things is predicated on one thing: speed of access.

Accelerating access to data leads to quicker iteration; faster, cheaper projects; better quality outcomes; and a host of other benefits. Here’s why I think speed of access is the single most important metric for every CDO.

But what about quality?

To draw reasonable conclusions from data, the data needs to be accurate. I get that. But just as in product management, when you work with data, you don’t want to wait until something is absolutely perfect before you start using it. I regularly speak to CDOs who feel compelled to get every single duck in a row before they begin to streamline data access.

That includes timeliness, structure, governance, cleanliness, update frequency, etc.

The thing is, solving those problems without first enabling rapid data access is a Sisyphean task. How do you know the structure and format are right? How do you know if the update frequency is appropriate? The answer won’t reveal itself through contemplation. It’s the data consumers who have the answers to those questions. And those answers will vary based on the use case.

This idea is not new; it’s the challenge every product manager faces. The difference is that we’ve taken much longer to get to grips with it when it comes to data. Really, we should just be applying product management principles, and the most fundamental of those is feedback. Those feedback loops cannot happen without first getting access to the data.

If you can’t solve the problem of data access, you miss out on getting business-critical feedback, the ability to work with new tools and technology, improvements to data quality and compliance, and much more.

Without rapid access to data, you get a host of problems:

  • Projects are too long and too expensive
  • The number of viable use cases drastically reduces
  • And by the way, the result probably won’t be perfect, because it hasn’t been forged in combination with the data consumer and their use cases. It’s been forged in isolation.

Yes, there will be obstacles

The challenges to getting access to the right data are well documented.

Data silos make it difficult to know what’s available, where it’s located, and how to get access to it.

Rapid access shouldn’t come at the expense of maintaining proper governance over data assets.

Security and compliance are more important than ever. Controlling who can access data, what they can do with it, and where (if anywhere) it can be moved is critical.

Data access: a chicken and egg problem

The CDOs I speak to tend to talk about looking for a fit between data and use case. The problem is, there won’t be a clear value proposition for a data product until the full cycle is complete.

Going back to the data quality issue: you won’t know the level of quality needed until you know your use cases, and you won’t know your use cases until you try to get value from the data.

So what I suggest is prioritizing data access first. That accelerates the process of getting a specific, prioritized list of the improvements needed around quality, timeliness, format, structure, location, update frequency, governance, and so on.

Once you’ve got that feedback, and you repeat the process across multiple use cases, you can then prioritize those use cases based on the cost of work and the value you get from it — a cost-benefit analysis that you couldn’t have done without this iterative process.

Solving these problems in isolation won’t work.

Get into the product mindset

Look at any good product team, and you’ll see consistent behavior: they don’t wait until they have a perfect product. They make a prototype, they get market feedback, they iterate.

For data products, accelerating data access is crucial for getting into this product mindset. You can’t develop data products with any decent cadence unless you have rapid data access.

And when you’ve developed a data product, part of what makes it a product is that it gives data consumers rapid access to the data. It’s a virtuous circle.

When data is productized, your downstream consumers will have quicker access to data.

It’s all about data consumers

It all starts with the data consumers. They’ll tell you how often they need the data, at what quality, in which format, where they want it, what the legal requirements are, and so on. These attributes define the specification. They’ll also tell you how valuable the data product is or isn’t, which in turn tells you about its economic viability.

Without rapid data access, none of these things are achievable. That’s why it should be the number one metric: everything else relates to it and depends on it.

Customer spotlight: Driving data access

Aboitiz Data Innovation (ADI) is the data science and artificial intelligence arm of Aboitiz Group, a conglomerate with revenues over $3.9 billion. In May 2022, ADI launched Parlay, a data exchange platform built on Harbr.

Getting data into the right hands was a key challenge for ADI’s CEO David Hardoon. Parlay allows users across the dozens of companies that make up the Aboitiz Group to collaborate on data projects and drive business value. Read more in our case study.
