Why do some data projects fail to deliver on their promise? According to David Hardoon, CEO of Aboitiz Data Innovation and former CDO of UnionBank, the problem lies in not looking at data projects through a business lens.

Join David’s lively conversation with Harbr co-founder Anthony Cosgrove as they discuss:

  • The shift in mindset required to see data and AI as business products that drive business results
  • The importance of starting with smaller, achievable projects that demonstrate value and gain user adoption
  • Why having a strong data foundation is essential for successful AI implementation

Transcript

Anthony Cosgrove: Hi, and welcome to episode five. I'm Anthony Cosgrove, the co-founder of Harbr. And today I'm joined by David Hardoon, who is the chief executive officer of Aboitiz Data Innovation. David, welcome to the podcast.

David Hardoon: Thank you very much for having me on.

Anthony: I know a bunch about Aboitiz Data Innovation — we've obviously been working together for a while, but it would be great to hear what you're doing because it's very unique. I think you guys were super early in seeing a bunch of opportunities. So tell us more.

David: First and foremost, thank you for the opportunity to talk about what we're doing. I have to give you a bit of a story about the group that we're operating in. And I promise you, this is not a marketing plug — it's just to give the context.

This group is a 150-year-old conglomerate, Philippines-based, even though we're actually headquartered in Singapore. It's very diversified. Financial services, the entire gamut. Energy business, distribution and generation. Everything under the sun, with the exception of nuclear. Agriculture, food, cement, construction, airports, Coca-Cola bottling. It's extensive.

We were born out of the question of, as the pressure of digitalization is surrounding everyone, are we maximizing the opportunities of data within the organization? I know this will sound corny, so I apologize for it, but it's really true — it's how do you make AI real?

When you talk to people out in the industry and you ask them, "Are you doing AI?" — almost universally, the response is yes. When you ask, "Is it in operations? Are you running the business on the back of its decisions? Is it productionalized?" Usually the answer falls to "Well, it's in the sandbox, it's a pilot, it's a proof of concept, it's a proof of value" — whatever label you would like.

David: That really shows you there's a bit of a gap between this whole plethora of ideas and actually getting them over the line. We were really born to use innovation, use data, use technology, use all the amazing stuff that's out there, but be very focused on how we make a difference in the business. Which ultimately is measured by top and bottom line. That's where we were born. And having done it now for a couple of years, we're also working with others in the industry, replicating the things that we've been doing for ourselves.

Anthony: Cool. So you've gone from cutting your teeth internally around how to make that impact on business processes, and now you're replicating the same thing with external customers, partners, suppliers, that kind of thing. Are those use cases typically cookie cutter, or are you taking the learnings and reconfiguring them, reworking them for those different environments?

David: That's a very good question. While ideally you would like to get to the cookie cutter, the reality when I think about data and processes is that there will always be a percentage of adaptation — anything from a small percentage to a substantive amount, depending upon the respective infrastructure — that needs to be done.

Let me give you a couple of examples. I usually get eyebrows raised: predicting the quality of shrimp, or compressive strength — literally dealing with cement — something that can result in substantive improvement in operational processes, literally reducing costs and, more importantly on the ESG side, quantifiably reducing 35 kilotons of CO2 emissions. Then energy, whether it's operations and predictive asset maintenance, through to the usual gamut of financing across the lifecycle and financial crimes prevention, just so we're all clear. It's about those things. And then more importantly, it's not just about doing the techie stuff. It's like I mentioned earlier — how do you create that pathway to execution and productionalization?

Anthony: Obviously here, the podcast is really about data products and the mindset around data products, which I think you're already sharing some of in terms of impacting the business, focusing on the value, trying to expand that as a business in and of itself. How do you think about data products? How do you approach data products? What's your take on that space? It's kind of hot right now. A lot of people trying to define it and grapple with it. What's your take?

David: I don't know if it's a subtle bias, or my perspective heavily influenced by “Pinky and the Brain” — you know, trying to take over the world — but to answer that question, I'm going to dive deeper into the underlying definition.

The way I think about a data product isn't actually a data product. It's a business product. And that's what I meant by the Pinky and the Brain reference — because for me, data is the business. It's usually not realized as such. When I say data is the business, people say "No, it's the business and it generates data." And I'm like, no, actually data is the business.

If you look at today, what really differentiates the winners and the accelerating folks is that they have made that realization and have truly intertwined the two: the output of a business being data, and the data representing the business, in terms of continuous online improvement.

So going back to your question — it's a business product that ultimately is used. Now from a techie perspective, naturally, you could have a data product, an actual data structure that's feeding into different facets of a business or different applications. I get you. And I know this may be a bit of semantics, but to me, it's about getting it to work, getting it to be used. We were joking offline that one of the realizations in the world of data — and I think this may be a shocker to some people out there — is that business doesn't care about data.

So you need to change the terminology and realize that no, this is a business product that's being fed and facilitated by underlying data. To me it's absolutely critical to have that perception, because then how you use it, how it gets operationalized, how you monetize it, how you measure it all becomes easier to a certain extent.

Anthony: So essentially the data is a critical component and dimension of it, but the whole thrust for you is that it comes from the business, it represents the business, and it goes back to the business to change it and deliver value?

David: It is the business.

Anthony: Love that. Amazing. When you think about the products you've built and you think about that concept of what it is — whether it's getting the right type of concrete or getting the best type of shrimp with particular inputs, outputs and conditions — what are the critical lessons you've learned along the way in terms of doing that well and getting good results?

David: There are many. One could literally write a book on those aspects. But if there are two critical takeaways, at least for me — and again, I don't want to sound corny or just regurgitate catchphrases — it's that the entire stack of data, whether it's a data model, an actual data structure, all the way to a data product, meaning let's say an application or AI — however you look at it, it truly is a journey.

What that means in terms of internalizing it is while you may be able to take some shortcuts or some steps to accelerate the journey, it has to be a journey. Let me make it more concrete. Let's say you're talking about the shrimp prediction or cross-sale or whatever. Usually when you have a conversation with a business, they'll say "Oh David, we want 95 percent accuracy." They will conceptualize some number as where they want to be.

Well, what's the baseline? Where are you at now? If you're at 80 percent now and you want to go to 90, fair enough. But it's not uncommon when someone says "Oh, we want to hit 90," but your baseline is 60 or 40. That's where, if I could speak to my younger self, I'd say "relax." Because however counterintuitive it may sound, it's also more believable, even though you may know you can hit 90. It's more realistic to operationalize.
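David's point about knowing your baseline before setting a target can be made concrete. A minimal sketch, with invented labels for illustration (the "good"/"bad" grades are hypothetical, not real production data): before promising "90 percent", measure what a trivial majority-class predictor already achieves, because that is the number a real model has to beat.

```python
# Hypothetical sketch: establish a baseline before committing to a target.
# The grades below are invented stand-ins for an existing manual process.
from collections import Counter

def majority_baseline_accuracy(labels):
    """Accuracy achieved by always predicting the most common class."""
    most_common_count = Counter(labels).most_common(1)[0][1]
    return most_common_count / len(labels)

grades = ["good"] * 7 + ["bad"] * 3
print(majority_baseline_accuracy(grades))  # 0.7 — the floor any model must clear
```

If the baseline is already 70 percent, a jump to 80 is a believable first step of the journey; promising 95 on day one is the thousand-kilometer-an-hour car.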

David: If I come to you saying I can give you a car that goes a hundred kilometers an hour today, and tomorrow say I can give you a car that can go a thousand kilometers an hour — are you going to drive it? You may say "Well, okay, that's really cool, but we're just going to watch someone else do it." If I say "No, I can give you something that can do 200, 250," you'll be like "Oh, awesome. Let's try it out."

Same thing — it's a journey, and we need to take that journey. Same thing with quality. You can't go from zero to hero. You actually need to sequence it. It's contextualization. The first word that popped into my head was conditioning, but I don't mean it in that sense. It's acclimatizing — the business acclimatizing, people acclimatizing to the possibilities that can come from it.

Anthony: Just on that lesson learned, it feels like there are two dimensions to that, right? There's what can I build and how good does it need to be — do I need perfection, do I need five nines, or do I need 20-30%? But I think it's probably the first time it's come up around actual user adoption — if that difference is substantial and you are effecting change in a business process, how much change can be absorbed and accepted and actually drive value?

David: I know it sounds counterintuitive. Like, what do you mean? If I give someone such a phenomenal result, they should be jumping for joy. Theoretically, yes. And of course you may have people who will be ready and able to adapt to it and accept it. But again, from a behavioral perspective, most of the time you'll be countered with disbelief.

So that's number one. And then the second one, especially with LLMs — in fact, a buddy of mine used an LLM to generate an image to show it. Look, we all like fancy toys and gizmos and gadgets, but it goes back to — do you need it? The LLM generated this funny picture of a beautiful silver sports car with a harvester on the front of it. And then the next picture is the tractor and you're like — the tractor works, maybe you need a slightly faster tractor. Maybe you don't need a sports car in front of it.

I am the first person to be the biggest advocate of new technologies and capabilities, new stuff and all that, but that's also another really critical aspect. I was talking to someone — I can't remember which part of the bank — and they literally had this realization of "Oh, we wanted this API real-time connectivity," and then realized they didn't need it in real time. So they have this real-time bus when reconciliation happens weekly. Again, it's really important to identify the opportunities for improvement, the opportunities for the new, versus just doing it for the sake of doing it.

Anthony: That idea around it being fit for purpose is a good one. I think it does bleed into your first point. Do you feel that there's almost quite a lot of noise and distraction in the system because we're getting so many new technologies coming through, and there's always a crowd asking "How should we use this?" and "What can we do there?" It's almost tech-first rather than "I need to solve these problems, what technology may have come along that actually helps to move the needle?"

David: Look, I want to say two things. When I was a young budding consultant, I remember a buddy of mine that was working in the government calls me up and says, "Hey David, do you mind coming and doing a demo for my director?" I'm like "Yeah, sure. Whatever you need. What do you want me to do?" He says "Oh, can you do forecasting?" I'm like "Yes, what would you like me to demo?" "Forecasting." "Yes, what would you like me to demo?"

That's another example of sometimes it's like "Oh, but we just have to use it." I'm sure you've been in conversations where people ask "How do we use our data?" or "How do we use AI?" What they're actually saying is "What are the opportunities that we're not currently maximizing on?" It isn't really “How do we use data?”, it's not really “How do we use AI?”

David: That's why I'm saying that to bridge that gap, unless you're really focusing on the more research and innovative side of data, it has to have that business lens. It has to have the usage lens of what to do with it. And the results are phenomenal.

Once you get that thing going — and I'm saying literally we have deployed, validated, running improvements. And I know if I'd said it on day one, the person would be like "David, excuse me, but that's just BS" — 500 percent improvement in terms of customer engagement. Predictive maintenance: the OEM for large equipment — this was boiler tubes in coal plants — usually gives you four to five days of early warning before failure. We took that four to five days and, using data models, data products, and then the business, pushed it all the way to 15 days.

These are significant things. The example I mentioned earlier on shrimp, on cement — I can keep going. The value is simply this: any organization that doesn't maximize the value of what they have access to is shortchanging themselves. However, the difference between conceptually thinking about it and truly using it is that shift in mindset of looking at it through the business lens. Look at the stuff that will ultimately make an underlying business impact.

If I quote Donald Rumsfeld — you know what you know, you know what you don't know, you don't know what you know, and you don't know what you don't know. You never start from you don't know what you don't know. You always start with what you know.

Anthony: That's a great point. Awesome. These lessons learned are super interesting. I guess I've always said you were ahead of the market in terms of driving AI and driving into productionized use cases that deliver business value. Where do you see this space heading? What's next? What's on the horizon?

David: That's a great question. I have this thing — learn from the past, leverage today, and shape the future. But we have to look at the foundational element of it. No data, no AI. It's as simple as that. So however difficult it may be — challenging technologically, and, to call a spade a spade, sometimes political — it is absolutely crucial and critical to unlock it. Because once you are able to unlock it, those downstream opportunities, and where I believe it's heading, can truly be facilitated.

Let me give you another example, since we are a conglomerate and look at different industries. One example in the Philippines is financial inclusion, financial sustainability, where you're trying to onboard people who are thin file, no file — in other words, they don't have a credit history, don't have a recorded financial background — and yet at the same time, they've been a customer of the power utility for years.

So hold on a second. You have someone who's been diligently paying their utility bills, but they cannot open a bank account or get a loan. Why don't I simply use this dimension of information in order to provide a service that previously I couldn't? You see, suddenly, these possibilities are truly phenomenal.
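The utility-bill idea lends itself to a simple illustration. A hedged sketch with invented field names and records (none of this is Aboitiz's or UnionBank's actual feature set): derive basic creditworthiness signals — tenure and on-time payment ratio — from a thin-file applicant's payment history.

```python
# Hypothetical sketch: turning utility payment history into credit features
# for a "thin file" applicant. Fields and records are invented for illustration.
from datetime import date

def payment_features(payments):
    """payments: list of (due_date, paid_date, amount) tuples."""
    months = len(payments)
    on_time = sum(1 for due, paid, _amount in payments if paid <= due)
    return {
        "tenure_months": months,
        "on_time_ratio": on_time / months if months else 0.0,
    }

history = [
    (date(2024, 1, 15), date(2024, 1, 10), 42.50),  # paid early
    (date(2024, 2, 15), date(2024, 2, 15), 41.00),  # paid on the due date
    (date(2024, 3, 15), date(2024, 3, 20), 43.75),  # paid late
]
print(payment_features(history))
```

In practice such features would feed a downstream scoring model; the point of the sketch is only that data the business already holds can stand in for a missing credit history.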

I'm a massive believer that the technological capabilities for doing it in a governed and secure manner are here. To me, that is a non-conversation anymore. So that's the foundation. And then going from that to where we're heading, it's exactly that. Even just to plug in gen AI — the point it sharpens is that it's not the tool, it's the outcome and the use of the tool that truly makes a fundamental difference.

What we're finding is, we started off talking about data models, but it's about the business models. Same with data monetization, which I actually don't believe in — I believe in business monetization, product monetization. In other words, it's not about what the data costs me, it's what the access to the data costs me, and then what the commercial agreement is on top of that arrangement.

David: It could be the exact same data which may result, as in my example earlier, in a financial lending opportunity. Maybe there it's a certain type of financial commercial relationship. But maybe the exact same data could also be used for something else, which will have a different commercial arrangement associated with it. The way we need to think about it and the way we're heading is business products because data is the business.

Anthony: Awesome. So to summarize — increasingly strong foundations where we're connecting data silos so that we can access as much of the data as we can in new and novel ways. And then this overlay of starting to think in a bit more of a sophisticated way. I think we'll have a bit of background noise because you've joined us from an airport lounge, which I really appreciate. But this slightly more sophisticated sense of where's the value, what does value look like? How do we capture it? How do we measure it? As opposed to this very simplistic "I create some data and I'm selling it to you for 10."

David: Let me give you another example. Usually when you walk into a traditional setup, the moment you bring up the term "Let's talk about a data sharing agreement," you will literally see the blood drain from everyone's faces. Because usually the catch-22 of a data sharing agreement is "What's the purpose?" "Well, we don't know what the purpose is — we want to see the data in order to identify it..."

But that's exactly what I mean — the anchoring is on the purpose. The purpose is providing a service. The purpose is providing a product. The purpose is providing a new capability. Data is simply a facilitation of it. Today we can provide access, whether it is actual sharing or simply integration, facilitation. As for the various means of creating this ecosystem of data — one has to realize that that is just that, a highway. What I mean is it shouldn't be the focus. It is actually almost a solved problem.

Why has generative AI suddenly ballooned and blown up like it has? Again, shocker — it has nothing to do with the AI. What made the difference is the interaction with the application. The ability to have a conversation where you go "Whoa, it's answering my questions — like, literally, we're having a conversation." Yes, there is AI. Yes, there's cool stuff beneath it. But let's be honest, that cool stuff has actually existed for a while. It's the ability to put it together.

Same thing now about data. If anything, we need to be focusing on how we put it together, so that that downstream application, the business product, has that same result of people going "Whoa, that is phenomenal. That is like, it's almost obvious and we should be doing that."

Anthony: Amazing. Great. Well, look, we're going to have to end it there sadly. Thank you so much for joining us. That's been a super interesting conversation and I'm always happy to do it again about shrimp-based data products, which is excellent. So if anybody's listening and wants to learn more about what David's up to, you can follow him on LinkedIn. Also check out aboitizdatainnovation.com. If you're building data products, a data marketplace or a data mesh, check out harbrdata.com. Thanks for listening.


Parlay — a data exchange platform built on Harbr and powered by AWS — enables the Aboitiz Group and its stakeholders to securely publish, share, and collaborate on data across ecosystems. Learn more in our case study.
