How do you build — and sell — a value proposition for data products? Join our conversation with Neil Forrest, a data product GTM leader at Equifax, as he shares his extensive experience from startups to large enterprises. Drawing from both worlds, he offers practical wisdom on product development, sales strategies, and organizational challenges in the evolving data landscape.
Anthony Cosgrove: Hi and welcome. This is episode eight of Data Product Mindset. I'm Anthony Cosgrove, the co-founder of Harbr, the private data marketplace platform. Today I'm joined by Neil Forrest who has worked across a range of tech and tech consulting companies in go-to-market functions and is now head of product-focused sales and consulting at Equifax.
Neil's approaching this from quite a different perspective where he's really focused on finding market fit, meeting specific customer needs, and how you go from talking about a data product to closing a sale and ultimately delivering value.
So Neil, why don't you tell us a bit more about what you're up to at Equifax?
Neil Forrest: I've been at Equifax just over six months now. I'm past the finding-my-feet period; the honeymoon's over, as they say.
It's been an interesting experience. Prior to Equifax, the majority of my career has been in the startup and scale-up environment.
Equifax is a much larger entity. We've got multiple new product initiatives we're bringing to market simultaneously. And obviously much bigger teams. Working in the startup world, if you wanted to have a meeting, you just get everyone to close their MacBooks for 30 seconds and huddle around. At larger companies, that level of intimate agility isn't possible. So I've been learning that it requires much more orchestration and building those internal relationships in order to make progress.
That said, the appetite to be agile and to think like a startup exists at Equifax, I'm relieved to say. So when I bring in more radical, startup-style notions of how we might go about things, they're well received. So that's good.
Anthony: With a big data business like Equifax, as you said, there are many data products and no doubt various kinds of derivatives or customizations or solutions happening for individual customers. So how do you think about data products? And I guess there's two parts to that question:
One is: How do you think about data products in the context of what they are?
But also, how do you think about data products in the context of somebody that has to then go and take those to market, work with customers, and help really to deliver value and change outcomes for them?
Neil: Let's start with the value. That customer problem, that customer pain point or perceived benefit that you think you've identified: can you quantify that value? In some cases, simply by providing a unique source of data, the value is evident. That's fine and clean, whatever the customer then goes on to do with it.
The customer controls the value proposition a little bit in that instance, whereas the more you curate the data, the more you actually contextualize it to those clear problem statements and customer pain points, the more of the value proposition you own. And that's been interesting to me in the startup world, because just providing a prediction, if we're talking AI and ML, or just providing a proprietary data set that you're able to differentiate from others, that's nice and easy and clean. The product team has a relatively straightforward experience, but you don't know how much value you're delivering on that end of the spectrum.
Whereas further up that value chain, the more you provide the solution as opposed to just a pure data service, you get your telemetry, you get your analytics, you can see what the customer's actually doing with it. Rather than making an estimation of value, you can quantify the value.
So that I think is an interesting aspect of it. In that latter case, you're creating a greater product, there's more to be done, there's more to be managed, but I think that's where companies are most successful. Just on that, I guess it's changing a little bit in terms of your world and the data marketplace and the advent of data clean rooms and these new ecosystems where companies can actually proffer up their data in a different way. For me, that still feels nascent. I think that's going to change the data product proposition because I don't think it's well understood by a lot of companies.
Anthony: So that journey you're talking about where you go from almost a static table or a piece of data that's hopefully differentiated, hopefully has some value associated with it — but what happens after that requires work from the customer so they have to process it, analyze it, join it, whatever it is they need to do to get to an outcome — versus going further up that value chain where you're now delivering something that's more tailored to their specific needs and you can be clearer around exactly what value you're delivering to them.
Do you find that there's a point where that kind of has to stop as well? Beyond a certain point, is it almost always very individual, or are you finding that you can actually get to a point where the majority of the market, let's say 50% or 80%, are happy with what you've provided and don't need any further customization? Because I think this is the challenge with data products, and I think a lot of people are wrestling with this — when does a product stop being a product and start being a custom solution? And how far up the chain can I go before that is superfluous?
Neil: Agreed. I mean, if you think about that notion of product market fit, one of the early signs that you have identified a customer problem and developed a solution that the market will accept — I'm not sure exactly what the academic perspective on this would be — but we would try and sell the same proposition three times. Don't sell three different versions of the proposition, but try as far as you can to find three customers who agree that yes, if you can provide this product or solution as you've described it, I would buy it. I believe that would solve my problem.
What's key in that, again learning firsthand and then having it reinforced by Geoffrey Moore's "Crossing the Chasm" and those points of reference, is your early adopting, innovative customers. You have to find the right personas there. You have to find those customers who are willing to accept the rough-around-the-edges initial deployment, the customer who's willing to accept the MVP and work with you to ideate it. That's key — if you pick the wrong customer, you can end up building this incredibly over-engineered solution to their requirements. Startups get really excited when they find their early adopting corporate partner, but if that company isn't the right fit, they need to have that fail-fast mentality too. Otherwise it can kill the business.
Anthony: They're probably violently agreeing with you at this point and wondering what you might share in terms of how to find that right kind of early development partner. Someone who doesn't want it fully engineered to the nth degree and perfect before they can comment on value or realize value, but rather someone who's more aligned to that more rough and ready "I'm happy to go on this journey" approach. What do you look for? How do you find that person?
Neil: The conversation needs to be candid and quite transparent. This is far from the finished article. Especially through the lens of my experience in machine learning and AI — can we predict X, can we optimize Y? We literally don't know. It would be foolish to say yes we can at this stage. It would depend on many things, including your data and so forth.
Having those sales conversations about a predictive solution or a deep learning solution where not a single line of code has been written yet — that's quite a tricky path to navigate. Customers and prospects would ask you, "How accurate will the model be?" for a model we haven't built yet. A smarter person than me taught me to say in that situation, "Well, how accurate does it need to be in order to deliver value for you?"
It's hard because that innovator, early adopter persona can sometimes be obsessed with the tech itself, the tech for its own sake. Never lose sight of customer value. What is the problem that we're looking to solve? What is the pain that we're looking to relieve or eradicate? Never lose sight of that in the conversations.
The executive agenda can be, "We should be doing AI, we should be doing machine learning. Let's incorporate that technology into our business," without necessarily understanding where it would be transformative. So again, you really have to make sure that the customer understands what they intend to do.
Anthony: Earlier on, before we hit record, we were talking about MEDDIC and some of the parallels between the sales pipeline and the shift-left movement in development where you try to get the hard, difficult things out of the way as early as you can through discovery, through conversations, through testing. So that you then have hopefully a smoother path to the end of the day — in the case of development, through to a production deployment of some code or product.
It feels like there's a bit of tension though, right? Because you're talking here about how we engage with these early adopters, being very transparent, going on a journey. Neither of us may necessarily be able to meet MEDDIC criteria, but then you've also got the tension that you do need to sell. Do you see those two things as working in parallel, or do you see it as there's a time for this more exploratory collaborative approach, and then once you've maybe sorted out those three customers and got them to a certain point, then you click into more of a standard sales process? How do you view those two things?
Neil: It's a good question. I guess for those who are less familiar, MEDDIC is an acronym for one of the many sales qualification frameworks: Metrics, Economic buyer, Decision criteria, Decision process, Identify pain, Champion. You're ensuring that you've identified a champion or a sponsor for a project, that there's some executive buy-in, and whether budget exists or not. It's keeping you honest that this deal can happen.
Is the sale of really nascent products compatible with that? I think you'd probably just change the weighting. You've got to find that desire first. You've got to find someone who is willing to go on that journey with you. Someone needs to pay for it, unless you're of a generous spirit and you want to run a free POC. But sometimes you need the validation more than anything to get to a working model or solution. You may need customer data; otherwise, it's all just theoretical. Is simply getting your first prototype up and running with a customer valuable enough to the business to get you going?
So yeah, pricing — what do you charge for something that you haven't built yet, that you can't define how much value it's going to deliver? All of these aspects are very iterative. Where you start out, even — I mean, you'll know this probably many times over — the problem you set out to solve on day one, the genesis of the business or the startup or even the product, is highly unlikely to be the problem that you end up productionizing a solution and starting to really scale that sales effort around. Perhaps sometimes it is, but you've got to keep an open mind along the way that actually there's maybe an adjacent problem that has greater value than the one you set out to solve.
Anthony: Couldn't agree more. While we're thinking about those lessons learned, what do you feel are the critical ones that you've learned in the data product space? What are the mental models that you hold around what works, what's to be avoided, what to try to do really well? What would you share with the young version of yourself?
Neil: That's a good question. I think one of the principles above all others is that the best idea wins, or hold your beliefs lightly. Dogmatic thinking, any degree of certainty, can really be problematic.
We simply don't know if something will work. That's the whole fail-fast approach: you've got a hypothesis or a problem statement, so test it, validate it as fast as you can, as many times as you can. Keep doing it. It's not a one-and-done. It's not a box tick in a CRM or a satisfying Asana or Jira ticket that you've just cleared off the board. Customer value is never-ending. It's incremental. That's the mindset that you've got to have. What you truly believed to be correct two weeks ago may not be true today. It's not easy. I think it comes more naturally to some of us than others, but if it doesn't come naturally to you, you need to build that mechanism internally, or you should probably find something else to do.
Anthony: I was speaking to someone last week and they made a comment — I'm sort of paraphrasing here. They were in a larger corporate having done some startup stuff before, and they were saying that they struggled with some internal people who didn't understand that failure was likely and was okay. They were finding it really problematic because it meant that a lot of stuff had to happen up front. A lot of stuff had to be hidden away. Failures which should have just been celebrated as "this hasn't worked, we've proven something doesn't work, that it's wrong, that there's nothing there" had to be sort of dressed up as something that they weren't. Is that something that you've seen over your career? You've done corporates, you've done startups, you've done tech, you've done data.
Neil: I think the larger the organization, the more there is that challenge. I wouldn't necessarily say it's a culture — you're reporting up in terms of progress, and so there is that sense that perhaps executives only want to hear good news.
I would challenge that. Perhaps there are cases where that's true, but highlighting, escalating, looking over the horizon to identify risks or areas of concern — I think one should always proactively do that. But always try to bring up a suggested route forward to accompany the challenge. Here's the problem we've identified based on what we understand it to be, this is what we would advocate doing. You've got to be dispassionate and separate the individual ego or the company ego from the problem. We've got a problem to overcome. It's nobody's fault per se. We just need to put some bright minds on it and work through it.
The other aspect where it's hard for corporates to adopt and embrace that sort of fail-fast, super-agile approach is that the organizational structure doesn't necessarily allow it, because they've gotten so big. There's a team for this and a team for that. All those teams need to come together. Product in a corporate context and product in a startup must be quite distinct from each other, possibly more so than other functions. You've still got to sell in sales, you've still got to create demand and raise awareness in marketing in both contexts, but in product in a larger corporation, I suspect you need to be the orchestrator. There are so many stakeholders that must all come together in order to realize your overarching product ambition.
So I think that must be a real challenge. The final thing is the timelines that corporates work to. Budgets are set, product roadmaps are somewhat more set in stone. I think there are certain organizational and rhythmic challenges to the fail fast thing and failure just as a concept. There's work to be done there. Failure is the natural state for new product development. Success is the exception that justifies all the hard work and effort required. That doesn't necessarily coexist well within corporate mentality in some cases, but I think it needs to in pockets. Whether the entire organization needs to act like that is another question, but where rubber hits the road in brand new products and brand new concepts, you've got to create a safe space to put your hand up and say, "I'm not sure if this is going to work."
Anthony: Agreed. And we're almost out of time, but I'm always interested in the answer to this question. You've been in the data product space and certainly the data analytics, AI machine learning space for a really long time. A lot's going on right now with gen AI and LLMs and RAG and a whole host of kind of not new, but new to market and new to delivering value against use cases technology. So where do you see the space heading? What do you think is up next? How are things going to evolve?
Neil: I feel that we, the business world collectively, seem to fixate on one thing at a time to the detriment of all the others. The second AI revolution is upon us and it's all generative AI: ChatGPT and LLMs are where all the buzz is, and all the investment tends to be focused around that.
Having worked in data science and machine learning over the last 10 years, there are numerous other technologies, capabilities, and models that aren't getting any airtime at the moment because of that singular conversation around generative AI. It's not to say that there isn't value to be had and a place for generative AI, but the assumption seems to be that all of the other predictive and optimization work, all of the other uses of sophisticated algorithms and maths to make businesses clever, is done; that every company has deployed each and every one of those models or solutions wherever it would add value to the business, and now the only thing left to do is something to do with generative AI.
It's like, no, no, no. I think the vast majority of businesses are still uncertain as to how to extract the most value from algorithms, predictive, deep learning. I think there's still an awful long way to go before that has just become completely commoditized and understood in terms of the value of it. So I find those aspects of machine learning and AI more exciting, more interesting than I do the generative.
Anthony: Interesting perspective.
Neil: I think there probably will be a period of reflection where the generative AI hype eases off a fraction. That would create a little bit of bandwidth for some of the other machine learning and deep learning technologies to remind the world that they exist and still have huge value and application in business. So I hope that's the case.
In terms of what comes after generative AI, I think the aspect of it I find interesting is the agent piece, which obviously is being talked about now. Rather than just summarizing or answering questions or being able to create content at speed, the ability that you can have the agent that represents Anthony, the extension of your digital self, where you've given it firstly an understanding of you and also your consent to start interacting with other services or other agents.
A good friend of mine said, "I can ask Alexa or generative AI what time the next London to Leeds train is, but they won't book a ticket for me. It won't pick a seat. It doesn't remember that I like to face forward and ideally have a table." That's what he was excited about, and it got me thinking the same thing: book the GP appointment, book the train ticket, give it my implicit consent, maybe some blockchain to empower that agent to act on my behalf. I feel like that's probably the next super exciting thing to happen.
Anthony: I don't disagree on that. Super interesting insight. It also neatly brings us to the end. Neil, thank you so much for sharing your experiences. As I said at the start, it's been really good getting a commercial, go-to-market perspective on the whole data product space. I think everyone's going to find that super useful.
If anyone's listening and they want to continue the conversation, feel free to connect with Neil on LinkedIn. And if you're building data products or a data marketplace, check out harbrdata.com where you can also find all the previous episodes of the podcast. Thank you for listening.