Working with data & metrics - with Ben Yoskovitz, Founding partner @Highline BETA
Full Transcript
Welcome to the Product Bakery. My name is Christian and I'm here today with my co-host Alex, as well as a guest from Toronto, Ben Yoskovitz. Before we start, as always, a short reminder about our social media channels, which are linked in the podcast description. So feel free to follow us, give us a like and help us grow the audience by sharing this episode or the podcast in case you like it and think it's helpful for other people, your network, your colleagues, or even your family. So Alex, I think today it's time to talk about data, isn't it? Absolutely. I want to give you an overview of who our guest Ben is. I think Ben has a really interesting track record. He's not only an entrepreneur, investor and product expert, but he's also the author of Lean Analytics. So I think it's a coincidence that we're talking to a lot of people from the Lean series, but who knows, maybe there are more to follow. But to introduce Ben: he worked in different startups, he worked at different corporations, and he also founded a couple of companies of his own. Just to name a few roles, he was VP Product at VarageSale and VP Product at GoInstant. He also worked as a Director of Product Management at Salesforce. And he's currently a founding partner at Highline Beta, which is a startup co-creation company. But maybe Ben, you can explain a little bit more about what Highline Beta actually does. Sure. First, thank you for having me. It's great. And always happy to talk about analytics and data. So we'll get into that, I'm sure. So Highline Beta is a company that I co-founded almost five years ago. And the idea is that we are a hybrid venture studio and venture capital firm. So we work with large companies, global Fortune 500, Fortune 1000 companies, to help them identify areas of opportunity outside of their core business. We're very much interested in helping big companies with what we would call growth innovation: new business models, new markets, new problems to solve for new customers or existing customers. Not incremental innovation, although that's very important, but really growth innovation beyond the core. And then we help those big companies figure out how to do that sort of innovation, either internally with internal teams or externally with startups. And when we look to external innovation, working with startups or using our venture studio to build companies from scratch, then our venture capital fund can invest. So in a way, we run a service organization working with big companies. That's one part of the business model. And then we invest, which is the second part of the business model. So it's piecing together a number of things that myself and my co-founders have been doing for years, which is investing in startups, building companies, working in corporate innovation, and trying to connect all of those dots in a meaningful way. We can talk more about Highline Beta as well, but I know today it's really about data. So we can jump right into that. And I guess you just mentioned it, right? You've seen a lot of different companies, you've been around for a long time. So to get the conversation into the data a little bit, one thing that I was curious to hear from you is how you've seen data awareness change over time and where we are today. Yeah, it's definitely increasing, and I think significantly. So think about when myself and Alistair Croll wrote Lean Analytics, which is already eight, nine years ago.
So it's been a long time. There weren't a lot of companies sharing their data back then, for example, writing about it, talking about their data. There were examples, and we found examples, of course, in case studies, but there's much, much more communication and sharing now. And I think that's probably the most important piece, because everybody always wants to benchmark themselves against others, and that's very difficult to do when nobody's sharing data. That's just happening more; founders are sharing more privately, but also publicly. So I think data awareness has increased substantially. And then the access to others, how they're doing and how they think about it, has also increased. So that creates a bigger and better learning environment for people to get into understanding data faster. I remember data awareness started rising for me in 2017. I just looked up my Amazon order history, and on the 8th of November 2017, I bought that book together with shampoo for some reason, I don't know why. And the thing was, for me back then, it was a game changer. I was, let's say, between junior and mid-level back then as a product manager. And what really blew my mind was the difference between leading and lagging KPIs. And maybe to jump directly into the depth here: you said companies are getting more and more into data-driven work. What do you think are the most important tools when you work with data that you can use to better plan and understand your product? Yeah. So if we're thinking about software tools for instrumenting things and everything else, there are many. They change all the time. They improve all the time. So I actually spend a lot less time focused on that side of things and more on understanding what data I should be tracking and why. And I think that remains, to me, the biggest challenge. Look, there are lots of tools, Mixpanel, Heap, Google Analytics, and you're trying to figure out, okay, what do I use? And a lot of companies eventually do their own instrumentation, because they just need their own very specific stuff. The biggest challenge is figuring out what to actually track. And so your reference to leading indicators versus lagging indicators, and understanding the difference and how to find them. Most of the tools will do the job that you need most of the time, certainly early on. So it's less about the tools and more about how you think about it and what you're looking for. And to quickly share the difference for the audience: with lagging KPIs, we look at something that refers to the past, for example, the number of sales. With leading KPIs, we look at something that gives us an indicator of whether we are on track, for example, the traffic on a certain page. And what I've noticed is that still, these days, so many people and companies are looking at lagging indicators and underestimating the importance of leading indicators. But I would also love to hear how you see that. No, that's fair. I mean, lagging indicators are a lot easier to find and a lot easier to track because they're there, and you can track pretty much anything you want. All of it is a lagging indicator until you figure out how it impacts something else. And that's really the key: what is happening today in your product, on your website, in your service that tells you what's going to happen in the future.
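To make the leading-versus-lagging distinction concrete, here is a minimal Python sketch. All the numbers, and the assumed 40% trial-to-sale conversion with a one-month lag, are hypothetical and purely for illustration:

```python
# Minimal sketch of leading vs lagging indicators. All numbers and the
# assumed conversion rate are hypothetical, purely for illustration.
monthly_trial_signups = [120, 150, 180, 160, 200]  # leading: hints at the future
monthly_sales = [40, 48, 60, 72, 64]               # lagging: reports the past

# If roughly 40% of one month's trials become next month's sales,
# signups act as a leading indicator for sales one month later.
for signups, next_sales in zip(monthly_trial_signups, monthly_sales[1:]):
    predicted = 0.4 * signups
    print(f"trials={signups:4d} -> predicted sales={predicted:5.1f}, actual={next_sales}")
```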
The thing that I often think about between leading and lagging indicators is: how do I know who my best customers are? Sometimes we call them, you know, our crazy fans, the customers that use our product more than anybody else, that love it. Sometimes those are the early adopters. Let's hope they are the early adopters. But if you can figure out patterns, behavior patterns of your best users, there's usually a signal in there somewhere that says, oh, they use this X times a week, they do X, Y, Z things in this order, and they never churn and they keep buying more from me. Maybe I should figure out how to get more users to do those three things in that order to get them hooked. And so everybody tracks lagging indicators, and they do tell you what has happened. But leading indicators predict the future. And for me, I think you start by looking at your best users and trying to see if there are any patterns or commonalities between them that might signal what is turning them into your best users. So I think this is a very good point, also when we talk about how to find and how to define the metrics that are important. Maybe from your experience: as a company, if we don't have any experience, or let's say we're not there yet in terms of maturity when it comes to data and analytics, how would I best approach this initial phase of finding the right metrics, defining what works for us, what our best users are, for example? How do I set the right priorities at the beginning? Yeah. So in the book, we really talked about two variables. It's never going to be this simple, but two variables that help you figure out what are the numbers that matter. The first one was the stage of your business. I think that really does matter. Are you very early stage? Are you pre-product? Or are you a giant company with hundreds of millions of dollars of revenue and 10 products in market, or whatever the case may be? That would be scale. And there are a few stages in between in the book that go from empathy at the earliest stage to stickiness, to virality, to revenue and scale. So first, I think you have to figure out genuinely what stage you're at. And then the other key component is what type of business you're in. Are we an e-commerce business? Are we a software-as-a-service business? And again, in the book, we generalized things. For software as a service, we used one sort of freemium model to explain it, but there are lots of different business models for software as a service. So: how do I make money? How do I generate revenue? Actually, how do I create value first, and then how do I make money? And then what stage am I at? If we start to look at those, we can start to figure out what are the metrics that matter. For me, most of the time, it is about stickiness and engagement. It is about understanding the value exchange that you're offering to a user or a customer. You're giving them a product. They're getting some value out of it. In exchange, they're giving you money or data or attention or something. And proving that you're creating enough value is incredibly hard for startups. It's true of big companies as well. But that to me is where most of our energy, certainly as product managers, should be focused: are we creating enough value, and how are we measuring the value? Usually that's some measure of stickiness, some measure of engagement. Yeah.
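A hypothetical sketch of the "study your best users" idea discussed above: compare one behavior (weekly uses) between users who stayed and users who churned, to surface a candidate leading indicator. The data and field names are invented:

```python
# Hypothetical: does "weekly uses" separate retained users from churned ones?
users = [
    {"id": 1, "weekly_uses": 9, "churned": False},
    {"id": 2, "weekly_uses": 2, "churned": True},
    {"id": 3, "weekly_uses": 7, "churned": False},
    {"id": 4, "weekly_uses": 1, "churned": True},
    {"id": 5, "weekly_uses": 8, "churned": False},
]

def avg(xs):
    return sum(xs) / len(xs)

kept = [u["weekly_uses"] for u in users if not u["churned"]]
lost = [u["weekly_uses"] for u in users if u["churned"]]

# A large gap suggests uses-per-week may be a leading indicator of retention,
# worth validating on real cohorts before acting on it.
print(f"retained avg uses/week: {avg(kept):.1f}, churned avg: {avg(lost):.1f}")
```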
And when it comes to that, before we go into the depth of the product itself that we want to define with KPIs from start to finish, there's this term of the North Star metric or North Star KPI that you also highlight a lot, which, by the way, I see very rarely defined in companies. So how do I identify a North Star metric in the first place, and what are the tools to get there? We call it the one metric that matters. And what we suggested was that at any stage, at any point in time, for any company, there is one metric that really helps drive your entire business, and it will change over time. So that's important. It's not one metric that you measure your entire business by forever, because your business from idea to scale is going to change radically. We've seen different implementations of the one metric that matters. And then of course, as a company grows and gets more sophisticated, there really isn't just one number. That one metric actually becomes what I would describe as a health indicator for the business. It's the metric on the TV when we're in offices, or now we can do it digitally, I suppose, the metric that we're all looking at that tells us: is the business generally healthy or less healthy? And below that, there are going to be other metrics that matter for me if I'm in customer success, if I'm in sales, if I'm in product. And there are smaller metrics that matter that bubble up to this health indicator. A lot of companies use revenue as their health indicator, which, I guess, to some extent makes sense. But I still prefer, and this is probably the product manager in me, I still prefer a metric around usage of some kind to be the one metric that matters, for as long as you can, until you start to grow revenue. And then you're like, if revenue is dropping, we know there's a problem. So it might serve as a health indicator. Usually, when you get to a certain size where you're looking at the one metric that matters as a health indicator, you can't actually figure out what the problem is from that metric alone. But there was one thing I was struggling with when I was defining that one metric that matters, whether on a company level or on a team level. The challenge I was always facing was that it's either a lagging indicator or a leading indicator. And at some point, I just said to myself, okay, whenever I define a North Star metric, I take two of them: one leading indicator and one lagging indicator. Because otherwise we would always turn around and say, okay, it looks too much into the past, or it looks too much into the future. So I always go with both whenever I have to define them. I think that's interesting. I think also, as you start to measure leading indicators, because you can start to predict, to some extent, the future based on that metric, it means one metric is helping you predict another metric. So often you find these things. I think that's a good point, yeah. They come in pairs, which is interesting. And with the one metric that matters, the truth is, there are many metrics that matter, of course. It depends on the size of the company, the stage of the company. I think the important thing is, let's say you have a small product team working on a feature. We'll just use that as an example. We're building a new feature in the product. Let's assume we've actually validated that people want the feature. We're not just building it blindly. We're building this feature.
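A minimal sketch of the health-indicator idea, pairing a leading North Star (usage) with a lagging companion (revenue), as discussed above. The metric names, values, and thresholds are all assumptions for illustration:

```python
# Hypothetical health check: one leading and one lagging top-level metric
# watched against thresholds. Numbers and thresholds are invented.
health_metrics = {
    "weekly_active_users": 4_800,   # leading North Star (usage)
    "monthly_revenue_usd": 52_000,  # lagging companion
}
thresholds = {
    "weekly_active_users": 5_000,
    "monthly_revenue_usd": 50_000,
}

for name, value in health_metrics.items():
    status = "OK" if value >= thresholds[name] else "INVESTIGATE"
    print(f"{name}: {value:,} [{status}]")
```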
We're trying to figure out, how do we measure the value creation of the feature? So it's going to be some usage-type metric, right? How often people use the feature, how often usage of that feature drives them to buy more product, or whatever the case may be. And the important thing there is that whatever we're building and whatever we're measuring has to bubble up to the health indicator. So we build a feature and people use it a ton. That's great, but it doesn't drive more revenue. It doesn't drive more sales. It doesn't drive more reviews on the app store. If they're using it but it's not actually creating value, or maybe it's creating some value for them but you haven't been able to figure out how to extract value as a result of that, then we have a problem. So there can be a one metric that matters at the team level, at the feature level, at a more granular level, but we have to be able to draw some kind of line to something bigger than that, something that helps drive the business forward. Because usage of a feature alone might be wonderful, but it might not actually make a difference to the success of the company. And so I see a lot of people doing work, especially in bigger groups, bigger teams, bigger organizations, where they actually can't tie their work to the success or growth of the company. I can spit out features all day long. I can't tell if anybody cares. I can't tell if it moves a needle that actually matters. That's when you know you're probably not tracking the right things and you're not connecting the dots in the right way. Yeah. And I think to some extent, launching a feature that has a lot of adoption could still ruin your company if it comes with a lot of operational costs and no revenue. So measuring the right things is definitely important. And at the same time, I always feel like measuring the wrong things could be very dangerous. I'm not sure what your advice would be, what someone should keep in mind in order to avoid measuring the wrong thing. Yes. What measuring the wrong things usually leads to is just wasting time and money, neither of which is infinite. So that becomes a pretty serious problem. Again, for me, you really have to understand how you create value for the end user or customer. If you understand that, chances are you will pick better metrics. If you focus on what matters to you first, growing revenue or whatever the case may be, if you focus there first and you really obsess over those metrics, there's a very good chance you're missing the mark on creating value, right? When you look inwardly at yourself and say, we have to grow revenue: that's fantastic. I want you to grow revenue. I wish you the best of luck. But if all you do is obsess about that, there's a very good chance it takes you down the wrong path. It might mean you close more sales, but you just go after the wrong customer, and later on they churn out. And then there are these unintended consequences of focusing on the wrong thing. Often that means we focus on what we think matters internally or what we think drives our business. I would turn that around and say: just obsess over the value that you're creating. If you understand the value you're creating, or if you don't, you'd better figure it out. But if you do understand the value, measuring it should be pretty straightforward. And things like engagement and stickiness and usage are a proxy for value creation, right?
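A hedged sketch of "does feature usage bubble up to the health indicator?": compare a downstream outcome (upgrading to paid) between feature adopters and non-adopters. The data and field names are invented for illustration:

```python
# Hypothetical: heavy usage is only valuable if it ties to a business outcome.
accounts = [
    {"used_feature": True,  "upgraded": True},
    {"used_feature": True,  "upgraded": False},
    {"used_feature": True,  "upgraded": True},
    {"used_feature": False, "upgraded": False},
    {"used_feature": False, "upgraded": True},
    {"used_feature": False, "upgraded": False},
]

def upgrade_rate(group):
    return sum(a["upgraded"] for a in group) / len(group)

adopters = [a for a in accounts if a["used_feature"]]
others = [a for a in accounts if not a["used_feature"]]

# Similar rates would suggest usage is not translating into business value.
print(f"adopters upgrade rate: {upgrade_rate(adopters):.0%}")
print(f"non-adopters upgrade rate: {upgrade_rate(others):.0%}")
```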
If they're using it, one would assume they're getting value. But I really need to understand why. And by the way, this is sort of a side path: a lot of companies, as they start to focus more on data, quantitative data, actually stop collecting as much qualitative data. And that's also where the path goes wrong. I can look at the data until I'm blue in the face. So that's why I say engagement metrics, stickiness metrics, usage metrics are a proxy for value creation. Oh, they're using it a lot, they must be getting value. That's a reasonable assumption, but I'd really like to know why they're getting that value and why they're using it. That's how I'm going to figure out who my best customers are. That's how I'm going to figure out how to go find better leads and grow the business. And that's how those dots start to connect. And one problem that I also see with quantitative data is that quantitative data in itself is quite complex. It's easily done that we run a survey that is, for example, not statistically significant in any way, and from the moment we send it out, we're just collecting wrong data. And on the other hand, what I also see often is that people are confronted with so much data that they don't even know where to start. And when it comes to wrong data, you also talked in your book about vanity metrics, metrics that make us feel good. For example, sales are increasing. Okay, but we still have a lot of churn and we're not looking at that; we just present the sales number to the CEO because he's happy or she's happy. Yeah. I'd just love to open up this discussion here, because I caught myself many times looking at the wrong KPI because I believed it was the right one from an emotional point of view. But from a rational point of view, I was missing it by far. Yeah. Vanity metrics are still very common. They are the metrics that usually go up and to the right. They look good. They're the ones we usually see in press releases. Often, they're still the ones that get presented to investors. At Highline Beta, as investors in pre-seed and seed stage, early stage companies, we see it. People pitch us vanity metrics all the time. They do make us feel good, but they don't really move the needle on the business. Fundamentally, they're just not having a huge impact. There are ways of, I would use the word, faking or juicing those types of numbers. And you're just punting the ball down the field a little bit and waiting for the inevitable collapse later. So I think that's what happens when you focus too much on vanity metrics. And then I think you're absolutely right about the complexity of the data. That's maybe one of the downsides of a focus on data: now we can basically track everything. You can instrument a software feature, every part of it, by hand or with the software tools that will help you do it. And now I literally can see everything going on all the time in my product, and I can get overwhelmed so quickly. That's just a waste of time. So again, we go back to the basic principles, and this is where I think Lean Startup, and I would throw design thinking and jobs to be done in there as well, come in. The principles of: let's make sure we understand what the customers want, let's make sure we talk to customers, let's make sure we are in fact creating value. They become so important because we can get lost in the data. Surveys are a great example of that. You mentioned surveys, right?
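A hedged sketch contrasting a stickiness metric with a vanity metric. DAU/MAU is one common proxy for the engagement Ben describes; a cumulative signup count, by contrast, can only go up and to the right. All numbers are invented:

```python
# Stickiness (DAU/MAU) vs a vanity metric (cumulative signups). Invented numbers.
daily_active = 1_200
monthly_active = 6_000
cumulative_signups = 40_000  # vanity: never decreases, hides churn

stickiness = daily_active / monthly_active
print(f"stickiness (DAU/MAU): {stickiness:.0%}")      # actionable proxy for value
print(f"total signups ever: {cumulative_signups:,}")  # feels good, proves little
```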
Big companies, of course, do this all the time. They survey customers. But a lot of early-stage startups do it too. They survey 500 people, and they get the results they want because they ask the questions in a leading way, the way they want to ask them. And of course, you can make the results look how you want them to look. It's not that complicated to do. And if you don't even know what questions you want to ask, you can't even go to a survey. So to me, quantitative data, again, tells us what is going on. It really doesn't tell us why. And to build products, we have to understand the why and the what together. And if we keep focused on the why, I think it relieves the pressure around collecting so much data to try to understand the what at a level of detail that becomes essentially absurd. Yeah. I totally agree. And I think the way you collect data in surveys is a very good point. Due to my background in design, I also see it a lot in user studies and interviews with qualitative data. It's very dangerous to track the wrong thing or ask the wrong questions and get misleading answers that then send you in completely the wrong direction. And I think this is always a little bit of a danger. I mean, Christian talked about this once, that research can actually be dangerous if you ask the wrong things. Absolutely. It can be terribly dangerous. Yeah, because with leading questions, you can always get what you want to hear. Faster horses. Faster horses. Yeah. And one of my favorite questions is when you ask users: would you use it? Yeah, of course, most of them will tell you yes. But if you then invest your capital to build that company on it, chances are high that it might fail. Yeah, I absolutely agree. It's interesting, because at Highline Beta, we spend a lot of time, again, working with big companies, doing what we describe as discovery work. And discovery work is trying to identify new problems that are potentially worth solving for that company in some way, whether they build something in-house or find a startup to help them solve those problems. And we spend a lot of time in primary research, talking to people, trying to piece these insights together, trying to find the threads that take us to the right place. And we always say that our job is to find the truth. Our job isn't to build a widget, because building the widget is not the hard part. It's to legitimately find the truth. And if you have already decided what the truth is, and you're just looking for a survey to prove that you're right, you will find a way to prove that you're right. If you're genuinely open-minded about it and truly willing to shut something down or pivot away from it, then you can chase the truth. And then you can use design and customer research and early validation and prototyping and all of these tools we have at our disposal to try to find the actual answer. And the reality is, design thinking, jobs to be done, Lean Startup, they're all different flavors, in my opinion, of similar things. And most people still don't really do it, because it's hard. Frankly, it's a pain in the ass. And it usually doesn't give you the conclusion you want, because nobody goes in with no bias. Everybody goes in with some bias. So you're already setting yourself up for some form of failure. As soon as you start customer discovery, there's some form of failure that's going to happen. Now, we can call it learning.
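As a rough illustration of why small surveys mislead, here is a sketch of the margin of error for a surveyed proportion at 95% confidence. It assumes a simple random sample, which a leading, self-selected survey is not, and selection bias is not fixed by a larger sample. The 70% figure and sample sizes are hypothetical:

```python
import math

# Margin of error for a proportion at ~95% confidence (z = 1.96),
# assuming a simple random sample. Bias is a separate, bigger problem.
def margin_of_error(p, n, z=1.96):
    return z * math.sqrt(p * (1 - p) / n)

p_hat = 0.70  # hypothetical: 70% said "yes, I'd use it"
for n in (30, 100, 500):
    print(f"n={n:3d}: 70% +/- {margin_of_error(p_hat, n):.1%}")
```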
So there's some form of learning that's going to take place. And people just want to skimp on that part. Again, in Lean Analytics, we had this empathy stage, and people asked, what's the data? What's the data? And the data is: you talk to 10 people and you hope that you find a thread or a pattern. We can't make it more concrete than that. It's a little bit, you'll know it when you see it, if you're being honest. And if you're not being honest, then you'll come to your own conclusions and you'll move on. Yeah. And I guess you also just need to learn how to get comfortable with this ambiguity of not knowing the outcome. And everybody says they're comfortable, or most people will say they're comfortable with it, because they know that's the right answer, to be comfortable with ambiguity. But when faced with it, most people are not. And most people go in, this is true of startups and founders, and it's true of large organizations as well, already having an idea in mind of what they think the customer wants or the user wants. And they're trying to prove that they're right, as opposed to figuring out what problem they actually want to solve. So it's just the fundamentals of these Lean Startup principles. Everybody understands them in theory. And you can get good at them; like any skill, you can get better and better. I've seen people start from zero, or start from a place of bias, and get to a point of, I'm going to go in completely open-minded and let the direction take me where it goes. But it's a hard skill to master. You mentioned the Lean Startup mindset a couple of times, and I would like to understand how you apply it. Let's take a real-world example. Let's say Alex and I are running an e-commerce company. We are selling bread and croissants online and we have good KPIs. So you like this, and you and your company invest in us, to make it as real-world as possible. And once you have invested, you realize we don't have this data-driven Lean Startup mindset. How would you support us in establishing such a mindset on a leadership level, as well as on the level of a product manager who's working on the day-to-day business with the product? Yeah, so I think the best tool that I've found is really about going back to assumptions and forcing people to write their assumptions down. From a leadership perspective, a product perspective, and you can do this for any task, any component of a business. The marketing team can do it. The sales team can do it. Everybody can do it. And when you force people to write their assumptions down and really think about them, if they're open-minded to that, they realize very quickly just how much of their business is built off of their assumptions. And, using your example: you got to a certain place, you were able to raise money, you got some traction. Some of that was because your assumptions were right. So, amazing. You had some assumptions, and whether you realized it or not, you were actually right, and you figured it out. Some of it's going to be luck. Some of it's going to be timing. And now, as it gets harder, it's probably because you have new assumptions. You don't realize it, and they're proving to be less right. And so that to me is usually the eye-opener. And it's the thing that holds us to a higher standard of being honest.
And the other thing I'll add to that is we like to, you know, focus on assumptions and then use a design thinking methodology, which is around desirability, feasibility, and viability. We really like to then, 'force' is maybe the word, apply a D, a V, or an F to each of those assumptions. What often happens is you see a lot of feasibility assumptions and a lot of viability assumptions, and then a lot fewer desirability assumptions. And then you realize: oh, I actually don't know who the user is. I don't know who the customer is. I don't know why they care. I don't know why they bought my croissants. And that starts to open people up to realizing they have to, let's call it, take a step back. Not throw the business out, not make some massive pivot, but just take a step back, maybe to a higher level, take a 20,000-foot view and think about it more strategically. Assumptions are hard to stay honest with, and they're hard to write, because you can write an assumption that's obvious. So again, it's a skill like any other. But it's the thing that I think centers everybody around: ah, this is what we don't know. This is what we believe to be true. We've got to go test it now. And taking that step back is usually not that much work, right? It's not a big step that you have to take. The only thing you need is a little bit of reserved time as a CEO or with your leadership team. Just take, I don't know, a one- or two-hour session, and challenge yourself and ask: is this the right KPI we are looking at? Do we need to rethink or create new assumptions? Or, I'm not sure if hypothesis would be the right word, but think about whether we are on track right now or need to reevaluate our initial plan. I think it isn't that much work. It is a behavior change for most organizations. So again, we have a software product, we're building a feature. Fantastic. If we just ask ourselves, why are we building it? A customer wanted it, and sales told us we had to build it. Why do they want it? If we just start to dig in and ask why a few times, we may realize we actually don't know why we're doing the things that we're doing. And then that really does lead to asking better questions, and then framing those things not as answers but as questions and assumptions, and saying, okay, we believe building this feature will do this thing. Okay. Because the building of stuff is usually not the hard problem. It's building the right thing, making the right choices. That's much harder than executing the work most of the time. So I very much like this assumption framing, and it's easy to forget, right? At Highline Beta, we get excited about things all the time, like everybody does. We think there's a play here. Our first instinct is: let's go build it. Let's go build it. And then somebody usually puts their head up and says, why are we doing that? What are we actually testing? What assumption are we testing? Because we could build it, launch it, and it might work. But if we don't know why, we probably won't be able to scale it. We probably won't be able to replicate it. And then we're chalking it up to luck and timing, or just because we're awesome. And that's not a good enough reason to do stuff. So I really do think assumptions, sometimes we call them leaps of faith, like some of the big ones, the ones where if we're really wrong, this whole thing falls apart, just reframe how we think about the work we're doing. Yeah.
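A hypothetical sketch of the assumption-mapping exercise Ben describes: write assumptions down, tag each one Desirability, Feasibility, or Viability, and check whether desirability is under-represented. The croissant-shop assumptions below are invented to match the episode's example:

```python
# Tag each written assumption D (desirability), F (feasibility), or V (viability),
# then see which category is thin. Assumptions here are invented examples.
from collections import Counter

assumptions = [
    ("Customers want croissants delivered by 8am", "D"),
    ("We can bake and ship 500 orders a day", "F"),
    ("Unit margin stays positive with courier fees", "V"),
    ("Our ovens can scale to two shifts", "F"),
]

counts = Counter(tag for _, tag in assumptions)
print(counts)  # e.g. Counter({'F': 2, 'D': 1, 'V': 1})
if counts.get("D", 0) < counts.get("F", 0):
    print("Few desirability assumptions: do we actually know who the customer is?")
```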
And so once I have an assumption, I would go back with my co-founder Christian and say, okay, how would we actually measure the success of this specific assumption? Right. You would figure out what's the right test. Because again, oftentimes when you work off of assumptions, you start to realize that the work to prove or disprove the assumption is maybe less than you originally thought. Maybe we don't have to build that software feature and ship it in market. Maybe we could just do a clickable prototype and show it to 10 customers. There may be faster ways to get to the answers, and therefore our investment in proving something right or wrong is smaller, which I'm a big fan of. So it becomes more a question of what's the right assumption, and then what's the right way of proving or disproving that assumption, and maybe finding faster, cheaper, hackier ways of getting to those answers. Yeah. I think that's very good advice. And I think that also highlights, again, that it's not always about building the product; there are so many different approaches to actually validating an assumption so that you can make a well-informed decision before building it. Yeah. In the book, I think it was Alistair that wrote this line, so I'm not going to take credit. He wrote most of the pithy, funny lines; I wrote all the boring lines. But I think he wrote the line that an MVP, a minimum viable product, is not a product. It's the thing you build to figure out what product you're going to build. And you can use that. The basic premise is that a lot of people will build their MVP, and most people understand it's the minimum viable thing we build. They'll build that and they'll be like, well, that's it. We're in market. We've got our product. And our mindset was: no, no, no. It's just a thing you build that you're probably going to throw out. It's probably sacrificial, but you're going to use it to really figure out what people actually want. Because we can only do proxy tests for so long. You can only do interviews for so long. You can only do clickable prototypes or wireframes for so long. At some point, you've got to put something in people's hands and see what the heck they do with it. That's the MVP. It's not necessarily the actual product. And that becomes true for any product team. Anybody thinking, oh, we're going to build a new feature: do I have to build the feature right away? Or could I just do a one-week or couple-of-days sprint on something hackier and smaller? So it changes the mentality of how we go about building features. I would argue it could change the way we do marketing and sales. It applies beyond just building product. Yeah. And I really like that, because the MVP thinking is sometimes understood a little bit wrong, in my opinion. People always say: we built the MVP, we involve marketing, we involve sales, we launch it, et cetera. But as you said, the real product is for me more the V1 that you ship to production and to the customer, while the MVP is really about taking a deeper look and understanding what people need, instead of just saying, okay, this is now what we have. Because that's actually the second step; the first is to understand what customers want and to identify the value you want to deliver. I think a fun exercise would really be to go out and look at how many MVPs are still live and keep running as products.
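A hedged sketch of "find the cheapest test that can disprove the assumption": rank candidate experiments by cost and pick the least expensive one that still yields a clear signal. The test names, costs, and signal ratings are invented for illustration:

```python
# Pick the cheapest experiment that can still clearly prove/disprove the
# assumption. All candidate tests and costs are invented examples.
candidate_tests = [
    {"name": "ship the full feature", "cost_days": 20, "signal": "strong"},
    {"name": "clickable prototype with 10 customers", "cost_days": 3, "signal": "strong"},
    {"name": "landing page smoke test", "cost_days": 1, "signal": "weak"},
]

viable = [t for t in candidate_tests if t["signal"] == "strong"]
cheapest = min(viable, key=lambda t: t["cost_days"])
print(f"run first: {cheapest['name']} ({cheapest['cost_days']} days)")
```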
Because another thing that I see a lot is people launch the MVP, and once it's there, all the future plans are forgotten because there are different priorities, and you keep building the next MVP with the same mindset, but the old ones stay out there and you have a very poor product in the long run. Yeah. And look, it's so easy. We've all done it, where we've fallen in love with the thing we've built. It's very easy to do. Product people do it all the time. Designers do it all the time. Developers do it all the time. What developer have you ever met that says, the thing I love doing is writing code and then throwing out my code? Nobody likes doing that, right? I mean, there's probably a developer listening to this podcast who's going to comment and say, that's all I love doing. But most people do not like doing that. So you get really attached, and that's just human nature. And it's hard to internalize that the things you're building are built as experiments to learn something, and then you go from there. I think, generally speaking, startups are learning this more and more, and they're able to move quicker. In bigger companies, this is much, much harder, right? The notion that I would build something, put it in market, test it, and not test it with the intent of just launching it no matter what, but actually test it and say: didn't work, we learned a bunch, back to the drawing board. It's very difficult within large organizations to pull that off. So I think startups are getting better at it, of course, and certain individuals are getting better, but for bigger companies it's even harder, because they have so much more infrastructure involved in doing anything. Those cycle times are really hard to pull off. But also the infrastructure built around the KPIs that were defined years ago, right? Yeah. And again, there's a lot of legacy infrastructure and things in place that make it difficult to iterate quickly enough. You have to get quite creative when you're inside of a larger organization. If you're a product person in a larger company, or whatever the case may be, you have to get a lot more creative about how you can run experiments and do things. In some cases, big companies don't even want people talking to customers. In some cases, big companies actually sell through a distributor, and the distributor owns the customer relationship. Or retail, think of retail: if you have a physical product that you sell into retail, you don't even have direct contact with the customer. So the challenges grow significantly when building new things inside of large companies. Again, startups have the same problem, and a lot of it is human nature: I fall in love with stuff. I have to be right. I don't like being wrong. In bigger companies, I think it's also just that the infrastructure in place is not designed for rapid experimentation. Very nice. And you just mentioned that startups are slowly getting better in the way they experiment and in the way they iterate. So maybe to wrap the whole conversation up: what would you say is the outlook, or the future of data, when it comes to its relevance for different companies? Well, I really do believe that people are getting better and smarter about tracking things. And I think that's really true. I think that, again, there are more product people out there, more VCs, more CEOs in the startup landscape talking about examples of things they did well, but also things they did poorly.
People are much more transparent. And I think that's very positive, because we can learn from others or take insights from others. So I think people are getting better and smarter about what to track and how to track it. I think they're moving faster in general as well. The instrumentation tools are there; everything is there to figure it out. Now it becomes just about the mindset and, to some extent, the culture, right? If I'm the CEO, how do I set up a culture where everybody is committed to rapid experimentation, focused on tracking what matters, and open to being wrong? So I think that becomes a cultural thing, but I think that's also improving over time. And I would attribute a lot of the willingness to change to seeing other people do it. When you see successful founders or successful product leaders talking about the 40 experiments they ran to finally crack a nut, you're like, okay, if so-and-so is doing it, I probably should get on with it. So I think it is generally heading in the right direction. You do have to establish this culture of how you want to operate so that you can get everybody speaking the same language. Amazing. And I hope a lot of people are listening to this and thinking: yes, maybe we should tackle it similarly and start doing more experimentation. I hope you got as inspired as I did when I read the book in 2017. That's great. Thank you. I took a shower with my shampoo afterwards. Hopefully you didn't read it in the shower. That would be strange. To each their own. We're not here to judge how people read the content. So thank you very much. It was a pleasure having you. Ben, thanks a lot for joining us. It was very nice. So Christian, now that we finally managed to talk to Ben after your long shower reading sessions, was it as good as you were expecting it to be? It was definitely better, because this whole cultural aspect that Ben mentioned, and this change of mindset that is happening right now, is something that I would say we all see, not only us, but also our audience. And on the other hand, it's also good to just have the experts speaking and get the deeper insights around analytics and what to do with them. Because you can read thousands of books, but once you start working with it and once you exchange with people, you get new insights and new points of view. And that was definitely achieved today. How about you? Yeah, I think that's the thing, right? The theory is very interesting, but hearing how to apply it in different cases adds a lot more insight. And whether it's metrics, experimentation, and so on, it's very different from product to product and from company to company. So obviously there is no one rule that fits all. But as you say, culture is very important. And one thing that I mentally noted for myself, when we talked about experimentation and experiments in general: it's very true that oftentimes you fall in love with something and then struggle to throw it away. And one thing that I always try to do there is forcing myself, and also the people I work with, to think of things as experiments. Because I do feel that it helps to approach it as: I'm now building not an MVP, but an experiment, because I want to validate something. Because MVP already has 'product' in the name, and it gives people a sense of a finished product.
But if you really tackle it as an experiment and you keep it super low fidelity, with the minimum effort that helps you learn something, knowing that you'll throw it away, that's just awesome. It makes it so much easier. And it doesn't hurt that much, because from the start you already know that you're going to eventually throw it away. But there was one important thing that you said. Now that you are back in a head position, as a leader again, you have to set the right mindset by saying: I'm going to do experiments. And the reason why you're doing this is because you want to deliver value. This mindset of understanding what value we want to deliver as a company and what problems we want to solve, I think that's the very first step to change the culture, or establish a culture, that is more focused on the user and then also be able to become more data-driven. I'm happy to hear your thoughts on that. Yeah, no, absolutely. I fully agree on that. In case you want to interact with Ben and follow up on the conversation, feel free to check out our episode page on product-bakery.com. Amazing. Have a good night. Bye.