Published: February 25, 2021

Researching customer problems - with Cindy Alvarez, Director Customer Research @Github

Summary: After studying psychology in the late '90s, Cindy started her product, research, and design career. She has led many product, design, and research teams over the last decades. She also wrote the book Lean Customer Development.
#31: Researching customer problems - with Cindy Alvarez, Director Customer Research @Github

Full Transcript

Welcome everyone to another episode of the Product Bakery Podcast. My name is Alex and I'm here today with my co-host Christian, and we have a lovely guest from San Francisco, Cindy Alvarez. Hi, Cindy. Hello.

Before we jump into the conversation, I quickly wanted to share that if you have any questions or feedback about this episode or one of our previous episodes, feel free to head directly to our website and leave them in the comment section. The comments are also accessible to our interview guests, so we will make sure that any questions reach the right person. Besides that, you can obviously follow us on all our social media channels for the latest updates and more content. You can find all the relevant links in the description. Great. Christian, I'll hand it over to you to introduce Cindy.

Cindy, first of all, it's a pleasure having you here today. To give people some background on what you are doing and why we are so excited to talk to you today, I'd like to quickly introduce you, starting with Harvard University, where you studied psychology. From there, you worked in and headed a couple of design teams, research teams, and product teams for companies such as Yodlee, Lumia, Kissmetrics, and Microsoft. In between, you published the amazing book Lean Customer Development, of which I am, by the way, a big fan. And for more than a year now, you've been working as Director of Customer Research at GitHub. I believe many listeners know GitHub, but I'm really curious to understand what customer research at GitHub means.

It is very interesting, because developers are the people who make software. They're making things. They see every piece of the puzzle. So to some degree, I imagine it's a little bit like when medical students go to the doctor and think, "I'm learning this, or maybe I already know this better than you." There's a very different approach to how you ask questions.
I've worked in a lot of enterprise software companies, and for a lot of information workers, when you ask them questions, the only real hesitation you get is, "Would my employer be okay with me saying this?" A lot of times, when you try to do research with developers, they're trying to research you right back. That makes for a really fascinating interchange. But I also think it's so important, because again, these are the people who are building things: engineers, product managers. All the stuff that we use comes together because of these people and the tools that we're building for them. So it feels like a really great responsibility and one that I'm really excited to take on.

I would like to better understand: what is your day-to-day business in your current position, what are the things you try to understand, and what methodologies do you use?

Sure. So it's a pretty even split between what I'll call day-to-day research and really trying to democratize that, helping everyone at GitHub do it. You're building something, you want to get a prototype in front of some customers as quickly as possible. You've got some assumptions, and you think to yourself, this doesn't have to be super rigorous, but we would just love to hear from five people outside of these walls. So a lot of it is helping PMs, designers, and engineers ask those questions on their own: setting them up so they're not asking biased questions, and so they're directing them to the appropriate audience. Now, there are some questions where you can throw something out on Twitter and you'll get a perfectly good counterbalance of, "Hey, it isn't just us." And there are other things where you're really looking for a very specialist kind of participant. If you're looking for an experienced security researcher to weigh in on something, you really don't want someone who's in week two of code school.
And so for those, we help a lot with recruiting. The other side is really: what are the big risks two, three, or six months out? Those are often things that I remember from my time as a product manager. You're incredibly busy, there are so many things going on, and it's so hard to lift your head up and say, once this thing gets out the door, what are the bigger existential risks to it? That's an area we really enjoy digging into: okay, this thing works, but why would someone use it? Or why would someone not use it? Or why would someone want to use it but find obstacles in their way? So we do a lot of interviewing, watching people, and surveying to try and figure out those questions.

I'm just a little bit confused, because you mentioned shooting out questions on Twitter. What I traditionally know is that user research and customer research, whatever it is, takes time. It's slow. It requires a lot of discussions and conversations. Can you maybe shed a little bit of light on that?

Sure. Honestly, if you are a software development house, you've got strong opinions about product; part of being a founder is having a very strong sense of "this is what the world needs." There are a lot of things that can't be built by committee. At the same time, when you've got that strong internal opinion, it is really healthy to just expose it to the air sometimes. It's not showing a prototype. It's just maybe saying, "Hey, we assume that the world does things this way." You can ask that, and if you immediately get back ten responses from people who don't do it that way and think it's very strange, that's really a sign for more research and digging deeper into the question. If you shoot that question out and you immediately get people saying, "Yeah, of course this makes sense," you might still ask a little more, but it's just that finger to the wind.
So there are a lot of cases where that can help you feel more confident, or it can shape deeper research. I often say that it's hard to engage in a lot of research if you don't know the questions you should be asking. Sometimes that Twitter post reveals the questions you should be asking. And that's when you can bring in more rigor and more time, saying: these are the people we need to be learning from, and this is how we should frame that question.

But is this something that you would literally post as GitHub? Or would you use some sort of alias profile, or maybe one of the developers who are influential and have a big reach, to get the questions out of the door? Maybe even under your own name?

Yeah, a lot of us would tweet as ourselves. People who follow Nat Friedman on Twitter will notice that he tweets a lot of just random questions, but all of us might do that as well. We might post on LinkedIn, or share in repos that some of us are contributors to. I think there's a big misconception that anything research-related has to be rigorous, so I'm actually really glad that you brought that up. It is possible to do very weak research and use it to confirm your own biases, and that is a thing that I see. But if you go in with the intent to try not to do that, it is often better to ask a few questions in a very casual way than to ask no questions and save it all up until you have time to do research "the right way." I will often hear from people who say, "We don't have time to research that now." And I'll say, you have time to ask a few questions; there is always time to fit it in.

Yeah, absolutely. It's interesting to hear the behind-the-scenes, right? Especially when navigating Twitter, I've definitely seen so many times people asking questions, but I never thought of them as something professional, or as someone actually trying to get answers.
It's just the way social media sometimes works, right? You shoot out questions, you start conversations, and so on. And I love hearing that on the other side these are actually real initiatives where you try to figure something out and get some quick answers. And I totally agree: it doesn't always have to be rigorous, and it doesn't have to be slow. There are so many techniques to get fast responses. This can be community-driven. We've talked so much about communities in the last couple of weeks, but I've also known Reddit as a very good space for bringing up these questions. Very interesting.

Yeah. Again, I wouldn't take a Reddit thread or a Hacker News thread at face value, but the things that people are talking about and interested in are a signal of what people are curious about or skeptical about. You can use that to shape research, and sometimes it's as simple as figuring out what kind of method would work. The other thing, to speak to that very lightweight research, is using support as a means for asking a question. That's another thing that I always love to do. If you've got something out there in the market, you've got someone writing to you for support. I would say you always want to answer their question first, but now you've got someone who hopefully is happy because you just answered their question. There's a captive audience. It's such a great time to ask them something in return. It doesn't have to be as formal as an interview. It doesn't have to be the world's most carefully crafted question. It could be something like: How does this make your life better? Or: What's the one thing you wish we would change? Those, as very open-ended questions, can give you a lot of insight into either what you should change or what you should ask next. And you also save a lot of time.
Instead of always scheduling customer meetings, going out and talking to them, and preparing all the meeting notes and the post-work as well, you can actually make use of the internal resources you have anyway, which is customer support or possibly a sales team, and get connected to customers.

Yeah, absolutely.

Maybe a question here, because it sounds like you are definitely working with a lot of different people in the company. You mentioned the educational part for designers and PMs, helping them ask the right questions, down to performing your own research, to even leveraging customer support departments to get insights. So what would you say are the most important stakeholders or relationships within the company that you have to build to get your research department in a good place?

In every company that is different, because different companies are driven by an engineering culture, a product culture, or a design culture. It's really: who is that person, at the leadership team level, at meetings that I'm not in, who can speak up and say, "Look, this is crazy. We're doing this all wrong"? That tells you a lot about who controls things. So basically, the person who can veto. In any given group, is that the product managers, the engineers, the designers? Whoever has that veto power needs to understand why these questions are useful and what they're going to allow them to do. And I think one of the things research teams sometimes have a challenge with is that if you haven't worked with this kind of customer-development-style research, where you're really trying to build a community and build a product with someone, you might think of research as being usability testing. Like, oh, the thing is done, let's see if people can use it. That's really the lowest risk that any product has, because usability flaws always exist and they're pretty fixable.
You know, even in the worst-case scenario, you can do things like make the button bigger and make it red, which is not an elegant solution, but could be a workable solution. And I think we tend to overemphasize elegant solutions, by the way; whenever we're in consumer mode, if the thing works, I'm happy. But what you really need to understand is: what is someone actually doing? I work closely with our data team, and they're excellent. But when you look at metrics, what you're seeing is what people do in the product. You're not seeing what's in the other tabs in their browser window. You're not seeing them flip back and forth, or the papers they have, or, when we were in offices, the whiteboards that were up. That's really what you get out of customer interviews: having people talk you through the process, and being able to stop and identify the parts where they didn't go into detail because they thought it was boring or they were slightly embarrassed. But actually, the workaround they're engaged in today is a huge opportunity space for you or your competitors to jump in and say, we're going to solve this so that you don't have to have 12 browser tabs open and copy-paste from one to the other. It's things like that: realizing, wait, that is a massive time suck, and it's very error-prone. Maybe it's not a highly technical solution, but it's something that can make someone's life much better.

I would like to open the box of research a little bit, and not Pandora's box. When I read your book, I think it was four years ago, it completely changed everything, because first of all, I understood how important it is to talk to customers, and secondly, to ask the right questions. Maybe we can go a little bit in the direction of: what do companies need to do to start doing research in the right way? What are the common pitfalls, and what are your tips to make it as efficient as possible?
So I know it's actually a bad question, because it isn't just one question, but I would like to better understand how to start from scratch and how to make it as good as possible.

Yes. So the first pitfall that most companies fall into is not being ready to hear the answers. Because once you start talking to customers, they will answer in ways you don't expect, and they will answer in ways that contradict your deeply held beliefs. So with teams who haven't done this before, I'll often lay that down as an expectation: look, we're going to talk to customers; they're going to say a bunch of things that make us uncomfortable; you'll get used to it. Which is true, you do. It's like you form a callus and you get used to it. But the other thing that goes along with that is asking: what kinds of evidence are we comfortable acting upon? And I think that can evolve over time. For many organizations that haven't done research, there may be a very high bar that evidence has to meet. Say the director of product says this: how many customers have to disagree and show evidence before we are ready to believe them? Actually having that conversation in the abstract can be really helpful, because then no one's personal feelings are at stake. You just say: hey, I know we have these beliefs, we have these hypotheses. Before we go out there and start talking to customers, it'd be good to understand what we would be willing to act upon. And at the other end, it's funny: my research team will often turn down research projects because we don't think they can be acted upon. People will sometimes come to us with a question, and we'll say, we agree that sounds interesting, but if we were to find out this answer, how could that impact the product? And if the answer is that there's not going to be time to incorporate these findings, or for whatever reason we just have to build it anyway, then you should ship to learn.
Put it out there. If something catastrophic happens, you fix it then. Not even catastrophic: if people are complaining, you fix it then. But the point of this is really to make changes. It's really setting that expectation: we're going to hear things, and they should be things that cause us to reconsider or reprioritize what we're doing.

Yeah, I think that's a super important point, because often you do research simply to validate your thoughts, and not with the goal of reconsidering. It's a very different starting point. And I would assume it also helps you reduce bias in the way you do research. How do you look at this?

The most valuable starting point is to be able to show people an example of bias in action. And this is actually easier than you think, because in every situation there's always something that we "know" about a customer. I put "know" in quotation marks. We know that this customer does this, or feels this, or has given us this answer. And you set up a situation where you engage with that customer. Usually I don't have a relationship, so it's okay for me to come in and ask naive questions. I can say, "Oh, hi, I'm new here. I know you've been over this ground with Christian before. Forgive me, I'm just going to ask again. You've said that you'd really love for us to add such-and-such feature capability." And the customer will say, "Yes, of course you should do this." And I'll be like, "Okay, look, since I'm new here, just to be sure I'm clear: what would that allow you to do?" And the customer will talk about some problem that they have. "Oh, okay. So if you were able to do that, what would the impact be? Or what's the workaround that you're doing today, while we don't have it?" Through that conversation, you'll see that sometimes with a feature that seemed like a must-have, there's not really a workaround for it, which means it's not a must-have. If someone does not have a workaround in place, then it's not a deal breaker.
It means that somehow they're surviving without it, without even any version of it. And so you should question that need. Sometimes, actually pretty frequently, when people ask for a solution, it is a solution to a problem, but usually the engineers in the room are sitting on their hands because they can see that, oh my gosh, they have this problem, and that is not the right solution at all. So you have this conversation, you thank the person, you hang up the phone, and then you have a little debrief and say, well, what did we learn here? And there's always this sense of: oh, we were so sure that this was the problem; we didn't understand; here's why they were asking for that. It's just a version of the five whys; anyone who's been to business school would recognize this. But there are so many cases where a customer will say your product is too expensive, and then through questioning, you realize it's not too expensive. They're spending this money on other tools. What they're saying is: the value I'm getting is not worth the price that you're charging. So then the question is not how to lower your price, but how do we bring that value in line with the price? And there are usually ways to do it. In fact, customers will usually tell you what the ways are, what you need to do to get their money. It is amazing the degree to which people will tell you that, if you just frame the question in terms of their needs: what are you trying to do?

And what do you do if you realize, okay, here is a requested feature which makes absolutely no sense, for whatever reason? How do you say no to a customer?

I wish I could say that you just lay it out logically and customers say, "Of course," but we know humans are not logical. So I'd say you walk through that journey together. Again, it's customer development. You're not just saying yes or no, you're explaining why.
I'm not going to say that this always convinces people, but I will say that most people can disagree and commit if they understand the why. They'll hear your why. They might say, "I don't agree with you, but I can see that you put thought into this." So a customer says, "I really need this feature." And you say, "Okay, interesting. Just to be sure I'm clear, what would that allow you to do?" And they say, "Your competitor has it." And I say, "That's true, they do. Have you used our competitor? Okay, no, you haven't. So I'm curious: if you were to switch to our competitor, and we hope you don't, what's the benefit you would get out of that feature?" "Oh, okay, so you don't know." (I'm pretending they're answering on the other side.) And you say, "Who do you know who's using the competitor and using that feature?" "Oh, I read about this company on Hacker News." As the conversation progresses, you can see that they don't really have anything personal in this fight. Whereas if there's something a customer really needs, and you say, "Tell me how it would make your life better. Tell me what you're not able to do," they have a story. They will say very clearly: "This is what I'm not able to do. Every time I do this thing, I have errors and I have to go back and fix them. Every time this thing happens, there's a bunch of manual work. I'm not able to access this market." There's a very clear story. When there's not a clear story, and it's just sort of a nice-to-have, then you can say, "I'd love to understand if there's anything this is really preventing you from doing. One of the things that we find as people building software is that there are always more things we're being asked for than we can do. So our focus is really on solving the problems that are preventing customers like you from doing important things. So I'd love to know: what is this preventing you from doing?"
And if they really don't have something, say, "I'm sure there are still gaps in our product today. What are the blockers? What do you wish you could do in this space?" And somehow you can pivot people to, "You know, actually, my biggest problem is this thing over here." So again, it's not like the customer throws up their hands and says, "You're right, we don't need it after all." But they understand why you're saying no, or why you're choosing other things that would benefit them or other customers more.

Now, I'm curious, just on the example that you made. Say you have a customer, and you see that there is something they really need to do, and they really have a story for a feature that would help them. And actually, the feature is already there, so in this case you've uncovered some sort of usability issue. If the customer doesn't know that the feature exists, but it is actually there, how would you then approach the conversation to figure out what to do next?

So first, I'd want to understand what they think this desired feature would be. Because as soon as they see our version, you're not going to get that idealized view in their head. So: "Okay, just to be clear, you'd love to do this. Okay, you wish you could do that. These are the constraints you're working within," say. And then you say, "I'd love to walk you through something that's in our product today. It may or may not meet your needs." And I usually say something a little bit self-deprecating, like, "Clearly we need to make this easier to find," because you don't want the customer to feel dumb. Because honestly, if they didn't find it, that is, in fact, on us. We have somehow not identified this and put it in front of the customer. So then we might say, "Here's what we have today. It sounds like, based on what you just told me, that you would use this in this sort of way. Where am I missing something?" And I talk in the book about the opinionated demo.
That's what I'm doing here. Instead of just letting them see it and say, "Oh, you have that. I guess I'll poke around it later," which is everyone's first instinct, you actually force them to walk through it: "It sounds like you'd use it this way. Is that right? How would you put your data in here?" What you want to do is vet that the feature you have is the feature they think it is. And there are often some gaps. Hopefully they're minor. But if not, it might be, "Oh, I clicked into that and I immediately clicked out when I discovered that I couldn't X." And that can be really valuable. In an ideal world, you've walked them through it and maybe it does solve their needs. Then you say, "Great, I'm glad I could connect you with this. Look, as you play with it more, I'm sure we didn't get it 100% the way we wish we had; there's always something. So I'd love it if you could provide any feedback on ways it doesn't meet your needs or ways you would like it to be better." And then there's a decent chance that you'll get a follow-up email from that person talking about the things they loved and the things they didn't love so much.

Now we are moving into testing and validating the solution, while at the beginning we talked more about the problem. So when it comes to the wording, I'm always a little bit confused between customer development and customer research versus user research. Where do you draw the line?

So in my mind, with user research, it's in the word: it's the person using the software, who's not necessarily paying you and doesn't necessarily have an allegiance to you. There are lots of products, if you look around your house, that you use, but if one disappeared and got replaced with something similar, you might not notice. If you're drinking out of a mug, really, unless it's your favorite mug, you could replace it with anything.
And from that angle too, it might be just about what you desire: the things in your house are the things that you like. In a business use case, it's really not just about the user needs. It's also about what allows this to be a viable business. So customer development is really focusing on the two sides that have to work here: we have to have a solution that solves customer problems, but it also needs to be something that we can sustainably build and support and continue to make better. Because in the long run, if there's a great product but it's not sustainable, you're not going to get to keep using it. Customer development is really looking at not just what someone says they want, their expressed desires, but what is the reality they exist within? What is the reality of the business cases you have? Anyone who's worked in markets with multiple stakeholders and users can tell you there are plenty of times when you might show software to, say, a teacher, and they're like, "Yes, but I'm not the one who gets to choose the software I use." In an enterprise, someone might prefer Google for Work to Microsoft, but they don't get to make that choice. So there's that balance of how all these things fit together. The solution is part of that, but really it's looking at who has the problem. There are usually multiple whos, and you have to take your own business into consideration. Can we do this? Should we do this? Are we able to be competitively better than the folks around us? Is this a business model that makes sense? Does it fit within the way that people are allowed to use and buy things? So it's looking at that whole spectrum.

And within GitHub, is there a clear separation when it comes to research, and who does customer research and who does user research? Or is user research part of customer research? Just to understand a little bit.
Yeah. We use the word customer research just because I think it puts a face on it. These are people who've chosen to use us. Many people are using GitHub for free; there's no money coming out of their pockets, but they have chosen to trust us with their code. They're trusting us with building out their relationships in the developer community. And they can easily revoke that. We want to remember that. So it's not just "can this user use this," which feels very abstract, like, oh, it's a person out there. I've found it's easy to demonize that person: oh, they don't get it. No, these are all incredibly smart people at different points along their developer journey, and they have all chosen to use our software, or we want to convince them to choose to use more things within our software. I just like to keep the customer front and center with that word.

The whole user research topic is quite big, but apart from talking to customers, I see another big challenge, at least something that, for example, Alex and I were facing when we were working together. The moment you have done your research, you go back with all the insights and all the learnings, but you still sometimes face people, founders, managers, or leaders who say, "I want this button to be red," or "I want this project to be stopped," or "The utmost priority is now this." What tips would you recommend to product people and researchers to deal with such situations?

You know, our internal customers are customers too. So to the person who wants the button to be red: okay, I'd love to understand why you want the button to be red. And there's usually a valid problem. It's, you know, "I don't think customers are going to click on this thing." And then that kind of bifurcates. Sometimes it's okay.
"So you're telling me that the absolute most important action on this page that we want people to take is this, and you think that people are not going to take it because they don't have a visual cue that it's the most important thing. Right, that is a valid problem. I'm sure we can come up with an elegant design solution to that." Or there's, "It sounds like we really want customers to do this thing, and I don't understand why they're motivated to do it. So we can make the button red, yes, and some people may accidentally click on it, but I'm not sure that's the solution you're going for. It sounds like we really need customers to do this, and right now they're looking at this page and they're not motivated to. For some reason, they don't understand why; they don't see any benefit to them. That's the problem that feels like it needs solving here." Faced with that, some people will just say, "Make it red." But most people will say, "You're right. I want people to do this action. So how do we get people to take this action?" And then they usually look at me: "You're the researcher. How do we get people to take this action?" Which is fine, because that's a question we can take on: if we want people to do this, what would make them want to do it? Same thing with "I want this project stopped." Okay, great. We'd love to understand why. What are the concerns you have? I also say the same thing with managers who want to add work. As a manager, I know I'm guilty of this as well, so I will say to my team, "Okay, I'm asking you to do this thing. That means something else has to give, because it's not realistic for me to constantly add things on top of the stack. There must be things that we're doing that are no longer as valuable." And if someone comes to you with "we need to do this too," it's realistically: of all the things we have, can we re-establish priority? And sometimes it's not easy to say what we can get rid of.
But to say, "Okay, just to be clear, these are the 12 things on my plate. It seems like you would prioritize them as one, two, three, four. Is that correct? Could you give me some information as to why?"

So you're basically just researching them as well. That's a very good point, and I think that was always something I never thought of: to really dig deeper and simply ask the question why, maybe even five times, to understand what my boss or my manager wants.

And it's tough, because I think there are some environments in which our gut reaction to "why" is to think that someone is arguing and saying, "I don't want to do this." I always talk about the notion of verbal padding, which is that comforting-sounding thing you say before the why, to communicate that you're not asking why because you're automatically saying no. It really is: "I'd like to understand the purpose so that I can do a good job on this. You can tell me to do this and I can just go off and take my best guess, but it will be better work if I understand the underlying problem you're trying to solve."

Which I think researchers do pretty well by nature, right? The way a researcher is seen in the organization is that they are good at asking questions, so the why is less of an attack. Because there are a lot of people who react defensively to it, and then they start arguing and trying to fight for their idea instead of just having a normal conversation about why they actually want to do it and what the reason behind it is. Christian, I think it was probably recently that we talked about design critiques and non-stakeholders sharing design critique. I'm not sure if it was one of our episodes, but we can dig it out. There we were also discussing, for example, when a stakeholder gives feedback on a design, really understanding: okay, but what value would this add for the customer?
For example, again, if we change a color: what do you think is the value it would add for the customer if we implement this change? Because that brings people into the mindset of actually thinking deeper about, or understanding better, why they are suggesting it. Is there actually something more than taste behind the suggestion, or what's the underlying problem? I think this is super, super interesting.

Yeah. And to your point about the difficulty of it, this is where stock questions come in really handy: you establish the exact wording of a question and reuse it, so it becomes recognized as, oh, that's a thing that we ask. For example: what's the most important action we want someone to take on this page? I'm trying to establish that as a stock question right now. When you hear someone else say it, it's recognized: oh, that's the question that Cindy asks, that's the question that Max asks. It makes sense; I'm going to take it as a kind of standard question as opposed to a personal attack. So I try to come up with questions that really work and just repeat that language over and over, so that someone else can use that language too. And when they do, it's like they're wearing the armor of a researcher, and everyone takes it in good faith in a way they might not otherwise.

I love the stock question idea. This is almost something I would like to integrate into a whole process: okay, for everything we do, we need to answer this question. It almost becomes like those company culture and mission statements that everyone knows off the top of their heads. Okay, what is the question you need to answer? Okay, there we go, thank you, let's build it.

I was just about to say that's a great way also to educate, right?
More research thinking and asking more critical questions, instead of just going blindly into a solution, as most people unfortunately do, or getting offended. So it's a very good tool to foster better communication when it comes to identifying problems and discussing solutions, in my opinion.

Yeah. There's a phrase, fake it till you make it, and I think that absolutely applies here. You don't have to be super confident to ask a customer what's the most important thing you need to solve, or: if you could wave a magic wand. And that is absolutely a stock question; it has been at every place I've worked since writing the book. It's super useful for breaking down communication barriers as well. With a customer, it breaks them out of this sense of, I have to ask for something that's reasonable and feasible. In the workplace, I find there are oftentimes two teams with colliding ideas, and they're both starting to get frustrated. That's a really powerful moment to say: okay, wait, let's forget about how things work and about structures. If you could wave a magic wand, what outcome would you want? And when you can skip past all of that detail and find that these two arguing teams actually want the same outcome, it's like: great, look, we actually want the same thing. Now, how do we work towards that? It's very, very productive.

Same with being a manager: I know that with people two levels under me, there's definitely a power dynamic. If I'm saying something, they may be a little hesitant to say, I think Cindy's wrong. And that's where I can say: okay, wait, magic wand, what do you actually want to happen here? It's a way of freeing that. It's saying: forget for a second, do your best to forget, that I'm the boss, and tell me what you're actually seeing.
And it's very common that the person doing the on-the-ground research has better context, that my idea wouldn't actually work, and that getting them to blurt that out in response to the magic wand question helps me say: oh yeah, that's a great idea, let me help you take it to reality. I've heard it from many other people as well. And it always delights me when I'm just in a meeting eavesdropping and I hear someone ask someone else the magic wand question, because I know that what happens next is real talk, and it's going to jump people over ten minutes of arguing straight to: wait, this is what we need to do.

I would just jump in, because Christian mentioned earlier that you come back with a lot of insights from all the different research, and obviously there's a lot to it. I was immediately thinking of one other problem we had back in the day: how do you actually make all these insights accessible to everyone? How do you make sure that the information, insights, answers, and pain points you gather get to the right person, or are stored in a way that someone who, at some point in the future, is working on a feature where an answer might be relevant can actually find it?

That is the biggest problem, especially in large-company research: someone somewhere has uncovered an insight, and how do you get to it? There have been various attempts at creating comprehensive databases. Microsoft actually has one that's incredibly useful and incredibly detailed. It's also a lot of work to contribute to. So there's a balance there: when I look in that database, I see incredibly rigorous research that I feel very confident making assumptions based on. On the other hand, if you have a culture like GitHub's, someone is not going to fill out a giant report and put it into that database, but they will submit an issue to our repo.
So we have an internal repo, and that's worked really well. When someone has learned something from a study, either on their own or working with their research partner, they'll write up a quick issue: basically, here's what we learned. It might be a whole study, or it might just be: hey, these are five really interesting points we learned about computer science program administrators. Even if we don't know where they're useful today, we're sure they might be useful in the future. So we'll tag it and throw it in the repo. It's pretty easy to browse through there, search through there, and find things.

The other thing is that everyone who's worked for me will laugh that I make everyone write a TL;DR at the top of anything over a paragraph. You can have a full research report; you are absolutely welcome to have a beautiful presentation. I want there to be a TL;DR with the five bullet points that were the most surprising or interesting, or that you think someone in the future should stumble upon. I always frame it as: think about a meeting that you're not invited to, where you would want someone else to bring up this point. Bullet-point those, because you need to have these soundbites. It's like the stock question. If you have a soundbite like: developers are frustrated at this, or maintainers want more stars and fewer open issues, then people go: oh yeah, that makes sense. Sometimes it's enough to stop a meeting and say: oh wait, let's dig into that more. Sometimes it's just a reminder: hey, we should go talk to the research team and remember why we have this soundbite in our heads. I think any solution is very person-centric. No matter how good the internet gets at categorizing information, I've found that humans really just want to ask humans for recommendations on what they should read. So try to make that really easy.
Cindy, from your experience, since you have headed a couple of design, research, and product teams, I would like to ask you as a closing question: what is the biggest mistake that leaders make that brings people into a situation where they're not collaborating as a team in a synchronized, optimal way?

I think the mistake that leaders make is not explaining why. When something comes down as an order from on high, everyone has their own interpretation of what it means and what their role in it is. Whereas it's different if it comes with the outcome: yes, we're doing this; the reason we're doing it is this; if we don't do it, here's what will happen; here's what success will look like; here's why we're the people who should do it; these are some potential risks; we are willing to incur up to this much slowdown, debt, or additional cost to achieve this. Those are the questions that get debated after the director or the VP leaves the room, and the answers that teams come up with are not always the ones that have the full context. So I always think of it as: if you're going to say something new or potentially unpopular, just think in your head what all the skeptic questions would be. Why are we doing this? Why is this good for me? How do I do this on top of my own job? Why don't we just let so-and-so do this? Why don't we do it this other way? The more you can explain that why, the more you have empowered teams to do a good job on their own. And it's hard; it takes work. I feel like the higher up you are in an organization, the harder it is to be casual and glib, because people can misinterpret your words, and then you're like, oh, I didn't mean for people to do this. But starting with why is a big one.

The other one is having some folks throughout your organization who you can trust to give you the real story. Sometimes those are people who are very senior in a specific area.
Sometimes they're just people who are very blunt. I feel like I've been that for leaders on occasion, and it's a really valuable thing to say: you said this; here is what other people, and I'm not going to name names, here is what I am hearing in the hallways. Some of these you may want to address; for others, just know that they're happening. So those two things I think are tremendously useful.

Great. Cindy, thank you very much. That was very insightful, and I believe there are a lot of actions people can take after listening to you. Thank you so much for having me. Yeah, thank you. I think all the very practical examples are super helpful, so thanks for sharing them, and have a beautiful rest of your day, I would say. Thank you.

Alex, what is the next topic you're going to research? Considering that I will start a new role next week, I'm pretty sure there is a lot of research to do internally as well. I mean, we talked about asking the right questions to stakeholders, to peers, and so on. Of course, it will be super important for me to understand the different goals internally, and then to get to know all the customers and put my focus onto that. But as I mentioned earlier in the call, Cindy gave some very good and practical examples, and I will definitely take a few of them and apply them immediately. The highlight for me is the stock questions: as I said, I think it's a super good tool to even include in a process, or to form a process out of these stock questions. How about your new role? What are you going to do now?

Yeah, on Monday I'm leaving consulting after this one-year excursion, and I will be starting as head of design for Moonfair, which is a fintech here in Berlin.
They are currently really scaling and growing their teams. Pretty much what they're offering is that they're trying to make private equity investments accessible to private investors and private individuals. I think it will be a super interesting topic to research, because we're talking about a very niche customer segment, right? Obviously, it will still take quite some time until private equity really becomes accessible to everyone, so for now we're talking about high-net-worth individuals, and therefore there will also be quite some work to find the right customers to talk to.

Nice. What about you, Christian? What was your impression, or what is your next action when it comes to research after this talk? Since I'm still in consulting, I'm sitting in many meetings where people believe that their opinion is the right one. So asking questions like, what do you think will change when we make that button red instead of blue, and trying to better understand the motivation behind such decisions, is one thing I took away from today. On the other hand, what I really liked is the distinction between user research and customer research. What I took away is that customer research is about understanding the underlying problem, while user research really goes into testing solutions: understanding whether people would use it that way, and whether it solves the problem you want to solve based on your hypothesis. That was really actionable to me, and something I would love to dig deeper into in the future.

Awesome. I think with all these great guests and insights we've gained in the past, we should really at one point try to connect all the dots: how to run the research, and how to fit the research into the rest of the company.
We heard about education, so: how do product managers and designers do their research, how do you combine it, and how do you put the whole ops umbrella above it? Yeah, we should think about a good format for getting that out into the world, because I think it's super useful. I think we should do what a couple of the people we have interviewed have done as well: we should write a book, Alex. Let's consider it, because we've talked to quite some people in the lean tech space, or in the lean series. Lean product making. No, okay, let's not mislead people. Whenever you speak about such things in public, people think we're actually advertising some new book coming out. No, we're not writing a book. Not yet. It's not yet on our roadmap. On my personal one, it is. Okay. Also, Christian might write a book about baking, hopefully.

But yeah, obviously, we will keep thinking about how to share the most valuable insights with you. With that, there's nothing left to say but: follow us to stay tuned. Exactly. And in case you have an idea, let us know; we're always open for feedback. Yes. Beautiful. Drop us a line, write us an email; we're happy to do some shout-outs in the future too. hello@product-bakery.com. With that: have a great day, have a great night, have a great morning, and we'll talk to you soon again.
