44. Reggie Murphy of Zendesk (part 1)

This episode of Dollars to Donuts features part 1 of my two-part conversation with Reggie Murphy of Zendesk. We talk about aligning the work of the research team with stakeholder OKRs and empowering non-researchers to do user research.

The researcher would go into these meetings and say we’re going to do an “I Wish I Knew” exercise, where we start thinking about what we’re building for our customers, what are the questions outstanding that we still don’t have an answer to. We’d go through that exercise, and then we’d prioritize that list. I can’t tell you how valuable those exercises were and how our stakeholders looked at us and said, “Wow, I did not know that research could add this kind of value to our conversation,” because it really helped them see. You know, that question that we’ve been battling around in these meetings isn’t really the one that’s most important. It’s this one. And to see it all together was a revelation for some of our stakeholders. I can’t tell you how important that was. – Reggie Murphy

Show Links

Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with people who lead user research in their organization. I’m Steve Portigal.

In this episode, I’ve got part one of my two-part conversation with Reggie Murphy, who is the Senior Director of UX Research at Zendesk.

But before we get to that, when I went to revise Interviewing Users, I started with all that I had learned about teaching user research over the past 10 years. In fact, I’ve been leading user research training workshops for a lot longer, and that was a key source for the first edition way back in 2013. Across all this time, helping folks get better at user research has been a big part of my consulting practice. Sometimes I’ll teach a workshop as part of a conference, but mostly I run workshops for a company or a team within a company. Often the teams are interested in up-leveling skills, developing a shared language around user research, or just being more effective at customer interviews. Regardless of the experience and skill mix in your organization, I can lead a session that will help you learn and grow together. The people that get value from training with me include those with almost no experience with research, people who have responsibility but not a lot of grounding, and folks with a great deal of expertise. I typically find a mix in any group, and learning together in real time with me and with each other, whether it’s in person or remote, is an excellent way to improve your organization’s research practice. Please get in touch to find out more about how I can help you do research better and do better research.

I recently spoke about user research with Hannah Clark on the Product Manager podcast. I’ll link to the whole episode, but right now, here’s an excerpt.

Hannah Clark: Something I really wanted to make sure that we end off on before I let you go is bias. Because we always hear that interviewers should be as unbiased as possible. So how should we think about bias in a user interview context?

Steve: Right, and that word in our society, outside the domain we’re talking about today, it’s a really bad word, right? It talks about discrimination, racism, sort of everything that is screwing up our society is bias. So the word is a bad word, but I think when we talk about bias in interviews, we’re talking about cognitive bias.

And one thing is I’d encourage people to be a little more forgiving of themselves. It’s how our brains work, right? There’s reasons why human beings have these biases. Confirmation bias is where you hear what you expected to hear. You already had an idea, and so when you hear somebody say that, you’re like, “Yeah, see, I was right.” So that’s not good, right? You want to sort of do better at that. And there are tactics for this, like, “Hey, before we start doing research, let’s all talk about what our assumptions and expectations are, not as hypotheses to test, but just to say them out loud or write them down so that they’re not kind of clenched within our chest, but they’re just things that we can look at and like, ‘Oh yeah, this is a thing that might happen.'” And that sets you up a little bit better to see those biases when they come up and to let go of them, to have things confirmed, but also see something different.

But there is a compassion for ourselves that’s necessary. I want to offer just like a short story about my own encounter with my own bias. It’s kind of a story about me doing something not great, or at least feeling something not great and kind of overcoming it.

And it’s a story about going to interview small businesses and going into this agency, and the agency had the name of the founder on the wall, and I could go in. It’s like this creative environment, lots of fairly young, hip people kind of riding around. And I’m there to meet the founder, which I don’t know how I would get access to this person. And they come out, and they’re older than I am, and I was younger than I am now. And we go into this conference room, and we start talking, and I’m asking about goals and expectations and planning, all the stuff we want to understand about small businesses. And I realize at some point, this guy’s talking about his short-term goals and his long-term goals. And I realize that I am surprised by his articulation of long-term goals. And I realize that the reason I am surprised by it is that I had judged him. I mean, it’s my own ageism. I had decided this guy is sort of this figurehead founder who doesn’t really, is not really involved, and is like there to interview, not do the work. And it’s a terrible story that I had come up with about him based on what I brought into the room, my own biases, my own ageism. And so when I realized that, this is all happening in my head. Like, I’m asking questions, he’s giving me information. And I realize, oh, my questions are sort of based on my mental model, which is completely wrong.

And it doesn’t always have to be a horrible ism like this. We all have our mental models about people. And if we can hear them being wrong, then we can redirect. We can like, oh, tell me more about your long-term goals. And so as a researcher, yes, I wish I was a person who didn’t participate in isms and was not ageist myself. But we all are some amount to some extent. And I think, you know, there’s a very unkind way to have that manifest itself. And there’s just a more normal human level of that. And I don’t know that I can be the arbiter of what those levels are. But I’m not saying this to be proud of my ageism, but just to be kind of full disclosure with everybody.

When I had that moment, it was like really, really awesome. Like it was just such a great feeling of like it was even joyful, like, oh, I was wrong. I understand this person in a like a much deeper way. And actually to get over myself to be able to do that. My goal is to understand this person in a rich way. That’s what’s really exciting about research and what makes me be able to go get this stuff and kind of bring it back. And so again, they’re not always this extreme, but you often have to get over yourself a certain amount.

So I mean, what was happening for me was I could hear myself. I could hear where I was almost clenching, like trying to steer his story a certain way. And he was steering in another way. And just to feel that tension between what he wanted to tell me and what I wanted to talk about, which happens in every interview to a certain extent. And some of that’s just topic based. But here it was sort of identity based.

And so there was insight for me in that about myself, about the topic, about this guy. And I say that at the risk of being judged for my own bias, but with the hope that we can all get better at hearing our own bias and in the moment and kind of grappling with it and being intentional about the choices we make in the interview to get to what we’re trying to get to.

That’s from the Product Manager podcast. I hope you’ll check out the updated edition of Interviewing Users if you haven’t already. So let’s get to the first half of my conversation with Reggie Murphy, the Senior Director of UX Research at Zendesk.

Reggie, welcome to Dollars to Donuts. It’s so excellent to have you here.

Reggie Murphy: It’s great to be here, Steve. Thanks for the invite.

Steve: I like how we’re talking about here as if we’re in a physical place. We are just on a website talking to each other.

Reggie: We are in a virtual place.

Steve: Yes, I’m virtual here. First of all, what’s your title, your job role?

Reggie: I am Senior Director of UX Research at Zendesk. But for the past couple of months, I have been in an interim role as Head of Design for Zendesk also. So I basically have two jobs right now.

Steve: Just two.

Reggie: So pretty busy.

Steve: For people that don’t know, what is Zendesk?

Reggie: Zendesk is a complete customer service solution. So we help companies develop really positive and connected conversations with their customers through a system that allows them to get feedback from their customers to hear issues and concerns. And our software allows our customers to triage and prioritize in tickets. So our main software is a ticketing system. And so our customers and users can share what their concerns are, issues are, and the system will enable the agents, the customer service agents at our customers to resolve those issues and problems in an efficient way.

Steve: So the users, I guess we have users and we have customers.

Reggie: Right.

Steve: Just thinking from a user research perspective, who is it that you are learning about, what categories of individuals do you learn about in order to improve the product?

Reggie: So our primary customer, if we think about cohorts that we conduct research on, there are two. First, it’s the administrators. And that is the people who set up our platforms. And they could be so they’re like the director of CX, customer experience at an organization. They could be director of customer support. They could also be the chief information officer. So there is a variety of roles of folks who are responsible for purchasing our product and installing it within their organization. And then there is someone that we call admin or an administrator who sets it up so that their organization can use it. So that’s one cohort that we conduct research on to learn how that we best improve the product for them.

And then you have the customer service support agent. And that’s the title in some companies. And that is the person who is on the front lines talking with the end user. So their customer. So for example, Grubhub is a client is a customer. So their end user would be someone who say just ordered some food and maybe has an issue with their order and they go online and they use say the messaging system to alert Grubhub that, hey, I have an issue with my order. I didn’t get the right thing. So that’s the end user. And so the customer service agent at Grubhub is using our system to talk with that end user to help resolve that issue. And so we spend a lot of time researching the admins, how they are setting up the system and the agents and the interface that they’re using in order to talk with the customers that are using their product.

Steve: Do you ever deal with the end user of your clients’ customer support systems?

Reggie: Not a lot. It’s, you know, we have to make strategic decisions about the work that we do because of the size of our team.

We’d love to do more work with the end user. And we’ve done a few projects. But by and large, we spend the vast majority of our time speaking with admins and agents because we feel like that’s where we get the most bang for our buck and who we’re really building for. And I believe if we do the right thing and we build a thing right for the admins and the agents, then the end user will have a delightful experience also, hopefully. Yeah.

Steve: So you’re talking a little about, I guess, where you have to prioritize and what are the areas that you’re thinking about the most. Can you maybe say a little bit about that in terms of establishing research or building research? Where have you decided to look and where have you decided to focus?

Reggie: So when I first joined, and I talk about this in a podcast I recorded for Maze recently, where we were a team of 11 researchers. I have two research operations folks and I have two research managers. So a team of 16, including myself, relatively small, considering we partner with 200 plus scrum teams in the organization, in the broader product organization, which is about 1900 people. So relatively small team.

So when I first joined Zendesk, we were spread out throughout the organization. We were getting work done, but most of the majority of the work was primarily tactical. And we were working towards the end of the product development life cycle, sort of in the evaluation phase. Okay, we’ve already built the thing. Now let’s go and talk to customers and see if it works. And great. We needed to do that type of research, but I believe just philosophically that UX research teams, especially one as small as the one I had inherited when I joined, we needed to be working more towards the beginning of the product development life cycle. Because I thought that that’s where, I think that that’s where the most value can be extracted from a user research team.

Over the past couple of years, we’ve been shifting our work to achieve that goal. And in doing so, the way we’ve done it is we have aligned the work that we do to the company’s OKRs. So each year, a company has about six or seven big OKRs. And then those translate into other OKRs at the organization level. And we align to those big OKRs. And so I’ve spent the last couple of years orienting our team around that structure. And it’s made our ability to conduct more strategic research at the beginning of the product development life cycle better, more efficient, and we’re adding more value.

We still, of course, do tactical research. We still do usability testing and concept testing. Should we go with concept A or B? We still do that. But we’ve seen better success now that we are oriented in this way. And I believe that we’re going to continue as our team, hopefully we get bigger, but let’s say, assume that we stay the same size for the next year or two. I believe this helps us work on the most important work at the company. Now when we do that, obviously there’s some research that still needs to be done that maybe our team can’t do themselves. And that’s what I talked about in the Maze podcast.

And so over the past couple of years, we’ve undergone a pretty extensive effort to democratize some of our research methods. Obviously you can’t, you know, someone who’s not a professional researcher, there is certain types of research that we believe that we can enable them to do. And then there’s other types of research, the more extensive, deep research that we believe that, hey, let us handle it. And so we’ve developed a pretty extensive plan that I’ll talk about in that podcast to help designers or product managers who we may not be able to partner with very closely on their particular area. We’ve enabled them to do very basic user testing using the usertesting.com platform or just basic interviews just to get some feedback or insight for a particular customer problem that they’re trying to understand.

So I think we’ve been pretty successful developing that system over the past couple of years so that we have some balance now. Our researchers are doing really valuable strategic work, foundational research, some tactical research, and we’ve enabled other functions, not researchers, to do research that we aren’t able to help them with on a day-to-day basis.

Steve: I will put a link to that interview in the show notes. This idea of giving other folks tools to do certain kinds of research, democratization, I’ve heard people refer to it as the D-word. It feels like a controversial area. Just having worked on this for a period of time and had success, do you have a hot take on the hot take, I guess, about democratization?

Reggie: I’ve read a lot of hot takes on this lately. I do. You and I have been in this business for quite some time. And those of us who have many years of working in different companies and different settings, I’m a little old school, so I believe that, hey, at one point, let me just say at one point of my career, only researchers should do research. I took a hard line on that.

But when you work with other functions in the way that I’ve worked across my time at Facebook, well now Meta and X, formerly Twitter, and some other companies, you learn that there are other functions like some designers and PMs who also have research experience that if you were able to enable them a little bit and provide them some guidance and the right tools and set the right boundaries and parameters for how they structure a research program, that they can do valid research that they can be confident in.

At one point, yeah, I admit, Steve, it made me cringe that I would allow someone to come in and do a usability test without me or without the team. But I’ve backed away from that in that because we’re now in a world where we have limited resources, headcount, we have constraints on time, we need to build and ship. And so we have to be flexible and adapt. And I think over the years, I have begun to really understand what that means in terms of delegating research. And now I don’t cringe as much because I am working with very capable, very intelligent people who are open and understanding that, “Hey, look, I’m not a researcher, so please help us. We really need assistance here.” And so at Zendesk, we’ve set up a really solid model where we can be confident in the other research work that other people are doing.

So I don’t know if that’s a hot take, that’s a meandering answer, but I no longer have that firm hardline stance that I did back in the day because now I have worked long enough with cross-functional stakeholders that just want to do the right thing. They’re not going, you know, maybe there’s some rogues here and there, but I’m working with people who really want to learn and understand customers and they’re willing to listen and learn and be educated by a research team in order to do it and to do it well so that they can be confident in the outcome.

Steve: And just to affirm the deep roots of that cringe, the fear of what other people will do, way, way back, I remember working at an agency doing in-context research. And our clients would occasionally ask to show up and sit in. And so we’re not even talking about leading or have any ownership of or any responsibility for, but just be present during. And that was a horrifying concept, that just the mere presence of somebody, they would disrupt this elegant, finely nuanced dynamic. And I think we kind of started to compromise and we would stage this. We talked about re-interviewing somebody or finding a friend of a friend, recruiting something so that we knew it wouldn’t screw up. In a consulting situation, again, early days of the work, so it wasn’t maybe well understood.

Reggie: Right, exactly.

Steve: Just the sense that this was fragile and could be ruined, and that would hurt the work and hurt us and hurt the relationship. I think the roots of what you’re talking about go back and back and back. And when we say them now, it’s ridiculous. What I’m saying sounds ridiculous. How could you ever prevent people from coming or not want them to come? Now we want the opposite, right?

Reggie: I remember some of those days, but I think in the second company I worked for, which was a media company, the Gannett company that owned USA Today, maybe eight or nine years into my career there, we began to do more ethnographic research. This is when I was doing a lot of training with the IDEO company and design thinking was the thing at that moment.

And this is why my stance on this softened because we spent a lot of time in the field with other non-researchers. We would intentionally invite them so that they could see for themselves what customers were saying. We were tired of going out and conducting a piece of research, bringing it back, and people not believing us. You know, because those are questions that come up, “Well, what did they say?” And there was skepticism.

And as much as we tried to convey the value of the work and do all the things to make people feel confident, you know, there was always that bit of skepticism. So we began to empower and engage our stakeholders and invite them to infield assignments. And that’s when I believe that I saw it from a different perspective. I saw some genuine, sincere desire to sit and listen. And we would tell them, we had a sheet and I still have some of these administrative protocol sheets where it’s stakeholder, basically how to be a good stakeholder 101. Here’s what you do. Here’s what you don’t do. Just sit and listen. Don’t say anything. So after a while though, we would allow them to probe, ask questions because we would educate them on how to do it.

Now for those who are still on that hard line today, who would cringe at me even suggesting that we do that, I would say that, yeah, there are probably some cross-functional partners who may not be the type of person who would do very well with this responsibility if you give it to them to do it solo. So then what do you do? Well, at Zendesk, we have sort of a consultancy model. So if you’re a PM, product manager, and you have something that you want to, you have a question, you don’t have a research partner. We have a research request process that we triage these requests. And then we set up time, office hours, time with a researcher, and we’ll consult with you to understand what is the problem that you’re trying to explore and understand. And given the level of urgency or what we believe to be the level of understanding that you need, the precision that you need to have to have a really successful outcome, we’ll decide I think you can do this on your own. Here are some tools. You can go into our learning management platform. You can do our certification course and here are the tools, go for it. Let us know if you have any problems.

That’s one route. The other route is after that consultation, maybe we don’t feel confident and maybe the person who’s asking for help may not feel as confident. And there we can make an informed decision about maybe we go into the direction of, “Hey, we’ll consult with you along the way. It’s not like you’re flying blind.” That gives us, the research team, confidence and comfort that we’re doing the right thing and we’re not sacrificing integrity or quality of the work and the outcomes are what we want or what we envision them or expect them to be from the research.

Steve: I’m having a small, if not big, aha from that. Some of the discussion around democratization is, yeah, look at the nature of the work, the urgency, the complexity. Sometimes it’s around the method. But I think if I’m hearing properly, you’re also saying, look at the team, the requester, and their capability. Educatability is maybe not the right word, but you want the work to be successful and who’s gonna execute it is a factor there. So you choose how to respond. That’s something that you factor in in choosing what path to recommend.

Reggie: Absolutely. I have another example to give you. Recently our company had its biggest customer event. It’s called Relate. It was in Las Vegas this year. The product team took a lot of people to the event this year, which hadn’t really been done previously at an event like this where you have 1,500 customers. And the research team, the design team, we were all there and we set up an area where customers could come and talk with us about various product ideas that we were launching at the event, but also to help us look into the future. We called it co-designing the future of Zendesk. And it was just three researchers, well, four researchers, including myself at the event. We had designers and the remainder were product managers.

So in order for us to be successful at these customer conversations, we developed sort of some tools, some guidance that we shared with the organization. Say, “Hey, if you are speaking with a customer, here are some questions that you can ask them and here’s how you can ask them.” And we enabled them to, I believe, have some really fruitful, enriching conversations. And I felt good about that. I felt good about what we learned. And I think as you grow and mature in your career as a people leader, which I am, but even as a UX researcher, you begin to develop sort of this level of, okay, it can’t be all bad. Like there are ways that you can adapt and get the type of outcome that you expect. Is every piece of research that a non-researcher will do will be perfect? No. Will they make mistakes? Yes. But on balance, I believe that if you set up the right parameters, provide the right guidance and consultation, then you can really get the outcome that you want. I think that we did that at this conference this year.

Steve: Did you see some of your non-researcher colleagues being successful in their interactions with customers?

Reggie: It was amazing to see. They were engaging, they were listening, probing in different places appropriately. Because we were sharing some concepts on a big corkboard wall. And to be a good researcher, and I think I made the comment in the Maze podcast that everybody can be a researcher. I’ve read lots of articles where I say, “Everybody can be a researcher.” I truly believe that any time that you are in front of a customer, no matter who you are at the company at Zendesk. So let’s just say Zendesk, I’m just talking about Zendesk right now. Any employee, no matter your title, anytime you are in front of a customer, you can be a researcher. You have to be a researcher, really. Because that’s how you learn what that customer is trying to convey to you. And I was so proud of us and the guidance that we gave them. But in watching how we interacted with customers, you would have thought it was a team full of researchers. And I’m not going to take all the credit for it, but because I think we were just working with, I have an incredible team, teammates, designers and product managers at the company. In order to have a really solid conversation with a user or a customer, in our case, by and large, all you have to do is really listen and ask why and how. And I think, you know what I’m talking about, Steve. You ask those questions that can really help move the conversation along so that you can get to the understanding of what that customer’s need may be. Right.

Steve: It’s simple and easy and really hard all at the same time. Maybe that’s why as a field we have kind of fraught conversations about this. Everyone is a researcher is, I mean, it’s a good hot take because I think it’s a distressing comment and what do we mean by research? What do we mean by researcher? I think you’ve provided the context, but that little pull quote, and I promise not to just pull that out of context on you. But that whole quote, it distresses people.

Reggie: Yeah.

Steve: But I think you’re framing it in a really positive way. And you’re right, it makes me wonder, and I am certainly guilty of this, collecting stories of when we see things go wrong. I mean, I like those because they’re good learning moments, but

Reggie: Absolutely.

Steve: I think there is a tendency or risk of focusing on this narrative that this is hard and people can’t do it. But when you just look on the other side of the mirror like I think you’re doing and saying like, this is hard and people can do it. It unlocks a lot more potential for the outcomes that we all want, the kind of information that we’re looking for.

Reggie: If I can say one more thing about this. What I’ve enjoyed about this process of enabling our cross-functional partners to do research, empowering them to do it, is that it makes them better product managers. It makes them better designers because now they have the customer in mind. And I can’t tell you how many organizations or teams I’ve been on where that wasn’t the case. And I’ve had to kind of sort of jump in the middle of conversation and say, “Hey, wait, has anybody asked the customer what they think about what we’re doing here?” And so I love that the organization that I’m in are thirsty for these opportunities to, number one, talk to customers, but number two, to learn from a team of expert researchers on how to do it. Because in the end, it’s going to make them better.

And I feel very strongly about that, that, hey, we’re not where we want to be, but I think our entire organization is on a really nice journey up the maturity curve on becoming more customer first and understanding the customer better. And these opportunities like the Relate event and just other projects that we’re conducting internally, I think are really good ways for our teammates across functions to learn, grow in their own careers and to be better at understanding how to gather information to make more informed decisions about what we are creating, building and shipping for our customers.

Steve: Maybe I can use that as a segue to go back to something you said quite early on in our conversation. One of the areas where you’ve been pushing to build that maturity, and you talked about shifting where research is focused from kind of evaluative late in the product cycle to more strategic and other things that are taking place earlier in the cycle. And you talked about kind of realigning. And you said a little bit at a high level about how OKRs were kind of the unlock, I guess, to kind of get in there. And because I think you’re describing a very common challenge that maybe less mature organizations face or research leaders face or research teams of one.

I’m only allowed to do this work. No one wants this other work for me. I keep recommending it. And it’s sort of that I think some people are kind of stuck in these little traps where they can’t change the perception of what value they can bring. And therefore, they can’t get the opportunities to bring that value. So could you unpack a little bit more about how you changed what the opportunity was for research at Zendesk?

Reggie: There were a couple of things that I did. First of all, the first six months, I just was assessing what our relationships were like with our key stakeholders. And I came to find out that, you know, while we had some syncs with them, they weren’t necessarily on a cadence. When I say syncs, one-on-one meetings with our design leaders and product leaders and engineering leaders. So, number one, we need to set up a consistent cadence where we are connecting with the leadership of these particular work streams on a consistent basis.

Then we would put ourselves, make sure that we are positioned appropriately in the important conversations that are happening earlier in the product development lifecycle. When the customer problem is being talked about, or what we think are the customer problems, researchers who are in that moment, and they’re in those meetings early on, are able to help that conversation understand, well, how do we know that this is a problem, number one? And I think it was very important at the very beginning to establish a clear and consistent cadence of syncing with other cross-functional stakeholders. So that was the very first thing, and it was developing that relationship. So I think the relationship that we had, and I might be exaggerating, was we have an idea, it’s pretty baked, research, come on in. Let’s tell you about this idea. Now let’s go set up a research program to test it. I’m not saying all the work was like that, but that’s what I observed. I said, we got to fix that. So that was the first, I think, thing that I really tried to do was develop a closer relationship.

Steve: So a very sort of tactical thing, you’re setting up a cadence of one-on-one relationships with the people in these other parts of the organization. And one of the outcomes there is to get into those meetings where a researcher can identify assumptions and challenges and so on. Are those meetings where those conversations are happening, are those literally the one-on-ones you’re setting up?

Reggie: Well, they were the workstream leads meetings. So there’s scrum teams all over, and they’re small, medium, and large sizes. But there were meetings where the design leadership, the product manager, the product marketing manager, maybe there was a data science leader. That core team that is working on that thing, I wanted to make sure that not only were we in those meetings at a consistent cadence, but that we were developing one-on-one relationships with everybody in that group. So the relationship is one thing where you’re doing a one-on-one, and you’re just learning about that person, and you’re really understanding what that person needs and wants.

But from a team level, now you are in the thick of it, and you are in those conversations, like you said, where the customer problem is talked about from the very start. And in addition to that, we installed some protocols or exercises or activities where the researcher will facilitate a conversation around what are the potential customer problems. We call it I Wish I Knew. And this was something that I learned about when I was working at the company formerly known as Twitter. And it was an exercise. It was essentially a brainstorming exercise.

And when I first joined Zendesk, it was right around May and June. And then I guess right around September, October, we started annual planning for the next year. And that was a perfect time to install this activity. So we piloted it in a couple of work streams. This was a moment where the researcher could go into these meetings and say, “This meeting, we’re going to do an IWIK exercise, I Wish I Knew exercise, where as we start thinking about annual planning and the products that we were thinking about building for our customers, what are the questions outstanding that we still don’t have an answer to or areas that we still need to explore?” And so we’d go through that exercise, and then we’d prioritize that list that they would come up with.

I can’t tell you how valuable those exercises were and how our stakeholders looked at us and said, “Wow, I did not know that research could add this kind of value to our conversation,” because it really helped them see. Because especially once you get into the prioritization work, you look and you go, “That question that we’ve been battling around in these meetings isn’t really the one that’s most important. It’s this one.” And to see it all together was sort of a revelation for some of our stakeholders. So when we installed that exercise, it changed the game a bit in terms of our relationship and our position or brand inside the minds of our cross-functional partners, because now we’re providing a value at the very beginning of the product development lifecycle that is helping establish, “Okay, so these are the problems that we need to be worrying about as we think about planning for the next year.” Can’t tell you how important that was.

And then, so if I think about the next part of this was as the company started looking at their annual plan and defining these big buckets of areas that we wanted to work in. So for example, intelligent conversations is sort of one of them, just throwing that out there. And this is all of our sort of artificial intelligence work. And we think about within that big bucket, there’s like maybe, I don’t know, I’m just picking this up, there’s 15 projects or 15 streams of work that need to happen. And we strategically aligned a researcher to that category of work to work with those teams to figure out, well, where can research lean into the most to have the most impact in that work stream that laddered up to that OKR. And when a researcher is in that position and they’re reporting out research, and at the very first couple of slides, this is the research project that is designed to learn 1, 2, 3, A, B, C that will help inform OKR XYZ. That is how you operate. That is how you become a valuable research team. It’s not perfect yet. I think we still have room to grow and get better at it. But over the first couple of years, I think we’re doing a really good job at setting up a structure where we are able to add that kind of value to our product development organization.

Steve: Can I play back some of what I heard?

Reggie: Yes.

Steve: Because I think this is a universal complaint and concern, right? Not getting access, not getting permission. You talked about the individual relationships with people and as part of that, being in those meetings, being in those team meetings. I think it’s on a case-by-case basis, right? How do you get invited to that? But there’s an audacity a little bit of showing up and saying, “Hey, we’re here to be part of this meeting,” then making a contribution. But I think maybe biding your time a little bit. I’m now putting words in your mouth. But building the relationships, being in the meetings, hearing what’s happening, and then choosing a point at which to bring value. And the value that you’re bringing there is facilitative of others. It’s not, “Here’s what we know. Oh, we have the answers.” But just, “Let’s help you talk about what you have to decide and learn.”

Reggie: You’re onto something. So yes, we’re doing that. But in addition, we are bringing some answers to that meeting because one of the other things I did in the first year is we established a research and insights library, a research repository. So we aggregated all the UX research that had been done to date at the company, and we put it in our confluence site, and we set up a nice searching function. So literally every research project that we’ve done that we could find is there. And so in those meetings, when we are having conversations about different customer problems that may not have been answered, we may be able to answer some of them because we now can point to this library and we say, “Hey, wait a second. A couple of years ago, we explored this problem at this angle. So the question that you’re asking, we’ve already covered that piece of it, but maybe we can explore it in this way, in a new way.” And so I don’t know if I’m countering what your assertion was earlier, but I’m basically saying yes, we are able to bring some answers to that meeting for the purpose of helping triage the priority or prioritize the list of questions that come out in the brainstorming of the I Wish I Knew brainstorm session.

Steve: So you are countering it, which was the outcome of me doing, like, “Here’s what I think I heard” question for you. And then I guess the thing that I took away from you sort of describing that presentation where a researcher is part of a work stream and says, “We learned these things. It affects these projects.” I think you said, like, in the first couple of slides, we talk about things like oh, speaking the language of business. Sometimes people talk about you got to know, like, the business model of the company. But I think you’re giving a very actionable definition of what it means to speak the language of business, which is to know what initiatives or what projects or what OKRs to be able to frame the information you’re bringing in terms of what things it ties to that are already what people are concerned about. So doing that translation, so, like, why this is important.

Reggie: Absolutely, because it’s going to ladder up and inform this OKR. Now what I want to mention is this is how we set up our structure to help the researchers align directly with valuable work and strategic work that they can do. But in addition to that, we are also pushing ourselves to even look beyond that specific work and look horizontally. And that has really galvanized our team.

For example, I mean, adoption. If you’re at a B2B company, adoption is a huge priority. So not only are we doing these discrete projects that are laddering up to the OKRs, we’re also stepping back and looking across the organization and saying, okay, well, out of all the projects that we’re working on, what are we learning about customer adoption in all of them? And how can we now inform how the company is thinking about how we help our customers adopt our product and extend it and upgrade, add more? Like what do they need? And so I’m so excited by this work that we’ve recently done around adoption. And we’re seeing a lot of our cross-functional partners leaning in on it and asking us for it. And we’re trying to do a little bit more. But the reason why I’m mentioning this, I just want to sort of set the structure of how we’re maturing as a team is that, yes, we’re aligned to these OKRs.

And it’s actually freeing us up to do the strategic work, but also to pause, step back a little bit and even look higher than that. And I just can’t wait to see what we do for the rest of this year and next.

Steve: What’s the mechanism for different researchers working on different projects? They have different whatever, it’s documents, it’s in their heads. How do you collectively find those overarching issues and kind of pull them together?

Reggie: You know, with the adoption work, it kind of organically happened. We do have check-ins every quarter where our research team, we get together and we do what we call research readouts for that quarter. And this is where a researcher will share all the projects that they’re working on, what they intend to learn. Maybe some of the projects are already in flight, so they’ll give a status update. And in that moment, the purpose of those meetings is to find opportunities where we can collaborate and work together on different projects that are similar. And organically, that’s when these opportunities surface. And I think that’s what happened with the adoption work. There were several projects around adoption. And oftentimes, even with a team as small as ours, that you may not know everything that your research colleague is working on. And that’s why we do these quarterly syncs where they read out what they’re working on. It’s super-duper helpful for us being able to do this.

And that’s what happened. It just sort of organically happened. But I think we’re trying to be more intentional on finding opportunities like that, where out of all the streams of research that we’re conducting, what are we learning that is sort of a global insight that is meaty enough that we can highlight it as a thing that the company needs to lean in on and care about? That to me is where value is from a UX research team, even as small, medium, and large, no matter what size you– if you’re doing that on a consistent basis, senior leadership, executive leadership, they pay attention. And I’ve seen it. And I think what insight has done to us, we’re starting to gain this traction and awareness that we’re doing this kind of work. And I love it. And I encourage anyone who’s listening to this podcast to start thinking in this way, because you can get so buried into the day-to-day research. And you really want to add value to that work stream. And you really want to do well and be successful. But you also have to look ahead. And you also have to look beyond it a little bit. And when you do so, you can find opportunities like that to tackle issues that you may not see in just one particular, one discrete project.

Steve: And that’s the way it is, at least for part one. Stay tuned to this donut channel for part two with Reggie coming soon. A reminder that you can always find Dollars to Donuts at all the podcasty places. Plus, you can visit portigal.com/podcast for all of the episodes with show notes and transcripts. Our theme music is by Bruce Todd.

  continue reading

60 jaksoa

Artwork

44. Reggie Murphy of Zendesk (part 1)

Dollars to Donuts

208 subscribers

published

iconJaa
 
Manage episode 418110351 series 62327
Sisällön tarjoaa Steve Portigal. Steve Portigal tai sen podcast-alustan kumppani lataa ja toimittaa kaiken podcast-sisällön, mukaan lukien jaksot, grafiikat ja podcast-kuvaukset. Jos uskot jonkun käyttävän tekijänoikeudella suojattua teostasi ilman lupaasi, voit seurata tässä https://fi.player.fm/legal kuvattua prosessia.

This episode of Dollars to Donuts features part 1 of my two-part conversation with Reggie Murphy of Zendesk. We talk about aligning the work of the research team with stakeholder OKRs and empowering non-researchers to do user research.

The researcher would go into these meetings and say we’re going to do a “I Wish I Knew” exercise, where we start thinking about what we’re building for our customers, what are the questions outstanding that we still don’t have an answer to. We’d go through that exercise, and then we’d prioritize that list. I can’t tell you how valuable those exercises were and how our stakeholders looked at us and said, “Wow, I did not know that research could add this kind of value to our conversation,” because it really helped them see. You know, that question that we’ve been battling around in these meetings isn’t really the one that’s most important. It’s this one. And to see it all together was a revelation for some of our stakeholders. I can’t tell you how important that was. – Reggie Murphy

Show Links

Help other people find Dollars to Donuts by leaving a review on Apple Podcasts.

Transcript

Steve Portigal: Welcome to Dollars to Donuts, the podcast where I talk with people who lead user research in their organization. I’m Steve Portigal.

In this episode, I’ve got part one of my two-part conversation with Reggie Murphy, who is the Senior Director of UX Research at Zendesk.

But before we get to that, when I went to revise Interviewing Users, I started with all that I had learned about teaching user research over the past 10 years. In fact, I’ve been leading user research training workshops for a lot longer, and that was a key source for the first edition way back in 2013. Across all this time, helping folks get better at user research has been a big part of my consulting practice. Sometimes I’ll teach a workshop as part of a conference, but mostly I run workshops for a company or a team within a company. Often the teams are interested in up-leveling skills, developing a shared language around user research, or just being more effective at customer interviews. Regardless of the experience and skill mix in your organization, I can lead a session that will help you learn and grow together. The people that get value from training with me include those with almost no experience with research, people who have responsibility but not a lot of grounding, and folks with a great deal of expertise. I typically find a mix in any group, and learning together in real time with me and with each other, whether it’s in person or remote, is an excellent way to improve your organization’s research practice. Please get in touch to find out more about how I can help you do research better and do better research.

I recently spoke about user research with Hannah Clark on the Product Manager podcast. I’ll link to the whole episode, but right now, here’s an excerpt.

Hannah Clark: Something I really wanted to make sure that we end off on before I let you go is bias. Because we always hear that interviewers should be as unbiased as possible. So how should we think about bias in a user interview context?

Steve: Right, and that word in our society, outside the domain we’re talking about today, it’s a really bad word, right? It talks about discrimination, racism, sort of everything that is screwing up our society is bias. So the word is a bad word, but I think when we talk about bias in interviews, we’re talking about cognitive bias.

And one thing is I’d encourage people to be a little more forgiving of themselves. It’s how our brains work, right? There’s reasons why human beings have these biases. Confirmation bias is where you hear what you expected to hear. You already had an idea, and so when you hear somebody say that, you’re like, “Yeah, see, I was right.” So that’s not good, right? You want to sort of do better at that. And there are tactics for this, like, “Hey, before we start doing research, let’s all talk about what our assumptions and expectations are, not as hypotheses to test, but just to say them out loud or write them down so that they’re not kind of clenched within our chest, but they’re just things that we can look at and like, ‘Oh yeah, this is a thing that might happen.'” And that sets you up a little bit better to see those biases when they come up and to let go of them, to have things confirmed, but also see something different.

But there is a compassion for ourselves that’s necessary. I want to offer just like a short story about my own encounter with my own bias. It’s kind of a story about me doing something not great, or at least feeling something not great and kind of overcoming it.

And it’s a story about going to interview small businesses and going into this agency, and the agency had the name of the founder on the wall, and I could go in. It’s like this creative environment, lots of fairly young, hip people kind of riding around. And I’m there to meet the founder, which I don’t know how I would get access to this person. And they come out, and they’re older than I am, and I was younger than I am now. And we go into this conference room, and we start talking, and I’m asking about goals and expectations and planning, all the stuff we want to understand about small businesses. And I realize at some point, this guy’s talking about his short-term goals and his long-term goals. And I realize that I am surprised by his articulation of long-term goals. And I realize that the reason I am surprised by it is that I had judged him. I mean, it’s my own ageism. I had decided this guy is sort of this figurehead founder who doesn’t really, is not really involved, and is like there to interview, not do the work. And it’s a terrible story that I had come up with about him based on what I brought into the room, my own biases, my own ageism. And so when I realized that, this is all happening in my head. Like, I’m asking questions, he’s giving me information. And I realize, oh, my questions are sort of based on my mental model, which is completely wrong.

And it doesn’t always have to be a horrible ism like this. We all have our mental models about people. And if we can hear them being wrong, then we can redirect. We can like, oh, tell me more about your long-term goals. And so as a researcher, yes, I wish I was a person who didn’t participate in isms and was not ageist myself. But we all are some amount to some extent. And I think, you know, there’s a very unkind way to have that manifest itself. And there’s just a more normal human level of that. And I don’t know that I can be the arbiter of what those levels are. But I’m not saying this to be proud of my ageism, but just to be kind of full disclosure with everybody.

When I had that moment, it was like really, really awesome. Like it was just such a great feeling of like it was even joyful, like, oh, I was wrong. I understand this person in a like a much deeper way. And actually to get over myself to be able to do that. My goal is to understand this person in a rich way. That’s what’s really exciting about research and what makes me be able to go get this stuff and kind of bring it back. And so again, they’re not always this extreme, but you often have to get over yourself a certain amount.

So I mean, what was happening for me was I could hear myself. I could hear where I was almost clenching, like trying to steer his story a certain way. And he was steering in another way. And just to feel that tension between what he wanted to tell me and what I wanted to talk about, which happens in every interview to a certain extent. And some of that’s just topic based. But here it was sort of identity based.

And so there was insight for me in that about myself, about the topic, about this guy. And I say that at the risk of being judged for my own bias, but with the hope that we can all get better at hearing our own bias and in the moment and kind of grappling with it and being intentional about the choices we make in the interview to get to what we’re trying to get to.

That’s from the Product Manager podcast. I hope you’ll check out the updated edition of Interviewing Users if you haven’t already. So let’s get to the first half of my conversation with Reggie Murphy, the Senior Director of UX Research at Zendesk.

Reggie, welcome to Dollars to Donuts. It’s so excellent to have you here.

Reggie Murphy: It’s great to be here, Steve. Thanks for the invite.

Steve: I like how we’re talking about here as if we’re in a physical place. We are just on a website talking to each other.

Reggie: We are in a virtual place.

Steve: Yes, I’m virtual here. First of all, what’s your title, your job role?

Reggie: I am Senior Director of UX Research at Zendesk. But for the past couple of months, I have been in an interim role as Head of Design for Zendesk also. So I basically have two jobs right now.

Steve: Just two.

Reggie: So pretty busy.

Steve: For people that don’t know, what is Zendesk?

Reggie: Zendesk is a complete customer service solution. So we help companies develop really positive and connected conversations with their customers through a system that allows them to get feedback from their customers to hear issues and concerns. And our software allows our customers to triage and prioritize in tickets. So our main software is a ticketing system. And so our customers and users can share what their concerns are, issues are, and the system will enable the agents, the customer service agents at our customers to resolve those issues and problems in an efficient way.

Steve: So the users, I guess we have users and we have customers.

Reggie: Right.

Steve: Just thinking from a user research perspective, who is it that you are learning about, what categories of individuals do you learn about in order to improve the product?

Reggie: So our primary customer, if we think about cohorts that we conduct research on, there are two. First, it’s the administrators. And that is the people who set up our platforms. And they could be so they’re like the director of CX, customer experience at an organization. They could be director of customer support. They could also be the chief information officer. So there is a variety of roles of folks who are responsible for purchasing our product and installing it within their organization. And then there is someone that we call admin or an administrator who sets it up so that their organization can use it. So that’s one cohort that we conduct research on to learn how that we best improve the product for them.

And then you have the customer service support agent. And that’s the title in some companies. And that is the person who is on the front lines talking with the end user. So their customer. So for example, Grubhub is a client is a customer. So their end user would be someone who say just ordered some food and maybe has an issue with their order and they go online and they use say the messaging system to alert Grubhub that, hey, I have an issue with my order. I didn’t get the right thing. So that’s the end user. And so the customer service agent at Grubhub is using our system to talk with that end user to help resolve that issue. And so we spend a lot of time researching the admins, how they are setting up the system and the agents and the interface that they’re using in order to talk with the customers that are using their product.

Steve: Do you ever deal with the end user of your clients’ customer support systems?

Reggie: Not a lot. It’s, you know, we have to make strategic decisions about the work that we do because of the size of our team.

We’d love to do more work with the end user, and we’ve done a few projects. But by and large, we spend the vast majority of our time speaking with admins and agents, because we feel like that’s where we get the most bang for our buck and that’s who we’re really building for. And I believe if we do the right thing and we build the thing right for the admins and the agents, then the end user will have a delightful experience also, hopefully. Yeah.

Steve: So you’re talking a little about, I guess, where you have to prioritize and what are the areas that you’re thinking about the most. Can you maybe say a little bit about that in terms of establishing research or building research? Where have you decided to look and where have you decided to focus?

Reggie: I talk about this in a podcast I recorded for Maze recently. We are a team of 11 researchers, plus two research operations folks and two research managers, so a team of 16, including myself. Relatively small, considering we partner with 200-plus scrum teams in the broader product organization, which is about 1,900 people.

So when I first joined Zendesk, we were spread out throughout the organization. We were getting work done, but the majority of the work was primarily tactical. And we were working towards the end of the product development life cycle, sort of in the evaluation phase: okay, we’ve already built the thing, now let’s go and talk to customers and see if it works. We needed to do that type of research, but I believe, just philosophically, that UX research teams, especially one as small as the one I had inherited when I joined, need to be working more towards the beginning of the product development life cycle. Because I think that’s where the most value can be extracted from a user research team.

Over the past couple of years, we’ve been shifting our work to achieve that goal. The way we’ve done it is by aligning the work that we do to the company’s OKRs. Each year, the company has about six or seven big OKRs, and those translate into other OKRs at the organization level. We align to those big OKRs. I’ve spent the last couple of years orienting our team around that structure, and it’s made us better and more efficient at conducting strategic research at the beginning of the product development life cycle, and we’re adding more value.

We still, of course, do tactical research. We still do usability testing and concept testing: should we go with concept A or B? We still do that. But we’ve seen better success now that we are oriented in this way. And I believe we’re going to continue this way. Hopefully our team gets bigger, but even if we stay the same size for the next year or two, I believe this helps us work on the most important work at the company. Now, when we do that, obviously there’s some research that still needs to be done that our team maybe can’t do themselves. And that’s what I talked about in the Maze podcast.

And so over the past couple of years, we’ve undergone a pretty extensive effort to democratize some of our research methods. For someone who’s not a professional researcher, there are certain types of research that we believe we can enable them to do, and then there are other types of research, the more extensive, deep research, where we say, hey, let us handle it. So we’ve developed a pretty extensive plan, which I talk about in that podcast, to help designers or product managers who we may not be able to partner with very closely on their particular area. We’ve enabled them to do very basic user testing using the usertesting.com platform, or just basic interviews, to get some feedback or insight on a particular customer problem that they’re trying to understand.

So I think we’ve been pretty successful developing that system over the past couple of years so that we have some balance now. Our researchers are doing really valuable strategic work, foundational research, some tactical research, and we’ve enabled other functions, not researchers, to do research that we aren’t able to help them with on a day-to-day basis.

Steve: I will put a link to that interview in the show notes. This idea of giving other folks tools to do certain kinds of research, democratization, I’ve heard people refer to it as the D-word. It feels like a controversial area. Having worked on this for a period of time and had success, do you have a hot take on the hot take, I guess, about democratization?

Reggie: I’ve read a lot of hot takes on this lately. I do have one. You and I have been in this business for quite some time, with many years of working in different companies and different settings. I’m a little old school, so at one point in my career, I believed that only researchers should do research. I took a hard line on that.

But when you work with other functions in the way that I’ve worked across my time at Facebook, well, now Meta, and X, formerly Twitter, and some other companies, you learn that there are people in other functions, like some designers and PMs, who also have research experience, and if you enable them a little bit, provide some guidance and the right tools, and set the right boundaries and parameters for how they structure a research program, they can do valid research that they can be confident in.

At one point, yeah, I admit, Steve, it made me cringe that I would allow someone to come in and do a usability test without me or without the team. But I’ve backed away from that because we’re now in a world where we have limited resources and headcount, we have constraints on time, and we need to build and ship. So we have to be flexible and adapt. And I think over the years, I have begun to really understand what that means in terms of delegating research. Now I don’t cringe as much, because I am working with very capable, very intelligent people who are open and understanding: “Hey, look, I’m not a researcher, so please help us. We really need assistance here.” And so at Zendesk, we’ve set up a really solid model where we can be confident in the research work that other people are doing.

So I don’t know if that’s a hot take or a meandering answer, but I no longer have that firm hardline stance that I did back in the day, because now I have worked long enough with cross-functional stakeholders who just want to do the right thing. They’re not going rogue. Okay, maybe there are some rogues here and there, but I’m working with people who really want to learn and understand customers, and they’re willing to listen, learn, and be educated by a research team in order to do it and do it well, so that they can be confident in the outcome.

Steve: And just to affirm the deep roots of that cringe, the fear of what other people will do: way, way back, I remember working at an agency doing in-context research. Our clients would occasionally ask to show up and sit in. So we’re not even talking about them leading, or having any ownership of or responsibility for the research, but just being present during it. And that was a horrifying concept, that the mere presence of somebody would disrupt this elegant, finely nuanced dynamic. I think we kind of compromised by staging it. We talked about re-interviewing somebody, or finding a friend of a friend, recruiting someone so that we knew it wouldn’t screw things up. This was in a consulting situation, again, in the early days of the work, so it maybe wasn’t well understood.

Reggie: Right, exactly.

Steve: Just the sense that this was fragile and could be ruined, and that would hurt the work and hurt us and hurt the relationship. I think the roots of what you’re talking about go back and back and back. And when we say these things now, they sound ridiculous. What I’m saying sounds ridiculous. How could you ever prevent people from coming, or not want them to come? Now we want the opposite, right?

Reggie: I remember some of those days. But at the second company I worked for, Gannett, the media company that owned USA Today, maybe eight or nine years into my career there, we began to do more ethnographic research. This is when I was doing a lot of training with IDEO, and design thinking was the thing at that moment.

And this is why my stance on this softened: we spent a lot of time in the field with non-researchers. We would intentionally invite them so that they could see for themselves what customers were saying. We were tired of going out and conducting a piece of research, bringing it back, and people not believing us. You know, those are the questions that come up: “Well, what did they say?” And there was skepticism.

And as much as we tried to convey the value of the work and do all the things to make people feel confident, there was always that bit of skepticism. So we began to empower and engage our stakeholders and invite them to in-field assignments. And that’s when I saw it from a different perspective. I saw some genuine, sincere desire to sit and listen. We had a sheet for them, and I still have some of these protocol sheets, that was basically How to Be a Good Stakeholder 101: here’s what you do, here’s what you don’t do, just sit and listen, don’t say anything. After a while, though, we would allow them to probe and ask questions, because we would educate them on how to do it.

Now, for those who are still on that hard line today, who would cringe at me even suggesting that we do that, I would say, yeah, there are probably some cross-functional partners who may not do very well with this responsibility if you give it to them to do solo. So then what do you do? Well, at Zendesk, we have sort of a consultancy model. Say you’re a PM, a product manager, and you have a question but you don’t have a research partner. We have a research request process through which we triage these requests. Then we set up time, office hours with a researcher, and we’ll consult with you to understand what problem you’re trying to explore and understand. And given the level of urgency, and what we believe to be the level of understanding and precision that you need to have a really successful outcome, we’ll decide, okay, I think you can do this on your own. Here are some tools. You can go into our learning management platform, do our certification course, and here are the tools, go for it. Let us know if you have any problems.

That’s one route. The other route is, after that consultation, maybe we don’t feel confident, and maybe the person who’s asking for help doesn’t feel as confident either. Then we can make an informed decision to go in the direction of, “Hey, we’ll consult with you along the way. It’s not like you’re flying blind.” That gives us, the research team, confidence and comfort that we’re doing the right thing, that we’re not sacrificing the integrity or quality of the work, and that the outcomes are what we want and expect them to be from the research.

Steve: I’m having a small, if not big, aha from that. Some of the discussion around democratization is, yeah, look at the nature of the work, the urgency, the complexity. Sometimes it’s around the method. But I think, if I’m hearing properly, you’re also saying, look at the team, the requester, and their capability. Educatability is maybe not the right word, but you want the work to be successful, and who’s going to execute it is a factor there. So you choose how to respond. That’s something you factor in when choosing what path to recommend.

Reggie: Absolutely. I have another example for you. Recently our company had its biggest customer event, called Relate; it was in Las Vegas this year. The product team took a lot of people to the event, which hadn’t really been done previously at an event like this, where you have 1,500 customers. The research team and the design team were all there, and we set up an area where customers could come and talk with us about various product ideas that we were launching at the event, but also to help us look into the future. We called it co-designing the future of Zendesk. There were just three researchers, well, four including myself, at the event. We had designers, and the remainder were product managers.

So in order for us to be successful at these customer conversations, we developed some tools and guidance that we shared with the organization: “Hey, if you are speaking with a customer, here are some questions you can ask them and here’s how you can ask them.” And we enabled them, I believe, to have some really fruitful, enriching conversations. I felt good about that. I felt good about what we learned. And I think as you grow and mature in your career, as a people leader, which I am, but even as a UX researcher, you begin to develop this sense of, okay, it can’t be all bad. There are ways that you can adapt and get the type of outcome that you expect. Will every piece of research that a non-researcher does be perfect? No. Will they make mistakes? Yes. But on balance, I believe that if you set up the right parameters and provide the right guidance and consultation, then you can really get the outcome that you want. I think we did that at this conference this year.

Steve: Did you see some of your non-researcher colleagues being successful in their interactions with customers?

Reggie: It was amazing to see. They were engaging, they were listening, probing in different places appropriately, because we were sharing some concepts on a big corkboard wall. I think I made the comment in the Maze podcast that everybody can be a researcher, and I’ve read lots of articles where I’ve said, “Everybody can be a researcher.” I truly believe that any time you are in front of a customer, no matter who you are at the company, at Zendesk, let’s just say Zendesk, I’m just talking about Zendesk right now. Any employee, no matter your title, any time you are in front of a customer, you can be a researcher. You have to be a researcher, really, because that’s how you learn what that customer is trying to convey to you. And I was so proud of us and the guidance that we gave them. Watching how they interacted with customers, you would have thought it was a team full of researchers. I’m not going to take all the credit for it, because I have an incredible team, incredible teammates, designers, and product managers at the company. In order to have a really solid conversation with a user or a customer, in our case, by and large, all you have to do is really listen and ask why and how. And I think you know what I’m talking about, Steve. You ask those questions that can really help move the conversation along so that you can get to an understanding of what that customer’s need may be. Right.

Steve: It’s simple and easy and really hard all at the same time. Maybe that’s why, as a field, we have kind of fraught conversations about this. “Everyone is a researcher” is, I mean, a good hot take because I think it’s a distressing comment: what do we mean by research? What do we mean by researcher? I think you’ve provided the context, but that little pull quote, and I promise not to just pull that out of context on you, but that whole quote, it distresses people.

Reggie: Yeah.

Steve: But I think you’re framing it in a really positive way. And you’re right, it makes me wonder about, and I am certainly guilty of this, collecting stories of when we see things go wrong. I mean, I like those because they’re good learning moments, but

Reggie: Absolutely.

Steve: I think there is a tendency, or a risk, of focusing on this narrative that this is hard and people can’t do it. But when you look at the other side of the mirror, like I think you’re doing, and say, this is hard and people can do it, it unlocks a lot more potential for the outcomes that we all want, the kind of information that we’re looking for.

Reggie: If I can say one more thing about this: what I’ve enjoyed about this process of enabling our cross-functional partners to do research, empowering them to do it, is that it makes them better product managers. It makes them better designers, because now they have the customer in mind. I can’t tell you how many organizations or teams I’ve been on where that wasn’t the case, and I’ve had to jump into the middle of a conversation and say, “Hey, wait, has anybody asked the customer what they think about what we’re doing here?” So I love that the people in the organization I’m in are thirsty for these opportunities to, number one, talk to customers, but number two, learn from a team of expert researchers how to do it. Because in the end, it’s going to make them better.

And I feel very strongly about that. Hey, we’re not where we want to be, but I think our entire organization is on a really nice journey up the maturity curve toward becoming more customer-first and understanding the customer better. And opportunities like the Relate event, and other projects that we’re conducting internally, are really good ways for our teammates across functions to learn, grow in their own careers, and get better at understanding how to gather information to make more informed decisions about what we are creating, building, and shipping for our customers.

Steve: Maybe I can use that as a segue to go back to something you said quite early on in our conversation, about one of the areas where you’ve been pushing to build that maturity. You talked about shifting where research is focused, from evaluative work late in the product cycle to more strategic things that take place earlier in the cycle. You talked about realigning, and you said a little bit at a high level about how OKRs were kind of the unlock, I guess, to get in there. I think you’re describing a very common challenge that maybe less mature organizations face, or research leaders face, or research teams of one.

“I’m only allowed to do this work. No one wants this other work from me. I keep recommending it.” I think some people are kind of stuck in these little traps where they can’t change the perception of what value they can bring, and therefore they can’t get the opportunities to bring that value. So could you unpack a little bit more how you changed what the opportunity was for research at Zendesk?

Reggie: There were a couple of things that I did. First of all, for the first six months, I was just assessing what our relationships were like with our key stakeholders. And I came to find out that, you know, while we had some syncs with them, they weren’t necessarily on a cadence. When I say syncs, I mean one-on-one meetings with our design leaders, product leaders, and engineering leaders. So, number one, we needed to set up a consistent cadence for connecting with the leadership of these particular work streams.

Then we would make sure that we were positioned appropriately in the important conversations that happen earlier in the product development lifecycle. When the customer problem, or what we think the customer problems are, is being talked about, researchers who are in that moment, in those meetings early on, are able to help the conversation by asking, well, how do we know that this is a problem in the first place? And it was very important at the very beginning to establish a clear and consistent cadence of syncing with other cross-functional stakeholders. So that was the very first thing: developing that relationship. The relationship that we had before, and I might be exaggerating, was: we have an idea, it’s pretty baked, research, come on in, let us tell you about this idea, now let’s go set up a research program to test it. I’m not saying all the work was like that, but that’s what I observed. I said, we’ve got to fix that. So the first thing I really tried to do was develop a closer relationship.

Steve: So, a very tactical thing: you’re setting up a cadence of one-on-one relationships with the people in these other parts of the organization. And one of the outcomes there is to get into those meetings where a researcher can identify assumptions and challenges and so on. Are the meetings where those conversations are happening literally the one-on-ones you’re setting up?

Reggie: Well, they were the workstream leads meetings. There are scrum teams all over, in small, medium, and large sizes, but there were meetings with the design leadership, the product manager, the product marketing manager, and maybe a data science leader: the core team that’s working on that thing. I wanted to make sure that not only were we in those meetings at a consistent cadence, but that we were developing one-on-one relationships with everybody in that group. So the relationship is one thing, where you’re doing a one-on-one and you’re just learning about that person and really understanding what that person needs and wants.

But at the team level, now you are in the thick of it, and you are in those conversations, like you said, where the customer problem is talked about from the very start. In addition to that, we installed some protocols, exercises, and activities where the researcher facilitates a conversation around what the potential customer problems are. We call it I Wish I Knew. This was something I learned about when I was working at the company formerly known as Twitter, and it’s essentially a brainstorming exercise.

When I first joined Zendesk, it was right around May or June, and then right around September or October we started annual planning for the next year. That was a perfect time to install this activity, so we piloted it in a couple of work streams. This was a moment where the researcher could go into these meetings and say, “This meeting, we’re going to do an IWIK exercise, an I Wish I Knew exercise: as we start thinking about annual planning and the products we’re thinking about building for our customers, what are the outstanding questions that we still don’t have an answer to, or areas that we still need to explore?” And so we’d go through that exercise, and then we’d prioritize the list that they came up with.

I can’t tell you how valuable those exercises were, and how our stakeholders looked at us and said, “Wow, I did not know that research could add this kind of value to our conversation,” because it really helped them see. Especially once you get into the prioritization work, you look and you go, “That question that we’ve been battling around in these meetings isn’t really the one that’s most important. It’s this one.” And to see it all together was sort of a revelation for some of our stakeholders. So when we installed that exercise, it changed the game a bit in terms of our relationship and our position, or brand, inside the minds of our cross-functional partners, because now we’re providing value at the very beginning of the product development lifecycle that helps establish, “Okay, these are the problems that we need to be worrying about as we think about planning for the next year.” I can’t tell you how important that was.

The next part of this came as the company started looking at its annual plan and defining the big buckets of areas that we wanted to work in. For example, intelligent conversations is sort of one of them, just throwing that out there; this is all of our artificial intelligence work. Within that big bucket, there are, I don’t know, I’m just making this number up, maybe 15 projects or 15 streams of work that need to happen. And we strategically aligned a researcher to that category of work, to work with those teams to figure out, well, where can research lean in the most to have the most impact in the work stream that ladders up to that OKR. When a researcher is in that position and they’re reporting out research, and the very first couple of slides say, this is the research project that is designed to learn 1, 2, 3, A, B, C, which will help inform OKR XYZ, that is how you operate. That is how you become a valuable research team. It’s not perfect yet. I think we still have room to grow and get better at it. But over the first couple of years, I think we’ve done a really good job at setting up a structure where we are able to add that kind of value to our product development organization.

Steve: Can I play back some of what I heard?

Reggie: Yes.

Steve: Because I think this is a universal complaint and concern, right? Not getting access, not getting permission. You talked about the individual relationships with people and, as part of that, being in those team meetings. I think it’s on a case-by-case basis, right? How do you get invited to that? There’s a little bit of audacity in showing up and saying, “Hey, we’re here to be part of this meeting,” and then making a contribution. But I think maybe also biding your time a little bit. I’m now putting words in your mouth. But building the relationships, being in the meetings, hearing what’s happening, and then choosing a point at which to bring value. And the value that you’re bringing there is facilitative of others. It’s not, “Here’s what we know. Oh, we have the answers.” It’s just, “Let’s help you talk about what you have to decide and learn.”

Reggie: You’re onto something. So yes, we’re doing that. But in addition, we are bringing some answers to that meeting, because one of the other things I did in the first year was establish a research and insights library, a research repository. We aggregated all the UX research that had been done to date at the company, put it in our Confluence site, and set up a nice search function. Literally every research project that we’ve done that we could find is there. So in those meetings, when we are having conversations about different customer problems that may not have been answered, we may be able to answer some of them, because we can now point to this library and say, “Hey, wait a second. A couple of years ago, we explored this problem from this angle. So the question that you’re asking, we’ve already covered that piece of it, but maybe we can explore it in a new way.” And so I don’t know if I’m countering your earlier assertion, but I’m basically saying yes, we are able to bring some answers to that meeting for the purpose of helping prioritize the list of questions that comes out of the I Wish I Knew brainstorm session.

Steve: So you are countering it, which was the point of me asking a “here’s what I think I heard” question. And then there’s the thing I took away from you describing that presentation where a researcher is part of a work stream and says, in the first couple of slides, “We learned these things, and it affects these projects.” We talk about things like speaking the language of business. Sometimes people say you’ve got to know the business model of the company. But I think you’re giving a very actionable definition of what it means to speak the language of business, which is to know what initiatives, projects, or OKRs to tie to, so you can frame the information you’re bringing in terms of the things people are already concerned about. So doing that translation: why this is important.

Reggie: Absolutely, because it’s going to ladder up and inform this OKR. Now, what I want to mention is that this is how we set up our structure to help the researchers align directly with the valuable, strategic work that they can do. But in addition to that, we are also pushing ourselves to look beyond that specific work and look horizontally. And that has really galvanized our team.

For example, adoption. If you’re at a B2B company, adoption is a huge priority. So not only are we doing these discrete projects that ladder up to the OKRs, we’re also stepping back and looking across the organization and saying, okay, out of all the projects that we’re working on, what are we learning about customer adoption in all of them? And how can we now inform how the company is thinking about helping our customers adopt our product, extend it, upgrade, add more? What do they need? I’m so excited by this work we’ve recently done around adoption, and we’re seeing a lot of our cross-functional partners leaning in on it and asking us for it, and we’re trying to do a little bit more. The reason I’m mentioning this, and I just want to lay out the structure of how we’re maturing as a team, is that, yes, we’re aligned to these OKRs.

And it’s actually freeing us up to do the strategic work, but also to pause, step back a little bit and even look higher than that. And I just can’t wait to see what we do for the rest of this year and next.

Steve: What’s the mechanism for different researchers working on different projects? They have different whatever, it’s documents, it’s in their heads. How do you collectively find those overarching issues and kind of pull them together?

Reggie: You know, with the adoption work, it happened kind of organically. We do have check-ins every quarter where our research team gets together and we do what we call research readouts for that quarter. This is where a researcher will share all the projects they’re working on and what they intend to learn. Maybe some of the projects are already in flight, so they’ll give a status update. The purpose of those meetings is to find opportunities where we can collaborate and work together on projects that are similar, and organically, that’s when these opportunities surface. I think that’s what happened with the adoption work; there were several projects around adoption. Oftentimes, even with a team as small as ours, you may not know everything your research colleagues are working on. That’s why we do these quarterly syncs where they read out what they’re working on. It’s super-duper helpful for us to be able to do this.

And that’s what happened; it just sort of organically happened. But we’re trying to be more intentional about finding opportunities like that, where out of all the streams of research that we’re conducting, we ask: what are we learning that is a global insight, meaty enough that we can highlight it as a thing the company needs to lean in on and care about? That, to me, is where the value of a UX research team is, whether it’s small, medium, or large, no matter what size. If you’re doing that on a consistent basis, senior leadership, executive leadership, they pay attention. And I’ve seen it. I think what this insight work has done for us is that we’re starting to gain traction and awareness that we’re doing this kind of work. And I love it. I encourage anyone who’s listening to this podcast to start thinking in this way, because you can get so buried in the day-to-day research. You really want to add value to that work stream, and you really want to do well and be successful. But you also have to look ahead, and you have to look beyond it a little bit. When you do so, you can find opportunities like that to tackle issues that you may not see in just one particular, discrete project.

Steve: And that’s the way it is, at least for part one. Stay tuned to this donut channel for part two with Reggie coming soon. A reminder that you can always find Dollars to Donuts at all the podcasty places. Plus, you can visit portigal.com/podcast for all of the episodes with show notes and transcripts. Our theme music is by Bruce Todd.
