CRO and A/B Testing with Nick Disabato
Shownotes:
Tell Us About Your Ecommerce Pains
Email Nick: office@draft.nu
Email Us: podcast@sellry.com
Transcripts:
Michael:
Hey everybody. It's Michael, one of your hosts and welcome to Ecommerce QA. This is the show where store owners, directors of Ecommerce, and Ecommerce managers can stay up to date on the latest and greatest in Ecommerce. Today we are joined, very happily, by Nick Disabato, founder of Draft. Nick, welcome.
Nick:
Happy to be here. Thank you so much for having me.
Michael:
Absolutely. Well, this is the second time, a little bit of déjà vu going on, because was it last week? We started ... We actually were recording this episode, and the audio went all crazy, so now we have to outdo ourselves.
Nick:
Software is terrible.
Michael:
Yeah, yeah. Who writes the software anyways? So Nick, just in case nobody's heard of you ... which, for everyone listening to the show, if you haven't heard of Nick, you'll be very happy to hear what he has to say. I consider him sort of like the modern godfather of CRO, in terms of strategy and just the thinking behind CRO.
What is CRO? CRO is conversion rate optimization, and Nick just has a wealth of wisdom to share with us today. Thank you, Nick, again, for joining us. Where should we start? Maybe we could start at the beginning. Why did you get into conversion rate optimization?
Nick:
I have a design background. I mostly do UI/UX design, interaction design. That sort of stuff. I thought about what is the thing that I could run in my business, that is kind of in the Venn diagram overlap of stuff I can do on a monthly retainer, and stuff that is still kind of UX-y in nature. Still trying to improve software to make it easier to use. That sort of stuff.
I settled on A/B testing, because it's something that is kind of an ongoing process, and it's not something that has to be kind of a self-contained deliverable, right? You're not building a wire frame a month, right? That doesn't really work for people. You're not doing IA research every month, or at least nobody would pay for it.
So that, I launched about four years ago now, and it's kind of evolved into more of an end-to-end, research-driven CRO engagement, where I am talking to your customers, and taking a look at your analytics, and heat maps, and surveying people, and doing everything that I can to understand what their motivations are, and what their actual behaviors and practices are. I use that to drive new A/B testing insights, so there are very few, if any, call-to-action button color tests, or other types of things that you see in Get Rich Quick case study type scenarios.
Michael:
Can you give me a couple of examples of some really cool A/B tests that you've done?
Nick:
One of my favorites is for a ... They're an everyday carry company called KeySmart, and they used to have like five different models of KeySmarts. Basically like a Swiss Army knife for your key chain, and I ran a test that pared back their entire site to one model and ... It turned out inconclusive. It didn't move the needle in either direction. We didn't increase their conversion rate. We didn't decrease their conversion rate. Basically everything stayed the same, but people were buying one model of KeySmart instead of varying shares of five different KeySmarts. The consequence of that was that they were able to remove those products from their offerings, from their product line, and they reduced manufacturing and shipping expenses by something like a third. It was some totally preposterous amount. So the lesson of that is, with CRO and A/B testing, you're measuring the economic impact of a design decision. That can often result in an increase in conversion rate, but if you're reducing expenses, you're still getting a win for the business. The goal is profit, right? And you can do that by increasing revenue, the most common part of CRO, or also by decreasing expenses. So that's one kind of surprising thing I've done recently.
Michael:
I love this so much because it's really easy for companies to think of digital, and Ecommerce specifically, in a sandbox way. It's like, oh, that's online. But no. In this case you were looking at something that was really a very valuable insight about the customer, essentially, which is that people just wanted the product; they didn't actually care about the color of the product. Driving down the operational and manufacturing costs around that. That's fantastic.
Have you had any other kind of outside the box cool case studies or experiences like that where you're expecting to maybe drive something that was more just digital but then you found this deeper insight about the company?
Nick:
There was one for the Wirecutter where ... I can't believe I'm even citing this test, but it was a call-to-action test of all things. It's like one of the only ones I've run that I've seen work. But it was what you're talking about, kind of that deeper insight. If you don't know the Wirecutter, they're basically a tech blog currently owned by the New York Times. I was working with them when they were an independent company and, if you've never seen them, they're basically Consumer Reports for millennials, because Consumer Reports incorrectly firewalled all their stuff.
The Wirecutter's business model's oriented around ... We do hundreds of hours of research to find the best thing, much like Consumer Reports does, but then there's the link out to Amazon and you buy it and they get affiliate kickbacks, and so I think something like 80% of the revenue at one point was affiliate kickbacks, at least that I know of. A high share. So, we changed call-to-action button colors so instead of them all being one color, they were like Amazon orange, Walmart blue, Apple Store warm gray. You know, that sort of stuff.
The deeper insight is that people look at that and believe it's more trustworthy because it looks like it's lightly branded with the store's branding. But it still had the Wirecutter's Futura in it. It was the seam between the Wirecutter and the third party vendor. That ended up faring extremely well and encouraged ... The biggest metric that we have was clicks out in that situation, because we don't know whether somebody's bought something on Amazon, but we know that it roughly correlates, right? We know that people are able to beeline to that, and they might wishlist it and you get the affiliate kickback later, that sort of thing. And it'll bear out in the final numbers.
Michael:
That's great. Good. I wish we could ask you what the final numbers were but I won't ask that.
Nick:
No.
Michael:
So you said something interesting, which was ... You almost didn't want to mention it because it was a button color test. I know, we've talked about button color tests because that's like the prototypical, oh we changed our button color and our revenue doubled, right?
Nick:
Yeah. That's what I was alluding to. I was kind of subtweeting these like ... People get A/B testing ideas from other case studies because they have a sense that it's what works, but that's not how I go about finding revenue-generating design decisions. I've found that really the only way to do it is by actually researching, and people get very bored or allergic to the idea of research because you associate it with being at a library, looking at an encyclopedia that exists for some reason, and writing a five-paragraph essay. But for me, research is just looking at where there are revenue leaks and listening to customers about their motivations. These are practices that you should be doing in your business no matter what, right? So I often come in and I say, hey, A/B testing, CRO, here's a bunch of sexy stuff, and then I make you eat your vegetables.
Michael:
So talk to us about that because you've ... As you mentioned before, there's two types of research that this involves: qualitative and quantitative. I think most people on the call know what that means. Qualitative has to do with the quality of the difference, or how something feels or looks, that sort of thing. Maybe it's something simple to measure, or maybe not. And then quantitative would be like the classic number of clicks, number of this, number of that, revenue, and so on. How does that work on the research side?
Nick:
So for me, quantitative is more like what you're typically doing with Google Analytics or with Heat Maps or something like that so you have a certain number of people that are clicking on a thing or going in a certain path and that sort of thing. So, there are numbers you get out of it, right?
Your conversion rate is a quantitative insight. The share of people that are going from your home to your pricing to your product to your cart to whatever. That funnel is a quantitative insight. Now the share of traffic that you're getting from your Facebook ads is a quantitative insight. I'm taking all of those things and I'm thinking about what it is people are actually doing, right? It's the what and the how.
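To make that concrete, here's a minimal Python sketch of turning funnel step counts into the kind of quantitative insight Nick is describing; the step names and counts are hypothetical, not figures from the episode.

```python
# Hypothetical funnel step counts -- not numbers from the episode.
funnel = [("home", 10_000), ("product", 4_200), ("cart", 1_300), ("purchase", 390)]

# Share of visitors who continue from each step to the next, and who drop off.
for (step, count), (next_step, next_count) in zip(funnel, funnel[1:]):
    share = next_count / count
    print(f"{step} -> {next_step}: {share:.1%} continue, {1 - share:.1%} drop off")

# Overall conversion rate: purchasers out of everyone who landed on the first step.
print(f"overall conversion rate: {funnel[-1][1] / funnel[0][1]:.2%}")  # 3.90%
```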
The qualitative insights are the why, right? It's the more squishy things that are what drives a person. What is the value proposition? What are we saying to them, right?
You need roughly equivalent shares of both of those things. It's very easy ... Like, I'm a nerd and I was a math major. It's very easy to just retreat to the numbers for a lot of businesses that I work with. It's the same aversion to getting on the phone and actually having a conversation with somebody, especially a stranger you've never met before. Qualitative research can include, yes, I'm actually literally getting on a Skype call or FaceTime call with somebody, or I'm asking them a bunch of questions, but it can also mean doing post-purchase surveys or lifecycle emails and mining responses for what's going on. It can mean talking to your support team and understanding where the pain points are with certain things.
In KeySmart's case we got a lot of insight out of the support team dealing with assembly. We didn't have an assembly guide. So we put that pretty front and center on the product page and it sold more, because people felt comfortable being able to actually assemble the dang thing. Fewer people were returning them later saying, I don't know how to work with this thing. We inserted assembly guides in the actual physical product, so when you get it in the mail you get a little business card that shows you how to do it.
Those are the practical considerations, and then there are kind of broader, higher-level things like: how are we communicating as a brand? What is our voice and tone? What problem are we specifically stating that we solve? A lot of people come in the door thinking that they know the answer to that, and we never end up in the same place after qualitative research. We always end up a little bit further afield, or maybe with a refinement of those insights, and we end up coming up with something that works better not only for the business but also for the customer, because then they feel more comfortable buying the thing. It's not a matter of manipulating the person. It's a matter of making them feel more empowered by the product that you have to offer.
Michael:
I think it's interesting that in Ecommerce, a lot of the decisions are made by very scrappy people. I love scrappy people. I would most of the time consider myself a scrappy person. What I mean by that is just getting in there and trying things. Throwing spaghetti at the wall and seeing what sticks and saying, oh, that didn't work, I'm gonna start throwing meatballs at the wall. Oh, those stick better. Great. But I think what we're running into with Ecommerce is that it's becoming the men versus the boys, and to really compete you can't just guess anymore. You actually have to have data, like you're saying. Let's say that somebody wants to get started and they ... I mean, they're probably gonna want to just hire you for your excellent services, but lay it out for us. What are the three steps, let's say, to get started in conversion rate optimization?
Nick:
So the first thing is get everything configured and fix all the bugs. That's the first step. Usually your analytics have been gathering dust and cobwebs or ... That's in the worst case. The next worst case is you have analytics but they're not configured correctly. If you go into your goals, your funnels or something like that, they're busted. After that you take a look at your analytics and you realize that your conversion rate on mobile is one third of what it should be and even that is one third of what it should be compared to desktop. So you have a lot of bugs to fix, right?
There was one client I worked with where ... I didn't even run an A/B test for this. I reduced their page weight by something like a third, and their page load time was initially like 16 seconds. Some preposterously high number, and we got it down to like nine, which is still bad. Like, that's not okay, but it's not staging-an-intervention level, it's just, this is bad. Their conversion rate went up by 9.12%, right? I hadn't run a single A/B test yet.
You've got to prepare the site and do the one-off optimization stuff that is just unsexy, dumb, scrappy grunt work that you know you should be doing and you're not doing. If it requires you hiring a really fancy consultant in order to do that, fine. That's how I have a job in part, but also I wish people would just do it. You know? I have a million how-to guides out there that I've written, and I'm cited on other sites like ConversionXL and Copy Hackers that do this stuff.
The first thing is prepare the site and get it working correctly and the best thing about that, not only are you making ... more money, but hopefully your conversion rate goes up which means you'll be able to get to statistically significant A/B tests a little bit better.
Michael:
Getting to a good baseline ... just a reasonable point to start with.
Nick:
Yeah. You think you can skip this step. A/B testing is how the ... It's not how the bad get good, it's how the good get better, right?
Michael:
I love that.
Nick:
So you have to get to a point where, at the baseline, you're following best practices in order to be doing this. The second thing is to start researching. You have to do kind of a blend of quantitative and qualitative research, as I mentioned. The easiest and dumbest ways to do this are both by looking at your analytics, taking a look at people's funnels, that sort of thing, but also running heat maps. Kicking off a heat map involves signing up for an account with, let's say, hotjar.com and entering a JavaScript tracking snippet. Then five minutes of typing in the right URL to be like, do heat maps here. Then you have heat maps, congratulations. The other thing that I like doing is adding in a post-purchase survey. So if you're on Shopify, there are a ton of plugins that allow you to do this, where on the confirmation page they say, okay, great. What was the last thing that held you back from making this purchase, or how do you feel about this transaction? Something like that. Something open ended.
Michael:
I love that.
Nick:
Then you just sit there and embed a Google form. Something as stupid as that.
Michael:
You can do Hotjar too for that.
Nick:
You can do Hotjar for that as well, yeah. It's a little more sophisticated to be doing that. I do the Hello World thing. I embed a Google form or a Wufoo form because it takes me 10 minutes. If you're that lazy or busy or whatever, do that and just get the outcome there. Now, congratulations, you have a blend of quantitative ... heat map ... and qualitative ... I'm getting actual free-text responses back from my customers ... insights. I wouldn't stop there. That's the baseline there, right? You start the researching.
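As a rough illustration of mining those free-text responses, here's a minimal Python sketch that tallies recurring words from an exported survey CSV. The filename and column name are assumptions for the example, not the actual tooling Nick uses.

```python
# Minimal sketch: surface recurring themes in post-purchase survey responses.
# "post_purchase_survey.csv" and the "response" column are assumed names.
import csv
from collections import Counter

STOPWORDS = {"the", "a", "i", "it", "to", "and", "was", "of", "my", "is", "that", "this"}

def top_themes(path: str, column: str = "response", n: int = 15):
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            for word in row[column].lower().split():
                word = word.strip(".,!?\"'")
                if word and word not in STOPWORDS:
                    counts[word] += 1
    return counts.most_common(n)

# e.g. top_themes("post_purchase_survey.csv") might surface "shipping", "assembly", "sizing".
```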
Then the third thing you're doing is, once you get the research back there's a process called synthesis where you're taking the research ... and for me this is the fun part. I kind of have a three step process within this. Research synthesis is the process of taking research and turning it into revenue generating design insights that you can test on your site.
So, the first part is you try and identify places where you're leaking revenue, or opportunities for improvement. Let's say you run a heat map and a lot of people are just beelining to the about page and then they bounce off. That happens a lot, right? You can speculate as to why that's happening, right? So you identify the problem. That's the first part.
The second thing is, you come up with an inference as to why that might be happening, right? People are doing that because it wasn't expressed on the product page. That might be one thing. It might be because they're just showrooming and they're trying to go to Amazon to buy your thing. That's another possible speculation. It's a little bit harder to address. If you come up with the answer being I don't know, you go back to step one and do a little bit more research. And you figure out, okay, well, maybe I need to run a usability test where I go to usertesting.com and I get somebody to vocalize their internal monologue as they're going through trying to make a dummy purchase on this site. Something like that.
No matter what, once you get to the point where you have enough research to have a hunch about it, the third step is coming up with a design that addresses the hunch. I know this is easy to say as a designer, but for me this is the very easy part, because I already have some degree of clarity about what the thing is, and I've come to a consensus, maybe with the rest of my team, about why it is that way. Because that speculation, that's what it is. You're coming to an inference about it. You're trying to make a conclusion on it, and that's scary and possibly unsubstantiated. Once you get to a design solution ... Usually once you have the guess, the design solution kind of naturally falls out, right?
So in this case I might add an assembly guide, or a little bit of that about-page content on the actual product page, and I would address it in that way. That's something that I test, and I determine if it increases the add-to-cart rate. The goal is to get people to kind of the next step in the funnel. Of course you're tracking other things like ARPU, AOV, that sort of stuff. All the key metrics that you would be doing.
Michael:
ARPU?
Nick:
Average Revenue Per User. That's basically-
Michael:
I can't believe that I have never heard of that metric.
Nick:
Don't worry about it. It's basically-
Michael:
Those guys.
Nick:
Other people that ... Yeah. ARPU, he's my friend. Yeah, he ... No. Check his blog.
What was I saying about that? So basically you take the whole ... Everybody that hits this page and divide it by the amount of revenue that that page generates. Or it's the other way around. You divide the amount of revenue by the amount of people.
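As a worked example of that definition, here's a minimal Python sketch computing ARPU, with AOV alongside it for contrast; the visitor, order, and revenue figures are hypothetical.

```python
# Hypothetical figures for one product page over a month.
visitors = 20_000   # everybody that hit the page
orders = 600        # people who actually bought
revenue = 45_000.0  # dollars generated by the page

arpu = revenue / visitors  # average revenue per user (every visitor)
aov = revenue / orders     # average order value (buyers only)

print(f"ARPU: ${arpu:.2f}")  # $2.25 per visitor
print(f"AOV:  ${aov:.2f}")   # $75.00 per order
```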
Michael:
I know why. Because it's a four letter acronym man. I only remember three letter acronyms. ARPU.
Nick:
Yeah. Those TLAs, man. You're getting all of these metrics back and then you're trying to figure out what the impact of the design decision was.
Something you may be noticing in all of this is that one-ninth of the process is the A/B test, right? It's not about the actual test, even though that is the sexiest thing and usually why you're hiring somebody like me.
A/B testing is the tool, right? Like, if you're building a house, it would be like focusing on the hammer as the really cool thing and not all of the materials and process and blueprints that are necessary to get to the point where you're using the hammer effectively to create a house that won't collapse or leak. I think that eventually people will kind of clue into this process but for me that's kind of how I follow it.
Michael:
That's really insightful. I used to build houses, or I was going to build houses as my career, and I actually really love framing, but I bailed on that. But that whole idea of focusing too much on the tool is so important, because so much of the time it's like, what's the easiest solution? Is it to use this tool? Is it not? It's so easy to get budget for using some fancy acronym.
What is it this year? Its AI. There's a lot of companies that are like, we need to be doing something in AI. Why do we need to be doing something in AI? What problem are we actually solving?
So I think honestly, I think a lot of the reason why people who actually understand anything about math and statistical analysis get into conversion rate optimization and these other disciplines is because it feels like, oh, we're going to have numbers, we're going to be able to use these numbers.
What I find is that it usually hits a wall. Conversion rate optimization programs usually get started and then they just kind of peter off and it never goes anywhere. Which, I feel, is leaving so much on the table. You know? If you think about it, say you redo your website and it looks great. Chances are there are a lot of things that, if you'd make slight adjustments, would really fall into place. People would understand it so much better. I'm kind of wondering if that's one of the problems that you solve, helping people not have to feel like their conversion rate optimization is going nowhere. Is that-
Nick:
Sometimes. Sometimes. I mean this morning I gave a client a report that was basically like, here are three tests. They were all inconclusive. I don't like giving bad news. It's even worse to have inconclusive tests than to have outright failing tests because inconclusive tests teach you functionally nothing other than we probably shouldn't do that approach.
A lot of it is, you have a block of marble and you're trying to carve the David out of it. Okay, then carve out everything that's not the David. That seems very counterintuitive. At the same time you're doing that and trying to convey the results of these tests ... Trying to convey the mindset shift that's necessary to think about it in a truly research- and design-driven way, right?
A lot of people hire me, at least, because they want their conversion rate to go up and I get it, I understand that. That is why I'm here. Hopefully after a certain amount of time your conversion rate will go up. If not, I should be firing myself. It's not just that, you have to kind of come in a different direction from what you're necessarily thinking. It's not a situation where you're testing to settle a debate internally. That is not going to make your conversion rate go up. You're not testing because an agency came up with a comp and you think it's interesting. That's not a good use of my time. It's not a good use of anyone's time.
Testing time is finite because you have one page, usually, and you're testing that page and it's load-bearing on the rest of your funnel, right? I know that there's a lot of pages in your funnel, but you can only test a store page at one time. You can only test a product page at one time. You're running into a situation where if you're wasting test time on this, just wheel-spinning that's not research driven, you're wasting time on several fronts, right? It's not even just that you're paying a lot of money for an A/B testing tool, but also other people that are competitors are doing this right. They're going to eat your lunch eventually.
Michael:
I have two practical questions for you. One is, I find that ... I have a client where they are amazing at conversion rate optimization. They've taken several years and methodically CRO tested every single part of their website with the result that their website looks horrendous. Now, we're familiar with this problem because-
Nick:
The frankentest.
Michael:
Yeah. How do you address that? Because it seems actually like a lot of people feel like there's this situation where you can either do the thing that's the best for conversions by having flashing yellow banners all over the place to get people to sign up for the email or whatever it is-
Nick:
Please don't do that.
Michael:
Yeah. Or having a pretty website.
Nick:
So, those aren't necessarily mutually exclusive. I think the second question is the difference between beauty and conversion driven design. I'll address that in a moment. Let's talk about the frankentest for a moment.
Part of researching your test ideas is so that you can understand what battles to be fighting, right? Another thing about A/B testing in particular is that you can't cheat statistics. So if you have a test that wins at 55% confidence, that actually tells you almost nothing. So it's not that the neon yellow background won, it's that you had slight noise in your sampling. You need to be getting wins or losses with 95% and up confidence. Even that is conservative for some of my clients. I run most of my biggest clients to 99% confidence, because it's a matter of two more days of testing. That gives us more certitude in what we're doing.
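To show what those confidence thresholds mean in practice, here's a minimal Python sketch of a two-proportion z-test on hypothetical A/B numbers; the visitor counts and conversion figures are made up, and most A/B testing tools do this math for you.

```python
# Minimal two-sided, two-proportion z-test for an A/B test readout.
from statistics import NormalDist

def ab_confidence(conv_a, n_a, conv_b, n_b):
    """Return the confidence (1 - p-value) that variants A and B truly differ."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)                   # pooled conversion rate
    se = (pooled * (1 - pooled) * (1 / n_a + 1 / n_b)) ** 0.5  # standard error
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))               # two-sided
    return 1 - p_value

# Hypothetical: 4,000 visitors per variant, 3.2% vs 3.9% conversion.
print(ab_confidence(128, 4_000, 156, 4_000))  # ~0.91 -> below 95%, keep the test running
```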
Within this entire consideration is you chose to test the background because you hadn't tested it before. That's not what your customers are telling you. Nobody gives a crap about your background, right? They don't care about the individual elements. They care about overall what it is you're trying to solve. They care about probably the text more than anything. In my experience, they care about the usability and the functionality of your cart. They care about the ability to pay you well and the ability to get free shipping and other incentives. That sort of stuff. That's the kind of things that you need to be testing. You can do that on a pretty website. If you are finding yourself running out of test ideas, the answer is to research more and not test things that don't matter. That's my take on that.
As far as the tension between data and beauty and functionality, I mean I even wrote about this on my mailing list a couple weeks ago ... But there's a famous anecdote, in like 2009 or 2010, where Doug Bowman, who was very high up in design at Google, quit and became the design director at Twitter. He was reporting directly to the CEO ... whomever it was at the revolving door at the time there. He wrote a thing about why he quit Google, and one of the big reasons was that they ran an A/B test on 42 different shades of blue for the primary link color. He said it just drove him batty, right?
The problem is that both Google and Doug Bowman are right. Google is right to be doing that because that will probably set the link color into perpetuity and frankly Google gets enough data that they can run a 42 variation A/B test, right? Doug Bowman is right because if you want to run a classical design practice, that's not the way to go about doing it, right? If you care about that kind of beauty and functionality.
Fortunate thing for you, dear listener, is you probably don't work at Google. You probably don't work at a business large enough to run 42 branch A/B tests. You should be trying to embrace that. So you should be having a style guide in place. Be flexible with it of course. It might be that data doesn't back up having a low contrast ratio or poor functionality or usability and stuff like that.
I've personally found that I can have my cake and eat it too on this front. The best way to go about doing that is ... shocking no one, through researching customer motivations and realizing that changing stuff to ugly nonsense and doing things that are predatory from a UX standpoint, like putting a huge modal in or a huge blinking banner ... It's like a sugar high. You get a short-term boost from it, and then in the long term it doesn't actually benefit your business, and it results in a reduction of credibility, or people kind of abandon that mechanism that you've tried as a short-term fix because all of the evil people have glommed on to something else. So you look like yesterday's news. I've seen that quite a bit. There are a lot of structural disincentives to do that. If you're gross and unethical, by all means throw a bunch of nonsense on your page and do that. I can't stop you. It makes it unlikely that I'll work with you.
Michael:
Yeah. But there was this one where we spent months and months and we brought in ... actually a mutual friend of ours, Rob Williams. We collaborated on this project. He did the most beautiful design I've ever seen on any website. They trashed it. And I just felt like, wow, you're vandalizing your own website. Why? To get a few conversion points? Ultimately you're damaging your brand. I mean, they were trying to be a luxury brand and I was like, oh my gosh, you guys don't get it.
Nick:
I mean, especially with luxury brands, or brands that are meant to communicate with anybody my age or under, you're hurting yourself really severely if it doesn't play on Insta. I'm dead serious. If you're going to go and trash it, that's just a sign that designers should be on retainer to make sure that that doesn't happen. Then they get to be the fun ruiner by constantly defending themselves against ... The problem is probably a toxic culture in that situation, where they're constantly averse to design, and that'll bite them eventually.
Michael:
It will. The funny thing is ... the sad thing, I should say, is they won't even know it. They'll be wondering after five years ... In a similar way that we all found out 10 years ago with black hat SEO. Don't do black hat SEO. It will kill your business eventually.
Nick:
Google will make very sure of killing your business, right? Yeah, it'll come back to you.
Michael:
Very insightful, thank you. The second thing is, how can people learn more ... How can they do CRO right? Maybe some of them will want to work with you. I hope they do. I've seen the results that you're able to drive and I definitely want everybody listening to go and see what you offer.
Nick:
You can go to my website at draft.nu. If you want to hire me, tack /revise onto that. That's basically the quarterly A/B testing that I do. That's draft.nu/revise. If you never want to actually see my face, you can go and buy my course called The A/B Testing Manual at abtestingmanual.com. That will teach you everything you need to know and spares you the expense of having to work with me.
Michael:
Although if you go through that course, you're going to be like, wow, whoever wrote this is really smart, so I should probably hire him.
Nick:
That's probably likely. That's happened.
Michael:
Well cool, Nick. There's so many questions I have but in the interest of time we'll have to do this again.
Nick:
I would love to.
Michael:
Here, we're going to include all of the notes in the show notes. Do you think we could get a copy of the email you mentioned that you put out?
Nick:
Yeah. I can definitely provide that.
Michael:
Alright, cool. I'll run a link to that so that everybody can learn from that. Everybody, this has been great. In the show notes you'll find everything. Just go to ecommerceqa.com for those show notes. We've got a little something ... Speaking of research and all that, we want to understand what all of the listeners' pain points are right now in Ecommerce. So if you're running an Ecommerce store, or you're thinking about doing it, what we've done is we've put together a little survey, and we're just going to share all of the results with everybody who signs up. We're not trying to push something. We just want to understand what you would like to hear us talk about more on the show. We talk about lifestyle stuff, we talk about consumption psychology, we talk about really practical Ecommerce, strategic and practical matters. Your Ecommerce matters are important. To get to that survey, what we want you to do is go to sellry.com/survey. S-E-L-L-R-Y. Two L's in there, dot com forward slash survey. If you have any questions for us or Nick, send an email to podcast@sellry.com on our end. Or Nick, do you like people to email you?
Nick:
Yeah. They can email me at office@draft.nu and that goes to everybody in the company, which is a very small company, and I'll answer it, or somebody extremely qualified will also answer.
Michael:
We're so honored that you could join us today, Nick. I've been following you personally for a really long time. I've learned so much from you and now I've just learned a whole lot more. Thank you so much.
Nick:
Thank you so much for the kind words, I really appreciate it.
Michael:
Yeah, absolutely. Alright, everybody. That's a wrap. Talk to you later.
34 jaksoa
Manage episode 182473117 series 1401632
Shownotes:
Tell Us About Your Ecommerce Pains
Email Nick: office@draft.nu
Email Us: podcast@sellry.com
Transcripts:
Michael:
Hey everybody. It's Michael, one of your hosts and welcome to Ecommerce QA. This is the show where store owners, directors of Ecommerce, and Ecommerce managers can stay up to date on the latest and greatest in Ecommerce. Today we are joined, very happily, by Nick Disabato, founder of Draft. Nick, welcome.
Nick:
Happy to be here. Thank you so much for having me.
Michael:
Absolutely. Well, this is the second time, a little bit of déjà vu going on, because was it last week? We started ... We actually were recording this episode, and the audio went all crazy, so now we have to outdo ourselves.
Nick:
Software is terrible.
Michael:
Yeah, yeah. Who writes the software anyways? So Nick, just in case nobody's heard of you ... Which everyone being on the show, if you haven't heard of Nick, you'll be very happy to hear what he has to say. I consider him sort of like the modern godfather of CRO, in terms of strategy and just the thinking behind CRO.
What is CRO? CRO is conversion rate optimization, and Nick just has a wealth of wisdom to share with us today. Thank you, Nick, again, for joining us. Where should we start? Maybe we could start at the beginning. Why did you get into conversion rate optimization?
Nick:
I have a design background. I mostly do UI/UX design, interaction design. That sort of stuff. I thought about what is the thing that I could run in my business, that is kind of in the Venn diagram overlap of stuff I can do on a monthly retainer, and stuff that is still kind of UX-y in nature. Still trying to improve software to make it easier to use. That sort of stuff.
I settled on A/B testing, because it's something that is kind of an ongoing process, and it's not something that has to be kind of a self-contained deliverable, right? You're not building a wire frame a month, right? That doesn't really work for people. You're not doing IA research every month, or at least nobody would pay for it.
So that, I launched about four years ago now and it's kind of evolved into more of an end-to-end, research-driven, CRO engagement, where I am talking to your customers, and taking a look at your analytics, and heat maps, and surveying people, and doing everything that I can to understand what their motivations are, and what their actual behavior and practices. I use that to drive new A/B testing insights, so there's very few, if any, call to action button collar tests, or other type of things that you see on Get Rich Quick case study type scenarios.
Michael:
Can you give me a couple of examples of some really cool A/B tests that you've done?
Nick:
One of my favorites is for a ... They're like an everyday carry company called Key Smart, and they used to have like five different models of Key Smarts. Basically like a Swiss Army knife for your key chain and I ran a test that paired back their entire site to one model and ... Turned out inconclusive. It didn't move the needle in either direction. We didn't increase their conversion rate. We didn't decrease their conversion rate. Just basically everything stayed the same, but people were buying one model of Key Smart instead of varying shares of five different Key Smart's. The consequence of that was that they were able to remove those products from their offerings, from their product line and they reduced manufacturing and shipping expenses by something like a third. It was some totally preposterous amount. So it wasn't just, you know, the lesson not of that the CRO and A/B testing, you're measuring the economic impact of a designed decision. And that can often result in the increase of conversion rate. But if your reducing expenses you're still getting a win for the business. The goal is profit right? And so you can do that by increasing revenue the most common part of CRO or also decreasing expenses, so that one kind of surprising thing I've done recently.
Michael:
I love this so much because its really easy for companies to think of, in a sandbox way, about digital and about Ecommerce specifically, it's like, oh that's online but no. In this case you were looking at something that was really a very valuable insight about the customer, essentially, which is people just wanted the product they didn't actually care about the color of the product. Driving down the operational and manufacturing costs around that. That's fantastic.
Have you had any other kind of outside the box cool case studies or experiences like that where you're expecting to maybe drive something that was more just digital but then you found this deeper insight about the company?
Nick:
There was one for the Wire Cutter where ... I can't believe I'm even citing this test but it was a call to action test of all things. It's like, one of the only ones I've run that I've seen work. But it was what you're talking about, kind of that deeper insight and so there if don't know the Wire Cutter they're basically tech blogger currently owned by the New York Times. I was working with them when they were an independent company and they had basically, if you've never seen them, they're basically consumer reports for millennials because consumer reports incorrectly fire walled all their stuff.
The Wire Cutter's business model's oriented around ... We do hundreds of hours of research to find the best thing, much like consumer reports does, but then there's the link out to Amazon and you buy it and get affiliate kick backs and so I think something like 80% of the revenue at one point was affiliate kick backs at least that I know of. A high share. So, we changed call to action button callers so instead of them all being one color, they were like Amazon orange, Walmart blue, Apple Store warm gray. You know, that sort of stuff.
The deeper insight is that people look at that and believe its more trustworthy because it looks like its lightly branded with the stores branding. But it still had Wire Cutter futura in it. It was the seam between the Wire Cutter and the third party vendor. That ended up fairing extremely well and encouraged ... The biggest metric that we have was clicks out in that situation because we don't know whether somebody's bought something on Amazon but we know that it roughly correlates, right? We know that people are able to bee-line to that and they might wish list it and you get the affiliate kick back later, that sort of thing. And it'll bore out in the final numbers.
Michael:
That's great. Good. I wish we could ask you what the final numbers were but I won't ask that.
Nick:
No.
Michael:
So you said something interesting, which was ... You almost didn't want to mention it because it was a button caller test. I know, we've talked about button caller tests because that's like the prototypical, oh we changed our button caller and our revenue doubled, right?
Nick:
Yeah. That's what I was mentioning about. I was kind of subtweeting these like ... People get A/B testing ideas from other case studies because they have a sense that it's what works, but that's not how I go about finding revenue generating design decisions and I found that really the only way to do it is by actually researching and people get very bored or allergic to the idea of research cause you associate it with being at a library and looking at an encyclopedia that exists for some reason and writing a five paragraph essay but for me research is just looking at where there's revenue leakers and listening to customers about their motivations. These are practices that you should be doing in your business no matter what, right? So I often come in and I say, hey A/B testing, CRO, here's a bunch of sexy stuff and then I make you eat your vegetables.
Michael:
So talk to us about that because you've ... As you mentioned before there's two types of research that this involves. Qualitative and Quantitative. I think most people on the call know what that means. Qualitative has to do with the quality of the difference or how something feels, looks, that sort of thing. Maybe simply something to measure, or maybe not, and then quantitative would be like the classic number of clicks, number of this, number of that revenue and so on. How does that work on the research side?
Nick:
So for me, quantitative is more like what you're typically doing with Google Analytics or with Heat Maps or something like that so you have a certain number of people that are clicking on a thing or going in a certain path and that sort of thing. So, there are numbers you get out of it, right?
Your conversion rate is a quantitative insight. The share of people that are going from your home to your pricing to your product to your cart to whatever. That funnel is a quantitative insight. Now the share of traffic that you're getting from your Facebook ads is a quantitative insight. I'm taking all of those things and I'm thinking about what it is people are actually doing, right? It's the what and the how.
The qualitative insights are the why, right? It's the more squishy things that are what drives a person. What is the value proposition? What are we saying to them, right?
You need roughly equivalent shares of both of those things. It's very easy ... Like I'm a nerd and I was a math major. It's very easy to just retreat to the numbers for a lot of businesses that I work with. It's the same aversion to getting on the phone and actually having a conversation with somebody, especially a stranger you've never met before. Qualitative responses can include, yes I'm actually literally getting on a Skype call or Facetime call with somebody or I'm asking them a bunch of questions but it can also mean doing post purchase surveys or life cycle emails and mining responses for what's going on. It can mean talking to your support team and understanding where the pain points are with certain things.
In Key Smarts case we got a lot of insight out of the support team dealing with like assembly. We didn't have an assembly guide. So we put that pretty front and center on the product page and it sold more because people felt comfortable being able to actually assemble the dang thing. Fewer people were returning them later saying I don't know how to work with this thing. We inserted assembly guides in the actual physical product so when you get it in the mail you get a little business card that shows you how to do it.
Those are things that they're the practical considerations like that and then there's kind of broader higher level things like, how are we communicating as a brand? What is our voice and tone? What problem are we specifically stating that we solve? A lot of people come in the door thinking that they know the answer to that and we never end up in the same place after qualitative research. We always end up a little bit further afield or maybe with a refinement of those insights and we end up coming up with something that works better not only for the business but also for the customer because then they feel more comfortable buying the thing. It's not a matter of manipulating the person. It's a matter of making them feel more empowered by the product that you have to offer.
Michael:
I think its interesting that in Ecommerce, a lot of the decisions are made by very scrappy people. I love scrappy people. I would most of the time consider myself a scrappy person. What I mean by that is just getting there and trying things. And throwing spaghetti at the wall and seeing what sticks and saying, oh that didn't work, I'm gonna start throwing meat balls at the wall. Oh, those stick better. Great. But I think what were running into with Ecommerce is that it's becoming the men verses the boys and to really compete you can't just guess anymore. You actually have to have data like your saying. Let's say that somebody wants to get started and they ... I mean, they're probably gonna want to just hire you for your excellent services but lay it out for us. What are the three steps lets say to get started in conversion rate optimization?
Nick:
So the first thing is get everything configured and fix all the bugs. That's the first step. Usually your analytics have been gathering dust and cobwebs or ... That's in the worst case. The next worst case is you have analytics but they're not configured correctly. If you go into your goals, your funnels or something like that, they're busted. After that you take a look at your analytics and you realize that your conversion rate on mobile is one third of what it should be and even that is one third of what it should be compared to desktop. So you have a lot of bugs to fix, right?
There was one client I worked with where ... I didn't even run an A/B test for this. I reduced their page rate by something like a third and their page load time was initially like 16 seconds. Some preposterously high number. Right. Right. It was some preposterously high number and we got it down to like 9, which is still bad. Like that's not okay but its not like stage and intervention level, it's just, this is bad. Their conversion rate went up by 9.12%, right? I hadn't run a single A/B test yet.
You've got to prepare the sight and do the one off optimization stuff that is just unsexy, dumb, scrappy grunt work that you know you should be doing and you're not doing. If it requires you hiring a really fancy consultant in order to do that, fine. That's how I have a job in part but also I wish people would just do it. You know? I have a million how to guides out there that I've written and I've cited in other sites like Conversion Excel and Copy Hackers that do this stuff.
The first thing is prepare the site and get it working correctly and the best thing about that, not only are you making ... more money, but hopefully your conversion rate goes up which means you'll be able to get to statistically significant A/B tests a little bit better.
Michael:
Getting to a good base line ...just a crazy point. Reasonable point to start with.
Nick:
Yeah. You think you can skip this step. A/B testing is how the ... It's not how the bad get good its how the good get better, right?
Michael:
I love that.
Nick:
So you have to get to a point where at the base line ... Best practices in order to be doing this. The second thing is to start researching. You have to do kind of a blend of quantitative and qualitative research as I mentioned. The easiest and dumbest ways to do this are both by looking at your analytics, taking a look at peoples funnels, that sort of thing, but also running Heat Maps. Kicking off a Heat Map involves signing up for an account with lets say, hotjar.com and entering a java script tracking snippet. Then five minutes of typing in the right URL to be like, do Heat Maps here. Then you have Heat Maps, congratulations. The other thing that I like doing is adding in a post purchase survey. So if you're on Shopify, there are a ton of plug ins that allow you to do this that on the confirmation page they say okay, great. What was the last thing that held you back from making this purchase or how do you feel about this transaction? Something like that. Something open ended.
Michael:
I love that.
Nick:
Then you just sit there and embed a Google form. Something as stupid as that.
Michael:
You can do Hot Jar too for that.
Nick:
You can do Hot Jar for that as well, yeah. It's a little more sophisticated to be doing that. I do the Hello World thing. I embed a Google form or a WooHoo form because it takes me 10 minutes. If you're that lazy or busy or whatever, do that and just get the outcome there. Now, congratulations you have a blend of quantitative ... Heat map and qualitative ... I'm getting actual free text responses back from my customers insights. I wouldn't stop there. That's the base line there, right? You start the researching.
Then the third thing you're doing is, once you get the research back there's a process called synthesis where you're taking the research ... and for me this is the fun part. I kind of have a three step process within this. Research synthesis is the process of taking research and turning it into revenue generating design insights that you can test on your site.
So, the first part is you try and identify places where you're leaking revenue or opportunities for improvement. Let's say you run a Heat Map and a lot of people are just bee-lining to the about page and then they bounce off. That happens a lot, right? You can speculate as to why that's happening, right? So you identity the problem. That's the first part.
The second thing is, you come up with an inference as to why that might be happening, right? People are doing that because it wasn't expressed on the product page. That might be one thing. It might be because they're just show rooming and they're trying to go to Amazon to buy your thing. That's another possible speculation. It's a little bit harder to address. If you come up with the answer being I don't know, you go back to step one and do a little bit more research. And you figure out, okay well, maybe I need to run a usability test where I go to user testing.com and I get somebody to vocalize their internal monologue as their going through trying to make a dummy purchase on this site. Something like that.
No matter what, once you get to the point where you have enough research to have a hunch about it, the third step is coming up with a design that addresses the hunch. I know this is easy to say as a designer but for me this is the very easy part because I already have some degree of clarity about what it is the thing is and I've come to a consensus maybe with the rest of my team about why it is that why. Because that speculation, that's what it is. You're coming to an inference about it. You're trying to make a conclusion on it and that's scary and possibly unsubstantiated. Once you get to a design solution ... Usually once you have the guess, the design solution kind of naturally falls out, right?
So in this case I might add an assembly guide or a little bit of an about stuffing on the actual product page and I would address it in that way. That's something that I test and I determine if it increases the add to cart rate. The goal is to get people to kind of the next step in the funnel. Of course your tracking other things like ARPU, AOB, that sort of stuff. All the key metrics that you would be doing.
Michael:
ARPU?
Nick:
Average Revenue Per User. That's basically-
Michael:
I can't believe that I have never heard of that metric.
Nick:
Don't worry about it. It's basically-
Michael:
Those guys.
Nick:
Other people that ... Yeah. ARPU, he's my friend. Yeah, he ... No. Check his blog.
What was I saying about that? So basically you take the whole ... Everybody that hits this page and divide it by the amount of revenue that that page generates. Or it's the other way around. You divide the amount of revenue by the amount of people.
Michael:
I know why. Because it's a four letter acronym man. I only remember three letter acronyms. ARPU.
Nick:
Yeah. Those TLAs, man. You're getting all of these metrics back and then you're trying to figure out what the impact of the design decision was.
Something you may be noticing in all of this is that one ninth of the process is in A/B test, right? It's not about the actual test even though that is the sexiest thing and usually why you're hiring somebody like me.
A/B testing is the tool, right? Like, if you're building a house, it would be like focusing on the hammer as the really cool thing and not all of the materials and process and blueprints that are necessary to get to the point where you're using the hammer effectively to create a house that won't collapse or leak. I think that eventually people will kind of clue into this process but for me that's kind of how I follow it.
Michael:
That's really insightful. I used to build houses or I was going to build houses as my career and actually really love framing but bailed on that. But that whole idea of focusing too much on the tool is so important because so much of the time it's like, what's the easiest solution? Is it to use this tool? Is it not? It's so easy to get budget for using some fancy acronym.
What is it this year? Its AI. There's a lot of companies that are like, we need to be doing something in AI. Why do we need to be doing something in AI? What problem are we actually solving?
So I think honestly, I think a lot of the reason why people who actually understand anything about math and statistical analysis, conversion rate optimization and these other disciplines is because it feels like, oh we're going to have numbers, we're going to be able to use these numbers.
What I find is that it usually hits a wall. Conversion rate optimization programs usually get started and then they just kind of peter off and it never goes anywhere. Which, I feel is leaving so much on the table. You know? If you think about it, redo your website lets say and it looks great. Chances that there's a lot of things that if you'd make slight adjustments, things would really fall into place. People would understand it so much better. I'm kind of wondering if that's one of the problems that you solve is helping people not have to feel like their conversion rate optimization is going nowhere. Is that-
Nick:
Sometimes. Sometimes. I mean this morning I gave a client a report that was basically like, here are three tests. They were all inconclusive. I don't like giving bad news. It's even worse to have inconclusive tests than to have outright failing tests because inconclusive tests teach you functionally nothing other than we probably shouldn't do that approach.
A lot of it is, you have brick of marble and you're trying to carve the David out of it. Okay, then carve out everything that's not the David. That seems very counterintuitive. At the same time you're doing that and trying to convey the results of these tests ... Trying to convey the mind set shift that's necessary to think about it in a truly research and design driven way, right?
A lot of people hire me, at least, because they want their conversion rate to go up and I get it, I understand that. That is why I'm here. Hopefully after a certain amount of time your conversion rate will go up. If not, I should be firing myself. It's not just that, you have to kind of come in a different direction from what you're necessarily thinking. It's not a situation where you're testing to settle a debate internally. That is not going to make your conversion rate go up. You're not testing because an agency came up with a comp and you think it's interesting. That's not a good use of my time. It's not a good use of anyone's time.
Testing time is finite because you have one page, usually, and you're testing that page and its load bearing on the rest of your funnel, right? I know that there's a lot of pages in your funnel but you can only test a store page at one time. You can only test a product page at one time. You're running into a situation where if you're wasting test time on this, just wheel spinning that's not research driven, you're wasting time on several fronts, right? It's not even just that you're paying a lot of money on an A/B testing tool but also other people that are competitors are doing this right. They're going to eat your lunch eventually.
Michael:
I have two practical questions for you. One is, I find that ... I have a client where they are amazing at conversion rate optimization. They've taken several years and methodically CRO tested every single part of their website with the result that their website looks horrendous. Now, we're familiar with this problem because-
Nick:
The frankentest.
Michael:
Yeah. How do you address that? Because it seems actually like a lot of people feel like there's this situation where you can either do the thing that's the best for conversions by having flashing yellow banners all over the place to get people to sign up for the email or whatever it is-
Nick:
Please don't do that.
Michael:
Yeah. Or having a pretty website.
Nick:
So, those aren't necessarily mutually exclusive. I think the second question is the difference between beauty and conversion driven design. I'll address that in a moment. Let's talk about the frankentest for a moment.
Part of researching your test ideas is so that you can understand which battles to fight, right? Another thing about A/B testing in particular is that you can't cheat statistics. If you have a test that wins at 55% confidence, that actually tells you almost nothing. It's not that the neon yellow background won; it's that you had slight noise in your sampling. You need to be getting wins or losses at 95% confidence and up. Even that is conservative, and for some of my clients I go further: I run most of my biggest clients to 99% confidence, because it's a matter of two more days of testing and it gives us more certitude in what we're doing.
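To make those confidence numbers concrete, here is a minimal sketch of a two-proportion z-test, the kind of calculation many A/B testing tools perform under the hood. The traffic and conversion figures are hypothetical, not from any client mentioned in the episode.

```python
# Minimal two-proportion z-test for an A/B test readout.
# All numbers below are made up for illustration.
from math import sqrt, erf

def ab_test_confidence(control_visitors, control_conversions,
                       variant_visitors, variant_conversions):
    """Return two-sided confidence that the variant differs from control."""
    p1 = control_conversions / control_visitors
    p2 = variant_conversions / variant_visitors
    # Pooled conversion rate under the null hypothesis of no difference.
    pooled = (control_conversions + variant_conversions) / \
             (control_visitors + variant_visitors)
    se = sqrt(pooled * (1 - pooled) *
              (1 / control_visitors + 1 / variant_visitors))
    z = (p2 - p1) / se
    # Two-sided p-value via the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return 1 - p_value  # e.g. 0.95 means 95% confidence

# Example: 10,000 visitors per branch, 300 vs. 330 conversions.
print(ab_test_confidence(10_000, 300, 10_000, 330))  # roughly 0.77
```

With these numbers the test only reaches about 77% confidence, well short of a 95% threshold, so it would be reported as inconclusive even though the variant's raw conversion rate looks 10% better.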
Part of this whole consideration is that you chose to test the background because you hadn't tested it before, not because your customers told you to. Nobody gives a crap about your background, right? They don't care about the individual elements. They care about what it is you're trying to solve overall. They probably care about the text more than anything. In my experience, they care about the usability and functionality of your cart. They care about being able to pay you easily, and about free shipping and other incentives. That sort of stuff. Those are the kinds of things you need to be testing, and you can do that on a pretty website. If you find yourself running out of test ideas, the answer is to research more, not to test things that don't matter. That's my take on that.
As far as the tension between data and beauty and functionality, I even wrote about this on my mailing list a couple of weeks ago. There's a famous anecdote from around 2009 or 2010 about Doug Bowman, who was very high up in design at Google, then quit and became the design director at Twitter, reporting directly to the CEO ... whoever it was in the revolving door at the time. He wrote a piece about why he quit Google, and one of the big reasons was that they ran an A/B test on 42 different shades of blue for the primary link color. He said it just drove him batty, right?
The problem is that both Google and Doug Bowman are right. Google is right to be doing that, because the result will probably set the link color in perpetuity, and frankly Google gets enough data that they can run a 42-variation A/B test, right? Doug Bowman is right because if you want to run a classical design practice, that's not the way to go about it, right? Not if you care about that kind of beauty and functionality.
The fortunate thing for you, dear listener, is that you probably don't work at Google. You probably don't work at a business large enough to run a 42-branch A/B test, and you should embrace that. Have a style guide in place. Be flexible with it, of course; it might be that the data doesn't back up having a low contrast ratio, or poor functionality or usability, that sort of thing.
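As a rough illustration of why a 42-variation test is out of reach for most stores, here is a sketch of the standard sample-size formula for comparing two conversion rates, with a Bonferroni correction applied when many variants share one test. The baseline rate, lift, and power values are assumptions for the example, not figures from the episode.

```python
# Rough sample-size sketch: why 42-variant tests need Google-scale traffic.
# Assumed inputs (hypothetical): 3% baseline conversion rate, a 10% relative
# lift worth detecting, 95% overall confidence, 80% power.
from math import sqrt
from statistics import NormalDist

def visitors_per_branch(baseline, relative_lift, variants,
                        alpha=0.05, power=0.80):
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    # Bonferroni: split the overall alpha across every variant-vs-control comparison.
    alpha_per_test = alpha / max(variants - 1, 1)
    z_alpha = NormalDist().inv_cdf(1 - alpha_per_test / 2)
    z_beta = NormalDist().inv_cdf(power)
    pooled = (p1 + p2) / 2
    n = ((z_alpha * sqrt(2 * pooled * (1 - pooled))
          + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p2 - p1) ** 2
    return int(n) + 1

print(visitors_per_branch(0.03, 0.10, variants=2))   # plain A/B: ~53,000 per branch
print(visitors_per_branch(0.03, 0.10, variants=42))  # 42 shades of blue: ~113,000 per branch
```

Under these assumptions a plain A/B test already needs tens of thousands of visitors per branch, and the 42-variant version needs roughly twice that per branch across 42 branches, which adds up to several million visitors in total. That is Google-scale traffic, not a typical store's.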
I've personally found that I can have my cake and eat it too on this front. The best way to do that is, shocking no one, through researching customer motivations, and realizing that changing stuff to ugly nonsense and doing things that are predatory from a UX standpoint, like throwing in a huge modal or a huge blinking banner, is like a sugar high. You get a short-term boost from it, and then in the long term it doesn't actually benefit your business. It results in a loss of credibility, or people abandon the mechanism you've tried as a short-term fix because all of the evil people have glommed on to something else, so you look like yesterday's news. I've seen that quite a bit. There are a lot of structural disincentives to do that. If you're gross and unethical, by all means throw a bunch of nonsense on your page. I can't stop you. It just makes it unlikely that I'll work with you.
Michael:
Yeah. There's a project where we spent months and months, and we brought in ... actually a mutual friend of ours, Rob Williams. We collaborated on this project. He did the most beautiful design I've ever seen on any website. They trashed it. And I just felt like, wow, you're vandalizing your own website. Why? To get a few conversion points? Ultimately you're damaging your brand. I mean, they were trying to be a luxury brand, and I was like, oh my gosh, you guys don't get it.
Nick:
I mean, especially with luxury brands, or brands that are meant to speak to anybody my age or under, you're hurting yourself really severely if it doesn't play on Insta. I'm dead serious. If you're going to go and trash it, that's just a sign that designers should be on retainer to make sure that doesn't happen. Then they get to be the fun ruiner by constantly defending it against ... The problem is probably a toxic culture in that situation, where they're constantly averse to design, and that'll bite them eventually.
Michael:
It will. The funny thing is ... the sad thing, I should say ... is they won't even know it. They'll be wondering after five years ... In a similar way that we all found out 10 years ago with black hat SEO. Don't do black hat SEO. It will kill your business eventually.
Nick:
Google will be very sure to kill your business, right? Yeah, it'll come back to you.
Michael:
Very insightful, thank you. The second thing is, how can people learn more ... How can they do CRO right? Maybe some of them will want to work with you. I hope they do. I've seen the results that you're able to drive and I definitely want everybody listening to go and see what you offer.
Nick:
You can go to my website at draft.nu. If you want to hire me, tack /revise onto that; that's the quarterly A/B testing service I run. That's draft.nu/revise. If you never want to actually see my face, you can go and buy my course, called the A/B Testing Manual, at abtestingmanual.com. That will teach you everything you need to know and spare you the expense of having to work with me.
Michael:
Although if you go through that course, you're going to be like, wow, whoever wrote this is really smart, so I should probably hire him.
Nick:
That's probably likely. That's happened.
Michael:
Well cool, Nick. There's so many questions I have but in the interest of time we'll have to do this again.
Nick:
I would love to.
Michael:
We're going to include all of the notes in the show notes. Do you think we could get a copy of the email you mentioned that you put out?
Nick:
Yeah. I can definitely provide that.
Michael:
Alright, cool. I'll put a link to that so that everybody can learn from it. Everybody, this has been great. In the show notes you'll find everything. Just go to ecommerceqa.com for those show notes. We've got a little something ... Speaking of research and all that, we want to understand what all of our listeners' pain points are right now in Ecommerce. So if you're running an Ecommerce store, or you're thinking about doing it, we've put together a little survey, and we're going to share all of the results with everybody who signs up. We're not trying to push something. We just want to understand what you would like to hear us talk about more on the show. We talk about lifestyle stuff, we talk about consumption psychology, we talk about really practical Ecommerce, strategic and practical matters. Your Ecommerce matters are important. To get to that survey, go to sellry.com/survey. S-E-L-L-R-Y, two L's in there, dot com, forward slash survey. If you have any questions for us or Nick, send an email to podcast@sellry.com on our end. Or Nick, do you like people to email you?
Nick:
Yeah. They can email me at office@draft.nu, and that goes to everybody in the company, which is a very small company, and I'll answer it or somebody extremely qualified will answer.
Michael:
We're so honored that you could join us today, Nick. I've been following you personally for a really long time. I've learned so much from you and now I've just learned a whole lot more. Thank you so much.
Nick:
Thank you so much for the kind words, I really appreciate it.
Michael:
Yeah, absolutely. Alright, everybody. That's a wrap. Talk to you later.