Interview with Drew Seman, VP of Growth at Cro Metrics, on Conversion Rate Optimization for Ecommerce
TRANSCRIPT:
Darius 0:02 Welcome to the Retail Tech Podcast. My name is Darius Vasefi, producer and host of this podcast, and today I am speaking with Drew Seman, VP of Growth Strategy at a company called Cro Metrics. This interview is being recorded on Clubhouse, and it will be published on my website at retailtechpodcast.com in a few days. We will be taking audience questions if anybody would like to ask Drew about what we talk about, maybe in about 15-20 minutes. Welcome, Drew. Well, thanks for having me. So the topic of CRO, conversion rate optimization, is probably in the top three for every marketing and product manager these days, or at least it should be. So I was really interested to learn more about what Cro Metrics does. Maybe we can start with a little background on yourself: how did you get to be at Cro Metrics?
Drew Seman 1:20 Yeah, absolutely. So Cro Metrics is a conversion rate optimization focused agency. We work with about 60-plus clients right now; about half of those are ecom, and half are kind of everything else, B2B SaaS and so on. We work with folks like Allbirds, Bombas, Casper, and Clorox, so a huge range of clients on the ecommerce side of things. And really, at our core, we're focused on CRO, on running A/B tests on client sites to increase conversion rates and ultimately drive impact for them. What that means in practice is we're using a platform like Optimizely or Convert to randomly split site traffic between two experiences, measure that impact, and hopefully see some sort of a lift and create growth for our clients. I joined Cro Metrics two years ago; I'd actually been on the client side working with Cro Metrics for about three years. I was running a team at the Sierra Club, an environmental nonprofit, that was working across all sorts of KPIs, across everything: paid, email, CRO. And ultimately, what I was seeing is that the amount of impact we were getting from CRO was much, much higher than the budget allocation I saw most organizations give it, and I wanted to make the transition to being CRO focused, because this is such an untapped thing. And to your point, at this point it is top three for most folks, but it wasn't a top priority years ago. Right now, it kind of feels like where digital was about 10 years ago, where folks were having to have conversations to convince people that this should be a priority for your organization. We're kind of past that point of "should this be a priority." When we're starting out with prospective clients, more and more what I notice is: I know this should be a priority for us; how do we actually execute this in practice?
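As a rough illustration of the traffic split Drew describes, here is a minimal Python sketch of deterministic 50/50 bucketing by visitor ID. The hashing scheme and names are illustrative assumptions, not how Optimizely or Convert actually implement assignment.

```python
import hashlib

def assign_variant(visitor_id: str, experiment_id: str, split: float = 0.5) -> str:
    """Deterministically bucket a visitor into 'control' or 'variant'.

    Hashing visitor_id together with experiment_id keeps a visitor in the
    same bucket across page loads while splitting traffic ~50/50 overall.
    """
    digest = hashlib.sha256(f"{experiment_id}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # map the hash onto [0, 1]
    return "variant" if bucket < split else "control"

# The same visitor always lands in the same bucket for a given experiment.
print(assign_variant("visitor-123", "homepage-hero-test"))
```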
Darius 3:18 All right. So, I mean, you know, conversion rate optimization might mean different things to different people. Do you find that you have to set the definition with your clients right up front, before moving forward to actually starting work with them?
Drew Seman 3:39 We do. You know, I think a lot of people are maybe coming from a background that is more of a heuristic approach, or research based, more UX-research based, or something along those lines. And certainly we have components of that in what we do. But I think one of the things that we have to get out in front of with our clients in the beginning is this idea of building a culture of experimentation. There's the hope sometimes that, you know, we will just come in and know what all the winners will be, and we'll run tests and have a 100% win rate. But that's not how this works in practice. So it's really about building a culture of experimentation, helping clients understand that this is what it means to test: we have to let a portion of the audience see one thing, of course, and the other portion see the other, and see what works for your company. Because every company has a different model; your audience is different. The reliance on one-time versus subscription revenue is different; the reliance on one large purchase versus, you know, several small purchases throughout the year is very different. All this means that how we actually have to operate, and what actually works with each client, is going to be a very different thing.
Darius 4:41 Okay, so let's say, for example, I was a client and I came to you, and I fit, I guess, your customer profile, and we started talking about how you can help. Let's say we're a retail brand with, you know, an online presence. Can you walk me through some of the questions that you would ask me to try to, you know, determine how to help?
Drew Seman 5:15 Yeah, I think the first thing that we are typically trying to do with clients is what we call a swim lane analysis. We start by doing a deep dive into your analytics, taking a look at your traffic, seeing where the biggest pain points and the biggest impact points are, and then really trying to do some sort of a velocity calculation from there. We've got a lot of data showing us what a likely win rate is for clients in different swim lanes, and that then allows us to see: okay, if we're able to create an expected lift in this swim lane, how much impact are we likely to drive for your business, and is this something that would be impactful for you? So ultimately, our really only qualifier of a client up front is ensuring that there's enough of a baseline of traffic. From there, a lot of it is trying to figure out how quickly we can maximize testing in those core swim lanes with clients, and that's a lot of what we talk about with them. But ultimately, what it comes down to for us is we've done a lot of research that shows it's not as much about the heuristic analysis and all of that work as it is getting the client bought in on the idea that we need to be testing at scale to drive impact. Because when we look across our clients, the biggest decider of how impactful their testing program is isn't how great the ideas were; it's how high their velocity is. So I would say it's much more about getting clients bought into that up front, and making sure that they're aligned to that concept, that culture of experimentation, for us to ultimately be able to succeed with them.
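A rough sense of the "velocity calculation" Drew mentions can be sketched as: expected impact is roughly tests run, times likely win rate, times average lift, compounded on the revenue flowing through a swim lane. The function and every number below are hypothetical illustrations, not Cro Metrics' actual model.

```python
def expected_annual_impact(revenue_through_lane: float,
                           tests_per_year: int,
                           win_rate: float,
                           avg_lift_per_win: float) -> float:
    """Back-of-envelope expected revenue impact from testing in one swim lane."""
    expected_wins = tests_per_year * win_rate
    # Each winning test's lift compounds on the revenue moving through the lane.
    return revenue_through_lane * ((1 + avg_lift_per_win) ** expected_wins - 1)

# Hypothetical: $5M/yr through checkout, 20 tests/yr, 35% win rate, 5% average lift.
print(f"${expected_annual_impact(5_000_000, 20, 0.35, 0.05):,.0f} expected annual impact")
```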
Darius 6:52 Okay, so what would be, like, the minimum traffic that you would be looking at for engaging with a client?
Drew Seman 7:06 I mean, honestly, it just comes down to what your business model is. We have clients that are B2B SaaS clients with very, very low traffic. What I tend to think about is more in terms of: if you had a 5% increase in a specific swim lane on your site, so if you increased your checkout by 5%, would that be a substantive increase for your business? Five to 10% lift tends to be the metric that we use, depending on the swim lane. If it wouldn't be substantive, that may mean you have to increase paid investment at the top of the funnel to get there, and a lot of that then gets into what our next-step conversations with folks are. We have some clients that are pretty early stage, and what we'll often be doing in those cases is working much more closely with their paid team, because often what they're saying is: hey, we're not at a point where we're able to scale paid today, because our conversion rate is not high enough. So then what we're ultimately doing is asking, okay, what numbers do you need us to get to so that your paid program can scale? That will allow us to start slower with testing to get those initial wins, then allow them to scale that paid program, and then we can obviously be testing at scale with them at the same time.
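Drew's "would a 5 to 10% lift be substantive for your business?" check is simple arithmetic; here is a minimal sketch with made-up numbers (the store size, conversion rate, and AOV are illustrative assumptions).

```python
def extra_monthly_revenue(monthly_sessions: int, conversion_rate: float,
                          avg_order_value: float, relative_lift: float) -> float:
    """Additional monthly revenue if a swim lane's conversion rate rises by `relative_lift`."""
    baseline_revenue = monthly_sessions * conversion_rate * avg_order_value
    return baseline_revenue * relative_lift

# Hypothetical store: 200k sessions/month, 2% conversion, $80 AOV, 5% relative lift.
extra = extra_monthly_revenue(200_000, 0.02, 80.0, 0.05)
print(f"~${extra:,.0f} extra per month")  # ~$16,000/month: decide if that justifies the program
```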
Darius 8:33 Okay, so would it be accurate to say that that initial conversation and assessment is a combination of traffic plus, like, the LTV of a typical customer?
Drew Seman 8:49 Yeah, absolutely. Traffic, LTV, and then, you know, the conversion rate. Ultimately, those are the three things that we're looking at the most.
Darius 8:58 Right, yeah. I mean, if you convert only five customers a day, and each one is worth a million dollars, it's different, right?
Drew Seman 9:06 Right. And we have B2B SaaS clients where that's effectively how they're operating. Exactly.
Darius 9:13 Right. Okay. So let's talk about the swim lanes. What are some of the typical swim lanes that you see people have? I mean, is it like the checkout flow, purchase path, or what are those?
Drew Seman 9:27 Yeah, the most straightforward ones tend to be each step in a checkout funnel: the cart (and, you know, obviously cart can mean a lot of different things, whether there's actually a cart page or just some sort of a flyout cart), the PDP page, the category pages, and the homepage tend to be the places that, for an ecom client, we're looking at the most. You know, we tend to avoid things like the About Us type pages or the blog pages; in some cases a blog page may be very high traffic, but for most clients we don't find that there are enough conversions there. Really, we're trying to look at that funnel, whatever it is. That funnel may, for some clients, be very organic driven and start at the homepage; for others, it may be very paid driven and start at either a category page or an individual product page. But really, it's looking at that funnel and what we can do at each step concurrently to maximize revenue for them.
Darius 10:24 Okay, what about optimizing search? Is that something you get into?
Drew Seman 10:31 Yeah, so we are not an SEO agency, but very frequently we're partnering with clients for whom SEO is a priority. What's often happening in those cases is clients will make a lot of changes to their pages to improve SEO, and really our role there is to see whether the changes they want to make for SEO reasons are likely to positively or negatively impact the site experience, and if so, whether it's worth it. You know, there could be cases where somebody makes massive changes for SEO reasons that double traffic to the page but cut conversion rate by 75%, and then suddenly it's not worth that change. So what we're trying to do is help people make more informed decisions. Ideally, in those cases, what we'll often see is: don't worry, this is a flat test, this is not negatively impacting user experience, you should move forward with these SEO changes. Or in many cases we'll actually see that the SEO-focused changes are increasing conversion rate, which is even better.
Darius 11:34 Okay, yeah. So I guess I should have qualified: what I meant is on-site search. But you gave good context as far as off-site search as well. That's fine.
Drew Seman 11:45 Yeah, so on-site search, we do some of it sometimes. You know, I think the difficulty with on-site search can be that we try to be very, very funnel and conversion driven, and sometimes with site search it can be a little more difficult to see exactly what you're finding there, to know whether this is improving things or making things worse, because it's not always quite as directly tied to conversion, depending on the client. So it's a little bit less of what we do in practice, just because the learnings are often a little less black and white than some of the other swim lanes.
Darius 12:20 Right, right. And I don't think any of the tools get involved with the search aspect either, like the ones you mentioned, Optimizely and Convert.
Drew Seman 12:29 Exactly, exactly. So, you know, you could theoretically do something where you had two search tools present on your site at the same time, or something like that, and run one experience against the other, but it starts to get clunky pretty quickly in those cases.
Darius 12:44 Yeah. So I've worked with Convert before; we actually hired an agency that used Convert, and it was pretty good. I haven't worked with Optimizely. Is the difference pretty much the size of the client? Is Optimizely more, like, enterprise these days?
Drew Seman 13:00 Yeah, I would say Optimizely tends to be more mid-market or enterprise. Certainly, we also have some enterprise clients that are using, like, an Adobe Target type platform. But typically, if we were to simplify it, our larger clients tend to be on Optimizely and smaller clients tend to be on a platform like Convert, in practice. But we can certainly get results and impact with almost any platform that a client is using. We have a pretty robust engineering team on our side, and with platforms like Optimizely and Convert that they've got a ton of reps on, that's allowed us to really push those platforms to their max, in terms of our ability to do stuff that moves far beyond, like, a button color test. One of my favorite examples with a client: for a streaming service client, they wanted to test, right after people first signed up for the service, how to get people to add things to their queue. And entirely via their testing platform, we were able to build out an experience (we call it the Tinder test) where you could swipe left or swipe right to add things to your movie queue, and it would actually, actively add them to the queue. Our actual KPI for the test, though, was: is this actually increasing people getting past the seven-day trial point? So being able to build all of that experience, the down-funnel metrics, the actually adding it to your queue, all of that within a testing platform, is much, much lower lift for a client engineering team than having to build something like that hard-coded on their site. That would be an entire sprint cycle for the client, just to then know whether this is actually something that's going to be impactful for them or not. Right?
Darius 14:54 Yeah, I mean, just from what I know, even just Convert, that's definitely not something you want to build or even start trying to mimic yourself. So the other thing that I've experienced myself is our agency would come up with ideas on a regular basis. We would have, like, a weekly review, and we would go over the tests that are being run, and then they would come up with other ideas to optimize different things on different pages. Is that pretty much the same thing that you do?
Drew Seman 15:31 Yeah, so in practice, we typically are starting with the client, again, on building that culture of experimentation. If we look across the industry, tests have about a 20 to 30% win rate; our win rate for clients is about 35%. But ultimately, our goal as an agency isn't to focus on the win rate side of things; our goal is to be focused on what is creating the most impact. You know, no company has made millions of dollars because they had a high win rate; they made millions of dollars because they increased their revenue. So if I think about how we approach a client roadmap, we think about this idea of a fat tail distribution of outcomes. What that means in practice is that most experiments are flat or have no impact, but at the far end there is that fat tail of experiments: about 5% of tests equate to 91% of all revenue impact for our clients. The interesting thing that we find is that we're actually very bad at figuring out which tests are going to be that thing that really moves the needle, the 10%, 20%-plus winners for our clients. We're pretty good at coming up with the roadmap, but coming up with those specific ideas is very hard; that's not really something that anybody has a corner on as a market. So what we really focus on is getting those swim lanes as full as possible for our clients. At the start of our engagements, clients typically have low-hanging fruit; for that reason, we see about a 60% win rate across our first 10 tests or so, and that's really pulling from what has already won for other folks. But then once we settle in, we do typically about a monthly ideation session for our clients, and what we really try to drive home for them is: it's not about having the perfect idea, it is about ensuring the tests are getting live. We do not want our clients to be looking at a testing program and not testing something because they're waiting for a better idea, because people are not good at knowing what a better idea is. So we ultimately try to think about it not in terms of "could we have a better test?" but "are we ultimately wasting a day by not testing?"
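Drew's fat tail argument (most tests are flat, a small share of outsized winners carries most of the impact, and those winners can't be predicted in advance, so velocity matters more than idea selection) can be illustrated with a toy simulation. The outcome distribution below is invented purely to show the shape of the argument, not fitted to any Cro Metrics data.

```python
import random

def simulate_program(n_tests: int, seed: int = 42) -> float:
    """Cumulative lift from a toy testing program with a fat-tailed outcome mix."""
    rng = random.Random(seed)
    total = 1.0
    for _ in range(n_tests):
        r = rng.random()
        if r < 0.70:
            lift = 0.0                          # most tests: flat, no impact
        elif r < 0.95:
            lift = rng.uniform(0.005, 0.02)     # modest winners
        else:
            lift = rng.uniform(0.10, 0.30)      # rare outsized winners carry most of the impact
        total *= 1 + lift
    return total - 1

# More velocity means more shots at the fat tail.
print(f"25 tests: {simulate_program(25):.1%} cumulative lift")
print(f"50 tests: {simulate_program(50):.1%} cumulative lift")
```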
Darius 17:52 Okay, so the cadence of testing is what I think you're referring to as a culture of experimentation. Let's talk a little bit more about that, the culture of experimentation. What is it about the culture, when you start, that needs adjustment?
Drew Seman 18:15 Yeah, I mean, I think a lot of it is that a culture of experimentation is, for a lot of organizations, a pretty big shift. And this goes to historic stuff that goes way before digital: with traditional media buying, etc., the opportunities to test in different ways were pretty limited. And if you're a brick-and-mortar store, your opportunities to test in store are relatively limited, although that's changing now too. So you didn't have the opportunity to have a culture that was experimentation based in the same way that you do today. For a lot of these people, it is a real sea change in terms of how they're thinking, especially somebody that might be more senior at an organization who may not have grown up in this culture. So really, it becomes less about having a corner on good ideas, and more about having the understanding that we need to get all these ideas out there to understand what's better. One of the best examples, I think, is how we think about a website redesign process. We have a lot of clients that are in some stage of a redesign. Everybody has been in a redesign process where stakeholder one thinks, you know, the navigation should be blue and stakeholder two thinks the navigation should be red, and it can take months of an organization's time to decide something like that. What we try to ingrain in people is the idea that that doesn't actually need to be a fight internally: everything that ends up being one of these organizational battles can instead be a test where we measure the impact. Another great example: one of our clients had this huge internal debate about whether they could succeed, instead of being a one-time purchase for their ecom product, as a subscription service, and what that impact would be. You know, tons of modeling, tons of presentations to try to convince people. But one painted door test that actually showed them the revenue potential, that is what actually convinced people to move forward with the subscription-based model. Because they could actually see: okay, this isn't somebody just trying to convince me of something, these are real dollars that we're leaving on the table. And that's not something that people have been able to do as cheaply and easily as they can right now. So ultimately, when you start shifting this mindset, it changes the way you think about all of your different channels, all of your different business decision making. That can reverberate across the organization in really different ways.
Darius 20:49 What do you find are the biggest missed opportunities in ecommerce that you see when you start working with customers? And I mean, like, maybe this goes to the swim lanes?
Drew Seman 21:05 Yeah, so I think what's interesting is we see different things in different places. I think the biggest thing we see is that, yes, with different types of couponing and such, you can certainly drive impact at the top of the funnel, but sometimes that can be short lived, because it might be a short-term offer, a Black Friday deal, whatever it may be. What we try to think about, especially at the top of the funnel, is what is going to be longitudinally impactful for the organization. So, for example, for some of our ecommerce clients we tested, instead of a coupon code at the top of the site that says "enter code XYZ," highlighting that the discount has already been auto-applied to their cart. We had an ecom client last year that saw a 7% lift in revenue by having the offer auto-apply. So really, it's trying to figure out what those longitudinal differences are; how you're approaching offers at the top of the funnel is a very critical thing. The other half of those offers is whether it should be a one-time or a subscription purchase, in terms of how you're framing it, and especially thinking about that on an audience level. One of the most interesting, counterintuitive things we've seen for some clients is this: there's a desire to get people in the door with a one-time purchase, and then convert them on, say, a second purchase to something that might be more of a subscription. And what's really interesting is we've actually had some clients see the inverse, where they're much better off getting that first-time purchaser onto a subscription upfront, because the chance of getting that person to come back a second time may be so low, depending on the product. So we've had clients effectively redo their entire paid model, because now they're getting such a higher upfront value from these customers that they're able to increase their cost per acquisition and ultimately drive more revenue. Down funnel, I think we start to see some differences. There certainly is UX stuff; at the end of the day, so much of it is about limiting the number of steps that people take. You know, if we can skip the cart step, getting them right into the checkout instead of the additional step, that's hugely valuable. But the other thing we think a lot about is, what are those ultimate barriers, the things making folks decide they're not sure they want to make the decision today. And that's where things like pop-ups as people are exiting the cart, especially later in the experience, have seen a lot of success, highlighting: one, that they have a promotion today; two, that there is free shipping today; three, that there's a free return policy. Getting those sorts of things in front of people, so that the conversion happens today, is extremely valuable as people get later in the funnel.
Darius 24:06 All right. What about platforms? You know, of course, every customer is probably going to have a different ecommerce platform. So you pretty much work with all of the common platforms, like Shopify Plus, or?
Drew Seman 24:25 Yeah, we are ultimately agnostic. We have a ton of clients on Shopify Plus, but we've got a lot of clients elsewhere also, and some people have completely custom builds. One of the exciting things when we have somebody on something like Shopify is that we have tests that we know have already worked for clients on Shopify, and our engineers have obviously already built them on Shopify. That allows us to build these experiments much faster for clients than they otherwise could be built, which allows us to deploy great impact for clients much, much faster.
Darius 25:06 And then the other question that I have is the impact of content. Do you also work with content on the page?
Drew Seman 25:17 Yeah, we do. And I think what's really interesting on the content side is that this is one of our best opportunities to also connect with a client's paid and email teams, to understand what type of messaging and what type of products are working for folks. On the content side, one of the things we talk a lot about is the idea that clear beats clever. We'll have a lot of clients that may come in with more brand-oriented messaging on the homepage, and we will do a lot of testing to show that people actually need more product clarity in that hero space. Creating product clarity can go a couple different ways: it could just be explaining the product, but it can also mean doing a better job of showing the product. For example, we've had several cases of clients that might have more of a lifestyle image showing several of their products, or just one of their products, in the hero space. One of our clients is a company that makes printers and fax machines and all sorts of stuff. By taking their homepage and, instead of focusing on one product image showing a printer, taking that hero space and showing all of their products side by side, we're able to drive home much more clearly what their product actually is, and that this is actually something for you. One of the limiting things can be that if you're only showing a portion of your product line above the fold, you're not helping enough people understand that this is a product for them; it may look like it's only a product for somebody else.
Darius 27:01 Yeah, so I think that's one of the key challenges right now, optimizing the content, both in terms of text and also visuals. And there's, you know, a lot of art in there, more than science?
Drew Seman 27:19 Yeah, well, I think that's where we get to tie art and science together, though. Because, of course, the thing is to run it as a test to see if it's going to do better or worse; what customers tell you in a focus group may be very different from what happens in reality. We are very, very frequently finding that what user research told people is different from what actual customers are doing on the site. So what we would do in those cases is take the copy, one version that might come from the copywriting team, one version that might come from a marketing manager, one version that might come from an art director, and just run them all side by side. Then hopefully we can take everybody's art and see whose art is actually doing a better job of converting customers. Right.
Darius 28:15 Yeah, I mean, it's very detailed work. And what's interesting is that, like you said, you never know what's actually going to work before you try it.
Drew Seman 28:31 Yeah. And that's so much of what we try to tell people: the hours to build a small text-change experiment are, let's say, three engineering hours plus QA hours, maybe a little more than that depending on the location. To instead bring the whole team together to discuss which of these two we should move forward with, without the experiment, is going to take way more than that number of hours of internal debate. And then once you're done with that internal debate, there's a 50% chance that you're wrong. So by starting with that experimentation-first approach, you're hopefully actually saving your company a lot of time and energy.
Darius 29:12 Yeah, and that's interesting. Again, going back to the culture of experimentation and the mindset: at a lot of companies, people get into these review meetings and just start shooting out ideas, without really thinking about what it takes, what the metrics are, and what they really are. So actually, in our startup studio, we built a product that's designed to help teams think about ideas as experiments. When you change that language and start to ask maybe three or four common questions for every specific recommendation, then I think the ideas become a lot higher quality and lower quantity. I mean, again, you want people to give ideas, but you don't want a million ideas that are just half baked.
Drew Seman 30:09 Yeah, well, I think the interesting thing it can also do is potentially bring other people into the conversation. If it's approached as "this is what we're going to do," then the highest paid person's opinion starts to come in more. But if it's mentally approached as an experiment, that might make somebody else in the organization more comfortable sharing an idea that they may have been a little afraid to share. And to your point about half-baked ideas and what the goal is: that's why, when we're doing ideation sessions with folks, we try to center the conversation on a where to test. So if we know that this page, or this section of a page, or whatever it may be, is the core place to test, that's what we want to center the ideation session on. If you're being completely free-wheeling, that's where you get "I want to change the footer on a blog post, and I think that would be a good idea." But if you start with the premise of "in this cart experience, on this page, we need to do something that is going to increase the number of people getting from step one to step two," at that point, hopefully people have enough guardrails that you're going to get higher quality ideas.
Darius 31:27 Yeah, I like that a lot. So, I think this might be related to what we were just talking about, but is there such a thing as, like, too much experimentation?
Drew Seman 31:41 I think, at the end of the day, the number of organizations that would be at the point of too much experimentation is probably very few right now. I think the thing that we actually guard against ends up being waiting for too much statistical significance. That's where we see most of the paralysis happen. You know, you've got a test that is effectively a flat outcome, plus or minus 1% in terms of how it's going to perform, and it's at 80% statistical significance, and people want to wait: no, I want it to be at 95% statistical significance so I know how well this did. But what our fat tail distribution shows us is that we actually don't want to wait that long. If it's plus or minus 1%, then probably all it's showing you is that it's not that impactful either way, most likely. So what we could actually be doing is making the call on the experiment at 80%, or whatever it is today, and moving on to the next idea. That's how we're ultimately hopefully unlocking the most velocity for our clients, without letting this desire to know exactly what the best decision is, down to 0.1%, get in the way. Because that's how you limit your growth: by not spending more of your energy figuring out the things that are going to be 10% to 20%-plus growth.
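As a rough sketch of the kind of call Drew describes, here is a minimal two-proportion significance check in Python: if the estimated lift is tiny either way at around 80% confidence, call the test flat and move on rather than waiting for 95%. The thresholds and numbers are illustrative, not a prescribed stopping rule.

```python
from math import erf, sqrt

def lift_and_confidence(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Relative lift of B over A and two-sided confidence that the difference is real."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    confidence = erf(abs(z) / sqrt(2))  # P(|Z| < |z|) for a standard normal
    return (p_b - p_a) / p_a, confidence

lift, conf = lift_and_confidence(980, 50_000, 1_000, 50_000)
# A tiny lift at modest confidence is a "flat" result: make the call and test the next idea.
print(f"lift {lift:+.1%}, confidence {conf:.0%}")
```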
Darius 33:03 Yeah, I mean, that level of certainty is actually counter to the culture of experimentation. You're looking for, you know, exact certainty. Yeah.
Drew Seman 33:15 Yeah, exactly. And I think that's a hard thing, because people say: well, what do you mean, we're not waiting for 99%? Does that mean we're not rigorous? I think that's a question people can start to get in their head: is it not rigorous if we're not going that high? But, you know, this is not a pharmaceutical trial; this is increasing revenue for an ecommerce company, and our ability to take more risks hopefully lets us drive more impact. There's a much greater opportunity cost to not just moving forward, versus all the sunk costs that might go into many other things. Because, again, we're talking about a few engineering hours to spin up the next experiment.
Darius 33:58 Yeah. Well, I mean, personally, I like to start shifting traffic to the winning variant when I see something even approaching 60%, especially if there is a clear revenue impact. So why wait? I mean, those few days could be wasted, a lost opportunity, like you said.
Drew Seman 34:28 Exactly. And that's something that we do even more with our clients during the holidays, or if they have some other peak cycle given the nature of their business; in those cases we're consistently starting to move the percentage of the audience seeing each experience. I think the other thing that can come up during those peak moments is the idea that this is a unique period. What customers are doing on Black Friday is likely different from what they might be doing on, you know, June 17. So in those modes we're not necessarily looking for something as longitudinal, asking whether this will be statistically significant in terms of impact throughout a year; it's more about figuring out, is this going to maximize revenue today?
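One simple way to think about shifting traffic toward a leading variant mid-test, as Darius and Drew describe doing during peak periods, is to scale the allocation with the confidence you have so far. This is a hypothetical heuristic for illustration, not how Cro Metrics or any particular platform reallocates traffic.

```python
def winner_allocation(observed_lift: float, confidence: float,
                      base_split: float = 0.5, max_shift: float = 0.4) -> float:
    """Share of traffic to send to the apparently winning variant mid-test.

    Shifts more traffic toward the leader as confidence grows, while capping
    the shift so the other arm keeps collecting some data.
    """
    if observed_lift <= 0:
        return base_split
    return min(base_split + max_shift * confidence, 0.9)

# E.g. a +4% observed lift at 70% confidence during a holiday peak:
print(f"send {winner_allocation(0.04, 0.70):.0%} of traffic to the leading variant")
```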
Darius 35:12 So I would probably make a guess, and you can tell me if I'm wrong, but your clients are pretty much all familiar with the concept. So you're probably not running into too many people where it's, like, their first time hiring a CRO agency. Is that correct?
Drew Seman 35:33 I would say so, in terms of our clients, in terms of the main point of contact and maybe their manager, who may be a CMO or somebody like that. The CMOs are typically all familiar with this type of work at this point and are comfortable with it. Ultimately, where the unknowns come in is where this might be a little bit new, like if we're talking to a CFO or somebody in a different part of the organization who just hasn't had this level of exposure. And that's where I think the clarity around the range of outcomes is a little bit different. If you're talking to a paid agency, they're able to have a pretty clear graph: this is how much you spend, this is how much revenue we create for you. Because experimentation has the opportunity to be a 1% impact on results, or a 20% impact on results, in a given month, the certainty of outcomes is a little bit different. And so it's a little bit of explaining to folks how this works, and that week to week there might be a week where we drive no impact, but that just means we happened to not have a winner this week, and we'll have a winner next week. So it's a little bit of a mindset change sometimes, for folks that are outside of that core testing space. And really what we see is that by the six-month mark with clients, those barriers start to go away. Folks have had a couple of cycles; they've seen that, okay, sure, we didn't get a win in month one, because math suggests that not every test is going to be a winner, but I see, if we blend this six-month period, how much impact this testing program is driving for us.
Darius 37:15 What's, like, a really good conversion number, where you say, you know what, we don't really think we need to optimize this part or this experience anymore?
Drew Seman 37:29 Honestly, we're yet to really experience that. I guess if something is already at 100%, we might not worry about it, but at that point we would probably say, hey, maybe we need to do something with the pricing side of things, because if you've got a 100% conversion rate, maybe this thing's a little too cheap. So we really don't have that, because even for those high conversion points, you can still see how a 5% increase can be really impactful. So ultimately, the only case would be if there was something about the math of the value of that conversion and the conversion rate that suggested that, if we saw a 5 to 10% lift, it would not drive more money for the business; then we might avoid it. But typically that's not places with a high conversion rate; that tends to be places at, you know, a low conversion point.
Darius 38:19 Okay. I don't know if you can share some industry-wide, generic numbers. I mean, I'm sure every site is different, and every product category might be different. But what are some of the, like, holistic conversion rates? Say this is a shopper coming to the homepage, or to a landing page, and then making a purchase, so that's the overall conversion. Where we can say, for example, this is really bad, this is average, this is really good. Is, like, 1% bad? Or...
Drew Seman 39:01 Yeah, so we typically try to avoid those types of benchmarks, for the reason that every company is so different, and we don't want to give people the wrong idea that their conversion rate is either good for them or bad for them, because we haven't tested anything yet to know how much better it could be than it is today. So the idea of "is this good or bad" tends to just not be something we have that much in our language, because every company is so different, and the opportunity with each client ends up being so different.
Darius 39:39 So I guess I'm looking at it from, let's say, a founder or, you know, a CMO point of view, thinking that my conversion should be at a certain level, because then I can estimate and budget for other things. So if my conversion is 2%, for example, just so I can build some models, right? If my conversion is 2%, based on all the other factors that I need to take care of, can I actually build a business?
Drew Seman 40:20 Right. And what we would typically say is: if you viewed our testing program as being able to create a 5 to 10% full-funnel lift for you, is that actually going to be an impactful thing for you to invest in, in conversion rate optimization? And that really can go step by step, and ultimately be a multiplier. We often get questions from clients: are these tests actually all cannibalizing each other? We have a lot of data showing that cannibalization tends to be relatively minimal in most cases. But, you know, if you increase by 10% the number of people getting from your homepage to your PDP page, if you increase by 10% the number of people going from the PDP page to actually getting into the cart, if you increase the cart-to-checkout rate by 10%, and if you increase completed checkouts by 10%, if you're increasing each of those by that type of range, is this going to be an impactful thing for the business? Then yes, you should be investing in conversion rate optimization, because you're likely leaving a ton of money on the table. What we see in practice is that we are typically our clients' highest return on investment across their various marketing channels, as long as they've got the scale for that type of impact to be possible.
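The full-funnel multiplier Drew describes compounds rather than adds; a quick sketch (the step names and the 10% figures are just the hypothetical ones from his example):

```python
from functools import reduce

# Hypothetical relative lifts at each funnel step, per Drew's 10%-at-each-step example.
step_lifts = {
    "homepage -> PDP": 0.10,
    "PDP -> cart": 0.10,
    "cart -> checkout": 0.10,
    "checkout -> purchase": 0.10,
}

overall = reduce(lambda acc, lift: acc * (1 + lift), step_lifts.values(), 1.0) - 1
print(f"Overall funnel lift: {overall:.1%}")  # ~46.4%, not 40%: the steps compound
```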
Darius 41:43 Okay, now let's talk a little bit about your own work. What does the VP of Growth at, I guess, a CRO agency do?
Drew Seman 41:55 Yeah. So I lead a team of 30 growth strategists, program managers, and directors that are working across all of our clients. My day to day, I would say, is half setting up the team for broad success, in terms of our tools and how we're actually working internally, and half helping folks think across the various clients. You know, we see a lot of learnings across different sectors. For example, we've got clients that are at their core subscription service clients, like Home Chef, the meal kit company, or Starz, the streaming service. And as more ecom clients are embracing subscription models, a lot of those subscription service learnings, about how to message unsubscribe pages and things like that, we want to make sure carry over. So a lot of my role ends up being ensuring that we're taking the learnings from one client and applying them to the others. The other side is making sure we're doing the things that set up the organization for growth; I work a lot with our internal team. One of the most exciting things that we've done with our VP of Operations is, for all of our clients, we actually have an entirely proprietary dashboard system that we've created, which lets clients see the results of every test they've run, through approvals and the like. Because, you know, if you've been in Convert, I'm sure you know, if you imagine sending that to the CEO of most companies, you couldn't send them a Convert login and expect them to understand what you're doing. But we've created a dashboard that allows everybody to see every single test that we've run for them, what tests have won, and screenshots of the tests that have won, so that we're hopefully allowing people to do a better job dispersing this information throughout the organization and ensuring there's no learning loss as different people come on and off the client side of things. So, again, that's all stuff that helps support creating that culture of experimentation across all of our clients.
Darius 44:05 Okay, so that's good, because I was wondering if you also practice internally, within your own team, what you do for clients?
Drew Seman 44:17 Yeah, so we do. We're very big on this idea of creating different experiments internally. You know, we will run experiments internally for a short time around, say, some miscellaneous internal process in the organization. And what's really exciting about that is that if we roll out something that might be more of an HR-type change, people don't view it as: all right, we're setting this and forgetting it, and this is how this organization is going to work till the end of time. We actually check in with people on a regular basis, see what's working, see what isn't, and then potentially make a lot of organizational changes, so that we're constantly improving our culture. And because we have an entire team of people that want to be testing with clients, that's the type of people we attract as employees too, so it's more aligned with their expectations. If somebody wanted a client culture that was set it and forget it, they wouldn't be testing; if they were somebody that wanted an internal organizational culture that was more set it and forget it, they wouldn't want to work here. So our organizational culture and the culture we're bringing to our clients hopefully feel very similar.
Darius 45:40 Okay, that sounds great. Can you share a couple of, like, really interesting tools or resources that people who want to get better at CRO can check out?
Drew Seman 45:57 Yeah, I mean, certainly. What I would say first is, we do have a blog at crometrics.com. We have a couple of series there that I think are very interesting. One we do each week is picking a random company and, based on learnings that we've had for other clients, suggesting something they should be testing on their site; we're very heavy into ecom clients there, so it's very useful. The other thing that we post on our site every other week is a test showcase, where people bring in different tests that they've run for clients and present them internally to the team, and we ask people on the team which variant won, and people vote on which one they think won. What's really interesting to see is how terrible people are at predicting which tests are actually winners, and by how much. There will be a test where everybody thinks it lost, and it turns out it was a 20%-plus lift in revenue for our client. I think that stuff ultimately helps people see why this culture of experimentation matters, much more so than some of the specific test ideas, which are certainly also important. Otherwise, we've talked about Optimizely; there's certainly the Optimizely blog, and they have tons of resources and trainings on their site that are also hugely valuable.
Darius 47:23 All right. Well, I guess I probably know the answer to this; everybody is hiring these days. So, are you hiring, and what are you looking for?
Drew Seman 47:37 We are, we are hiring. We're actually growing really quickly. If I look at how we were doing a few months ago versus today, and this is in our job postings, so I'm not sharing anything internal, we're up about 50% from where we were last fall. So we're growing very, very quickly, and that's really, again, to what we were saying earlier, the degree to which people are embracing a culture of experimentation. Right now we are hiring growth strategists, meaning a growth strategist, a senior growth strategist, and a director of growth strategy, who are ultimately leading that client experimentation from the strategic side. We're also hiring program managers, which some people might think of as more of a product manager role, where they're in charge of all the logistics: working with our engineering team, the client's engineering team, getting stuff through the approval process. And then we're also hiring for our engineering and QA team. So, yes, we're effectively hiring for all roles right now.
Darius 48:39 Are you going back to the office? Or are these roles remote?
Drew Seman 48:45 We are not. Yeah, so Cro Metrics, we are an all-remote company. Everybody here, today at least, is US based, from Maine to Hawaii. Culturally speaking, we do some interesting stuff: when it's possible (not right now, obviously), we have one to two all-hands that are in person each year, where we all get together as a group, more as a team bonding activity. One of them is typically domestic within the US; for the other, we always go as a group to the same island in Mexico off the coast of Cancún. We've actually put a lot of energy into building a really strong remote culture, with high levels of appreciation and a lot of different groups that folks can be in based on interests, from Taco Bell to trivia, different interest groups that find time to connect with each other. But we still find that getting that in-person time to actually connect once or twice a year is hugely valuable for us. At the end of the day, we've been remote since day one; nobody in our company is co-located at all, and certainly over these past 18 months it's felt like a huge value add for the organization, and we think, moving forward, it's only going to be more so. One of the great advantages we have is that, because we've been doing this for so long, we're not experiencing the hiccups of what it means to have some local co-located staff, some folks that are remote, all these various hybrid models that folks are throwing out. Our HR folks and operations folks haven't had to waste a day of energy figuring out all this stuff that has consumed so many organizations for so long.
Darius 50:33 Yeah, I think that's definitely a superpower, to be able to run a successful company remotely. I mean, you're probably winning more, better quality candidates. That's great to hear.
Drew Seman 50:50 I think what's been the most interesting is the quality of candidates. We do have an employee referral program, and, you know, everybody knows really good people, but oftentimes those people may not be right where you are today. If you can take your best people and find the people that they know who are the best, that's often a recipe for success. And because we're a 50-person company, if we were only narrowing down to one city that we were in, our ability to do that would be pretty minimal. It's ultimately allowed us to make much stronger hires, because we have all of that flexibility. And what we're seeing today is also the added layer of more applicants than ever before, because there are people whose situation has changed, or whose company is changing its policies, and they don't want to go back to the office after how their experience has been for the past 18 months. So we're seeing more and more applicants from traditional in-office companies trying to make the jump.
Darius 51:56 Yeah, I just read something, I think it was in the Wall Street Journal or somewhere, that a lot of people are just quitting instead of going back to the office. So, yeah.
Drew Seman 52:08 Yeah. Well, you know, that's been interesting for us. We just had somebody join the team from the higher education side of things, in an administration position, and those were positions that were kind of some of the earlier ones, if you will, to be going back to in person. You know, I'm in the Bay Area, and here we're still so far away from being full time in person. It's going to be really interesting to see what happens. There's this idea that it's really peaking now, but it seems like this peak is probably going to last six to 12 months, because we're nowhere close to the point where all these Bay Area companies are asking close to everybody to be back in the office. And even if they have said they have some sort of a flexible arrangement, people haven't had a chance to see what that actually feels like for them in practice, and then to realize: okay, yes, I'm allowed to be remote, but I feel like a second-class citizen, because I'm stuck in this other location while all these other people are in the office, getting to meet with each other on the side and everything else.
Darius 53:14 Yeah, it's really interesting how the new live video and live streaming technologies are going to help people in a company connect better on a personal level when they are remote. And I think this pandemic really gave a kickstart to a lot of these new technologies. We're not done yet; I think we're all just experimenting right now. But there's definitely a lot that we can do with technology to connect better. So it's good to see companies like Cro Metrics that are actually doing this, so we can all learn from them.
Drew Seman 53:58 Yeah, I mean, even down to the fact that we typically would have gotten together for one of our offsites last November, and instead Cro Metrics bought everybody an Oculus, and we had a couple days of playing Oculus games with each other. So, you know, just finding different, interesting ways to connect. The thing that's going to be interesting to see is which of these end up being gimmicks and which end up being genuinely impactful for companies for years to come.
Darius 54:28 And that's where the culture of experimentation comes back in.
Drew Seman 54:34 To bring it back, at the end of the day, I agree: that's where a lot of that experimentation is going to come in. It's going to be interesting to see, especially for companies that have been around for 100-plus years. It's certainly something where we've seen impact from a testing culture in marketing programs, but now you're talking about HR departments and other departments that maybe have less experience with a culture of experimentation. Maybe that's the next step for us to get into, in terms of the culture of experimentation.
Darius 55:07 Yeah. All right, one more question before I let you go: what are your thoughts about hiring consultants at Cro Metrics? Do you use a lot of consultants, or is everybody just employees?
Drew Seman 55:21 Yeah,