Everybody wants to propel their business to gargantuan levels of success, but instilling a strong culture of growth that motivates each person in the company requires very deliberate action and a comprehensive strategy. In this session, Sean Ellis, Founder and CEO of GrowthHackers, shares his advice for cultivating growth (hint: you need to do A LOT of testing). For Sean, this means writing down ideas and collecting them in a backlog, then testing them within the ICE framework, which lets you rate an idea based on its impact, the confidence you have in it (how much data is there to support it?), and the ease with which the test can be run. Sean also covers how to create a dedicated growth team and what characteristics the “growth master” should possess to ensure growth remains a top priority without interfering with other projects.

Check out the full transcript below! You can see Sean’s slides here.

If you want to see more sessions from 2016, we’re releasing a new one each week. Subscribe here to be notified. And be sure to grab your tickets to the 2017 Annual NOW.


TRANSCRIPT

Alison Wagonfeld:  We’re going to go right into the next session about building a company-wide growth culture. I’m Alison Wagonfeld, Operating Partner at Emergence Capital, a venture firm focused on enterprise cloud companies. Now I’m pleased to welcome Sean Ellis, founder and CEO of GrowthHackers.

He will share with you some background about himself. He’s had great roles in companies that you know well like Dropbox, and now starting his own company to share a lot of the learnings that he’s had over the years. He’s going to take you through some slides.

Then I’m going to ask him some questions. Then we’ll open it up again for the last five to seven minutes of questions from all of you. With that, I’m going to turn it over.

Sean Ellis:  Thanks, Alison.

Hi, everyone. Yeah, I’m going to give a quick introduction to building a company-wide growth culture. Then Alison and I will go into some of the challenges of doing that.

Everyone here who isn’t talking is looking to build a growth engine that gives them explosive growth. Of course, that’s something that we all want. It’s actually gotten a lot harder in recent years than people realize.

One of the big reasons that it’s gotten harder is that 10 years ago, when I was at LogMeIn, running the marketing team, we were able to spend millions of dollars a month driving growth, but there was about a third or less of the money chasing each person online compared to how it is today.

We could invest money in growth, get a fast return on investment, keep cycling that money, build a team around primarily external customer acquisition. It worked pretty well.

Over this period of time, you’ve got a lot more dollars pouring in online. At the same time, you’ve got this massive fluctuation in channels. The things that work today are probably not going to work in another month or two.

For GrowthHackers.com, for example, one of our big growth channels is Twitter. This weekend, there was an announcement that they’re going to start to introduce an algorithm to Twitter which is going to change that channel.

There’s probably going to be new opportunities that emerge from that, but there’s going to be new challenges, as well. Ultimately, growth has become a lot more challenging.

To be effective with it, you’ve got to start to bring your entire company into the growth process. Instead of just looking externally at driving growth, you want to look across all of the key levers of growth.

Dave McClure has what he calls pirate metrics, an AARRR framework that goes from acquisition all the way down through activation, retention, revenue, and referral. Improvement in any one of those areas is going to accelerate your growth trajectory. The challenge is that each of those areas generally falls within a different group within the company.
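To make the framework concrete, here’s a toy sketch of an AARRR funnel snapshot; all of the numbers are invented for illustration, not from the talk.

```python
# Toy AARRR funnel snapshot -- every number here is made up.
funnel = {
    "acquisition": 10_000,  # visitors who arrived
    "activation": 2_500,    # reached the "aha" moment
    "retention": 1_200,     # still active weeks later
    "revenue": 300,         # converted to paying
    "referral": 90,         # invited someone else
}

stages = list(funnel)
for prev, cur in zip(stages, stages[1:]):
    print(f"{prev} -> {cur}: {funnel[cur] / funnel[prev]:.0%}")
# Lifting any single stage's conversion rate raises every count below it,
# which is why each team that owns a lever can accelerate overall growth.
```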

A lot of those are going to be product-driven levers. They’re going to sit in the product team. Customer support’s going to have a lot of impact on activation. Sales, obviously, affects growth. Business development does. The challenge is coordinating those efforts across all of those teams. That’s where a growth team comes in, one that really helps to coordinate those efforts.

There are really three key things that you need to be successful in driving growth across a full team. First, you need broad participation, which is not easy. I’ll go into a couple of things to help with that.

Then you need a weekly cadence or maybe every other week cadence, but you need to have a pretty good cadence of testing. Then finally, you need transparent learning that you can communicate out to the rest of the company as you figure out what works and what doesn’t work. Let’s look at broad participation first.

You’re not just driving participation within one team. As I said, all of those levers are controlled by different people within the company. You want ideas from people who have unique insights, whether it’s the support team or the engineers. I’ll show you an example in a little bit of a test idea that came from one of our engineers.

Basically, all your internal people, but then, externally, there’s going to be…you may not have a person on the team who really understands how to leverage Pinterest, but Pinterest might be a great growth channel for you. You want to be able to work with outsiders and bring them into the process so that as you explore some of these channels, you can at least use an expert to explore them.

If it works, then you can figure out, “Do I hire somebody? Do I continue to work with outsiders,” but you constantly want to get as much broad participation as you can. Your goal is to build a big backlog of growth ideas.

My team, actually, we just crossed 700 ideas in our backlog, untested ideas. I’ll talk about some of the benefits of having a big backlog in a minute. Ultimately, what you want to do is take each of those ideas rather than just have something that’s written on a scrap of paper.

You actually want to turn it into an experiment doc that includes things like your hypothesis. Why do you think that idea is going to work? Do you have supporting data that says that hypothesis is likely to come true, or is it a total guess? Sometimes a total guess actually is a good thing and works out well.

Target lever, where is it focused? Is it a specific channel? Is it deeper within the product?

Then you want to be able to score each idea. We use something that we call an ICE Score. Some people use a PIE score. Ultimately for us, the ICE score looks at an idea across a few different dimensions. Impact: if this thing works, is it going to be really high impact, or is it something that’s just going to move the needle a little bit?

Confidence: do I have a lot of data supporting this, and does it give me a high level of confidence? Ease: is this something that’s relatively easy to test, or is it going to take weeks of engineering just to figure out if it works? The ideal experiment is going to be something that scores really well across all of those.

Scoring is on a scale of 1 to 10, 10 being something that’s really good. If it’s 10 impact, 10 confidence, and 10 ease, that’s probably a really good test to run.
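As a minimal sketch of what an experiment doc plus ICE scoring might look like in code, assuming hypothetical field names and example ideas rather than GrowthHackers’ actual system:

```python
from dataclasses import dataclass

@dataclass
class ExperimentIdea:
    """One entry in the growth idea backlog (illustrative schema)."""
    name: str
    hypothesis: str     # why do we think this will work?
    target_lever: str   # e.g. "acquisition", "activation", "retention"
    impact: int         # 1-10: how much would this move the needle?
    confidence: int     # 1-10: how much data supports the hypothesis?
    ease: int           # 1-10: how cheap/fast is this to test?

    @property
    def ice(self) -> float:
        # A simple average of the three dimensions; teams also use
        # sums or weighted variants.
        return (self.impact + self.confidence + self.ease) / 3

backlog = [
    ExperimentIdea("Share link on hover", "A shortcut will drive referrals",
                   "referral", impact=8, confidence=6, ease=7),
    ExperimentIdea("Pinterest channel test", "Visual content may fit",
                   "acquisition", impact=6, confidence=3, ease=5),
]

# Nominate the highest-scoring ideas for this week's testing sprint.
for idea in sorted(backlog, key=lambda i: i.ice, reverse=True)[:3]:
    print(f"{idea.ice:.1f}  {idea.name}  ({idea.target_lever})")
```

As you start to think about the realm of ideas out there, you can find inspiration anywhere. When I was at Dropbox, I think we only had one thing.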

When you look at your computer, there’s a little Dropbox icon up at the top of the screen, if you have Dropbox, and most of you probably do. When you hover over that, originally it only gave you a shortcut to your recently saved files, but somebody had the good idea of adding a share link when you hovered over it.

There’s a whole onboarding optimization that was able to happen through that share link. Then over time, someone else said, “This is a great place to merchandise different ways to drive retention or additional monetization,” like testing a message to get people to start saving their photos to Dropbox.

That’s going to get them better lock-in, and they’re going to hit the limit faster and need to upgrade. Then there’s just the pure upgrade button. Over time they’ve added a lot of things. As you look across the Web, you’ll start to see other tests that people are running. LinkedIn has a ton of them, for example.

Just across these different sites, you’ll start to get good ideas, and some of them are going to be relevant for your business as well. Once you have a big backlog of ideas, then you want to look at running a weekly cadence of testing around those ideas. That’s where your growth team comes in. A growth team should start with a growth master.

A lot of times, if you don’t already have a growth team, you think, “OK, I’ve got to go out and recruit and hire a growth team.” Maybe you plan about six months to get them together, and then you’ll be able to hit the ground running and really start doing a lot of testing. But that’s going to stall a lot of what you could be doing starting right away.

At the very least, you should start with a growth master. That’s somebody sometimes called the product manager of growth or head of growth, but someone who’s coordinating the efforts. Then make the rest of the team ad hoc. People on the ad hoc team might include some designers, an engineer or two, and analysts.

Then, at my company, we also have a lot of the execs. The head of product shows up to our weekly growth meeting, the head of engineering shows up, I show up. The purpose of the team is to really manage the testing and learning that happens.

Here’s just a quick look at what our weekly growth agenda looks like. We start out with some of the high level. What are the growth metrics? Where do we see issues? Where do we see opportunities for growth? Review the last week’s testing sprint.

This is where the ad hoc piece comes in nicely that if you’re finding that you had a bunch of things you planned to launch, but you couldn’t get design resources, it’s probably a good argument to go and have a dedicated growth designer.

If you couldn’t get engineering resources, get a dedicated growth engineer, and then just continue to be responsive. A key part of the meeting is that for anything you’ve tested that has run long enough, you can now start to analyze the results. Capture that learning and talk about that learning. That learning is going to be really important for informing future tests.

This is also when you plan out the tests that you’re going to be doing the next week. On my team, we have everyone nominate a couple of ideas. They pitch the ideas in a one-minute pitch to the rest of the team, and then we shortlist down to the three to five ideas that we’re going to test that week. Then finally we check in and make sure that the backlog is continuing to grow.

The goal of the growth team is to test at a high tempo. You’re testing across each of the vectors that we’ve talked about in that AARRR framework, running those tests. The more testing you do, the more learning you’re going to get. It’s all about that learning, figuring out what’s going to work and what’s not going to work in the business.

A good example of that is Twitter. We all know that Twitter’s had some growth problems recently. That’s probably part of the reason why they’re open to the idea of adding an algorithm to it. It’s not actually the first time that Twitter had growth challenges.

In late 2010, they had an almost flat quarter. At that time, they were running less than one test per week. They brought in a new head of growth, who pushed to get that testing frequency up, and they got it to 10 tests per week. You can see they had years of consistent growth after getting it up to those 10 tests per week. It really does come down to testing. You obviously want to be smart about what you’re testing.

We talked about the ICE Score earlier. You can use that to start to prioritize, to figure out what are those things that score well across impact, confidence, and ease. Some of the areas where you see opportunities to grow, you can drill into those areas. As you’re nominating the test to run that week, try to pick some that are scoring well. Then manage the testing.

Really important: appoint a product manager or a project manager to each test. If it doesn’t get launched, you then have somebody who can describe why it didn’t get launched. It could be anyone from the growth team. Ideally, you want the person who nominated that test or came up with it in the first place, because they’ll be passionate to see it go live and really want to see it work.

Then you want to run the tests for enough time to actually see if they worked. You want to follow testing best practices; you can find a lot written about that online. Don’t run a test from a Monday to a Wednesday and compare it to something that ran from a Thursday to a Saturday. You want to run for full weeks. Get a sufficient sample size. There’s a lot written about that, so I won’t go into the details.
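As one hedged illustration of the sample-size point (this is a standard two-proportion power approximation, not something from the talk, and the numbers are invented), here’s roughly how many users each variant needs before a result means anything:

```python
import math

def required_sample_size(baseline: float, lift: float) -> int:
    """Approximate per-variant sample size to detect a relative lift
    over a baseline conversion rate (normal approximation, two-sided
    5% significance, 80% power)."""
    p1 = baseline
    p2 = baseline * (1 + lift)
    z_alpha, z_beta = 1.96, 0.84
    p_bar = (p1 + p2) / 2
    n = ((z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
          + z_beta * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
         / (p2 - p1) ** 2)
    return math.ceil(n)

# Detecting a 20% relative lift on a 5% signup rate:
print(required_sample_size(0.05, 0.20))  # ~8,100 users per variant
```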

Finally, analyze the results. Always ask why. If a test did not perform as expected, don’t just accept that. Figure out why. Did you have a clear hypothesis? If it didn’t deliver on that hypothesis, try to understand what happened. If you can understand what happened, you can generally much better inform the tests that you do after that.

Once you find stuff that works, you want to make sure that you actually start to act on that information. Get those as part of your growth engine. If it means hiring someone to manage that channel, if you can automate that channel, whatever it is, you just want to have it in the mix and continue to monitor how effectively it’s working. Sometimes you need to run it as a separate test again a few months later to see if it’s still working because again, those channels are in flux.

Then the last piece: transparent learning. You’ve run the analysis. You’ve run the testing. If you don’t share that learning with the rest of the team, you’re going to have a really hard time getting the rest of the team to submit ideas, because they don’t know what’s been tested and what hasn’t.

It’s not just about sharing the wins, but it’s also sharing the things that didn’t work. A lot of times, the things that didn’t work are what lead to the big breakthrough ideas that end up working really well, so you got to communicate that and make that available to everybody.

Here’s an example of a test that we ran on our team based on sharing some information externally. This came from data that showed us that when we had media embedded on the site, it led to longer sessions. It’s pretty intuitive. You could’ve figured that out.

Essentially, GrowthHackers often links off to resources that will help you grow. But there are a lot of great YouTube videos and SlideShares where there’s no reason to link off to SlideShare or YouTube. Those properties are designed for media that can be embedded, so that we can then recommend other related content, host discussion around it, and take it to the next level.

Our challenge was that we had to actually do that manually through our admin, setting that information up in there. We missed a lot of them. But when we presented it to one of the engineers, he said, “Oh, there’s this thing called oEmbed.”

“With oEmbed, we can automate that so as soon as our system detects it’s a YouTube URL or a SlideShare URL, it’ll automatically pull that media in.” That led to 41 percent more embeds, which led to longer sessions. Again, ideas can come from anywhere within the organization.
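A rough sketch of that kind of automation, assuming the providers’ public oEmbed endpoints; the URL patterns and error handling here are simplified illustrations, not GrowthHackers’ actual code:

```python
import json
import re
from urllib.parse import urlencode
from urllib.request import urlopen

# URL patterns mapped to the providers' documented oEmbed endpoints.
# The list is illustrative, not exhaustive.
OEMBED_PROVIDERS = [
    (re.compile(r"https?://(www\.)?(youtube\.com|youtu\.be)/"),
     "https://www.youtube.com/oembed"),
    (re.compile(r"https?://(www\.)?slideshare\.net/"),
     "https://www.slideshare.net/api/oembed/2"),
]

def embed_html(url: str) -> str | None:
    """Return embeddable HTML for a recognized media URL, else None."""
    for pattern, endpoint in OEMBED_PROVIDERS:
        if pattern.match(url):
            query = urlencode({"url": url, "format": "json"})
            with urlopen(f"{endpoint}?{query}") as resp:
                return json.load(resp).get("html")
    return None  # not a recognized media URL; link out as before
```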

This is the last point that I have, and then we’ll have some conversation. It’s really easy to get super process oriented or super numbers oriented and lose track of the purpose for what you’re doing as you try to build a growth culture.

Ultimately, what you’re trying to do is…the last session was all about product-market fit. Product-market fit is about creating value for people. Once you’ve created that value, you need to deliver that value. You’ve got to find the right people for that and get the right people experiencing the product in the right way.

As you obsess on that, and as you get the rest of the team to obsess on that, it’s a lot easier for them to want to participate in the process. They’re not going to look at it as, “That’s just spamming marketing stuff.” Instead, it’s about really figuring out how we connect to the right people.

How do we clear away the friction that prevents them from having a great experience with the product? How do we set the right expectations and drive momentum and bring them in? That obsession on value will get a lot more people interested and actively using the product and get the rest of your team actively participating in the process.

That’s it for some slides. We’ll have some conversation around it.

Alison:  Great.

Thank you. That was an outstanding framework to think about growth hacking. When I was reading about you, it looks like you were one of the people who coined the growth hacking term.

Sean:  With Hiten, actually.

Alison:  With Hiten? OK, I didn’t realize this is all coming full circle here. I want to dig into some of the more tactical questions around this. Is there a certain size that a company needs to be before you layer this team in?

Sean:  What’s interesting is that when I was at Dropbox, we were like 8 or 10 people. At that point, again, it’s less about a team, or an external team being layered on there, and more about everybody being part of that growth team. By having the early people see some of the experiments we were running, once they saw some wins from those experiments, it got contagious.

Everybody started quantifying code a lot more and quantifying features. It relates to Lean startup type things, but it also is just…It gets addictive once you start to see those numbers. Early on, you may not need a big team, but this ad hoc approach can work with a pretty small team.

Alison:  A pretty small team, right. Is there a certain budget that was allocated, or do some of the companies here have budget allocated to run these experiments? Or was the budget coming from each of those functional areas, or elsewhere?

Sean:  Most of the stuff that ended up working for Dropbox, for example, was not dollar driven. We tested a little bit around AdWords and some things, but, ultimately, the stuff that worked really well were just natural sharing in the product and incentivized referral program where we gave away free space, which wasn’t free for us.

Ultimately, there is cost associated with it, but it wasn’t necessarily budget. It was performance based. If it didn’t work, you didn’t do it.

Alison:  Makes sense. It looks like one of the key elements is the fact that you have 500-plus inputs. You must have some kind of system. Tactically, where are people putting their ideas? How are they getting put in there? How does everybody in a company even know to submit ideas?

Sean:  The systems are evolving. Probably the majority of companies today are still using spreadsheets, Google Docs, and a hodgepodge of things. Trello is getting better for it. My team is actually trying to build a system. Some of the things that you saw there are from a private product that we’re working on.

Alison:  I didn’t recognize what that product was. Well then, you can help sell your product now.

[laughter]

Alison:  What are some times that things haven’t worked? I imagine that it’s not just experiments; culturally, in a company, there’s probably some aversion to this until you spread the gospel a little bit. Walk us through some times where that hasn’t worked as well, and what you’ve done to overcome any opposition.

Sean:  One of the things that tends to throw off growth testing is big projects. In my own company, we started with a super-MVP on a WordPress-hosted version of GrowthHackers. We rebuilt it all in Ruby. Suddenly it was like, “Oh, we won’t do testing until we get this thing back out.”

Then there was a redesign and like, “Oh, we won’t do testing until we get this back out.” It’s really easy to break that rhythm. Every company I talk to…it’s when the rhythm breaks, it’s hard to start it back up. Everybody has big product initiatives, so they start pulling the growth engineers into big product initiatives.

Your marketing and growth tends to be a proactive thing rather than a reactive thing. It’s easy to take something away from proactive, but at the end of the day, if you’re not growing, you’re not building a valuable company. It takes a lot of discipline to continue to invest and drive the process.

Alison:  If the CEO is not a champion, can this still happen, growth hacking? Does it really have to come from the very top?

Sean:  It has to come from the very top. The interesting thing that I’ve found is that the CEOs that are really good participants in the process, they don’t want to just throw out ideas because they know their ideas may not be as good as other people’s.

If there’s not a good way of scoring them, like the ICE Scoring system, it’s just basically, “The CEO said we should test it. Drop everything and go test this.”

By having a systematic process of nominations and being able to hone in on what are the best opportunities based on what the data says, it’s a way for the CEO to be able to participate and not dominate the process, but be an active participant in that.

Alison:  You talked about the concept of a growth master, that you need somebody who’s running your meetings. For those people here that want to set up a growth master in their company, what are some of the characteristics of that person? What function might they be in? How long should they have been at the company? What’s been successful in your experience?

Sean:  The most successful ones tend to come out of product organizations and tend to have a lot of really good project management experience. They need to lead the growth meeting, so they’ve got to have a little bit of authority, or some level of being able to manage that.

A lot of them have a pretty good analytics background. Ours is a woman that we hired out of a bigger company, Broadcom. She was used to navigating around challenges in a bigger organization. She’s been really good at managing the process with us. The biggest thing is someone who’s dedicated to the process and is just going to be relentless about keeping it on track.

Alison:  That makes sense. I want to make sure that we get enough time to get some questions from the group. Anybody have any questions right now? I’ll go right ahead there.

Audience Member:  Hi. My question centers around what types of experiments to actually run at the moment. We have a growth team, but I’m struggling to balance the experiments we do with paid acquisition channels and organic acquisition channels versus the experiments we do in product with the flow.

Have you got any percentages? I don’t want to create a culture where we’re just over experimenting, to be honest with you. Have you got any advice on that?

Sean:  In picking the area to experiment within, it does change company by company. If you’ve done almost all external testing, then you probably have a lot of low hanging fruit around retention and activation. The challenge with retention and activation experiments is that generally, you can only do one at a time or you start creating too much noise around the data.

Whereas with acquisition experiments, you can run 20 or 30 on top of each other. It’s not going to mess things up. With acquisition experiments, you can often see your ROI much more quickly. You don’t need to be quite as deliberate about running between the same days and that sort of thing.

Also, if you end a test early, you can model out your return on investment, but then you can go back and check it maybe a month later. In a SaaS product with a long sales conversion cycle, did we get the ROI that we thought we were going to get based on the early models of getting people to a certain point in a trial? Say, three months later, did those people convert at a value that made that a cost-effective channel for us?
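A minimal sketch of that two-pass check, with invented numbers and hypothetical function names (an illustration, not Sean’s actual model):

```python
def projected_roi(spend: float, trials_started: int,
                  est_trial_to_paid: float, est_ltv: float) -> float:
    """Early model: project return from trial starts and assumed rates."""
    return (trials_started * est_trial_to_paid * est_ltv) / spend

def actual_roi(spend: float, paying_customers: int, avg_ltv: float) -> float:
    """Later check: the same channel, with real conversions ~3 months on."""
    return (paying_customers * avg_ltv) / spend

spend = 10_000  # illustrative channel spend for the test period
early = projected_roi(spend, trials_started=500,
                      est_trial_to_paid=0.10, est_ltv=400)
later = actual_roi(spend, paying_customers=38, avg_ltv=400)
print(f"projected {early:.1f}x vs. actual {later:.1f}x")
# If actual falls well short of projected, the channel wasn't as
# cost-effective as the early model suggested.
```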

Alison:  Right over there? Oh, yeah. We’ll go back to you.

Sean:  OK, sorry. [laughs]

Jordan H. Frank:  Hi, there. Jordan H. Frank from Global Learning. I wanted to ask a question about your framework for prioritizing your tests. I see that most of it focused on upside. What are we going to get out of this test? How will it benefit our users? How do you assess risk and downside? What’s the worst that could happen with this sort of test?

Sean:  That’s a really good question. Essentially, it’s not all about the upside particularly when it comes to pop ups and things that can start to create a bit of a spammy experience. That’s why it’s really good to have your VP of Product in the meetings. Have your CEO in the meetings.

Your growth master should be more aggressive than your Head of Product, than your CEO. There should be a little bit of friction there and a time where the people say, “Hey, that’s crossing a line.”

Part of that is in the analysis then, saying, “I think the risk here is that people aren’t going to be coming back. Maybe we get a much higher registration rate, but we’re ruining that first user experience, as a result. The downside is I don’t think they’re going to be coming back at a good rate.”

The data will tell you that. Something we looked at recently in my company was an experiment that was more about a wall bringing people in, but we called the success based on what the six-week retention cohort looks like for people who hit that wall on their first experience.
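As a sketch of that kind of guardrail analysis, with hypothetical data and field names: compare six-week retention for the cohort that hit the wall against a control cohort.

```python
def six_week_retention(cohort: list[dict]) -> float:
    """Share of a signup cohort still active six weeks later.
    Assumes each record carries an 'active_week_6' flag (hypothetical)."""
    return sum(u["active_week_6"] for u in cohort) / len(cohort) if cohort else 0.0

# Hypothetical cohorts pulled from the experiment's event data.
wall_cohort = [{"active_week_6": True}, {"active_week_6": False},
               {"active_week_6": False}]
control_cohort = [{"active_week_6": True}, {"active_week_6": True},
                  {"active_week_6": False}]

delta = six_week_retention(wall_cohort) - six_week_retention(control_cohort)
print(f"six-week retention delta: {delta:+.0%}")
# A higher registration rate isn't a win if this delta is sharply negative.
```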

Alison:  All right. We can go back over…Yeah. [laughs]

Audience Member:  Hi, Sean. Big fan of what you and Anuj Adhiya do at GrowthHackers.

Sean:  Thank you.

Audience Member: Quick question, my forte is top of the funnel, getting people to the site, getting people who are qualified, potentially going to convert. Where have you seen some of your biggest wins come? Is it top of the funnel, middle of the funnel as you’re warming them up, or is it closer to getting them to conversion in terms of ROI? Where have you and your tests…?

[crosstalk]

Sean:  I’m really glad you asked that question, because it is interdependent. I’ll give you a specific example from LogMeIn. In the early days, we were able to scale cost-effectively to about $10,000 a month in spend. We got them to the site, we got them to sign up, but we had so much friction in actually getting them to do a remote control session and start using the free product.

If they never use it, there’s going to be no conversion to premium. We pretty quickly said, “We’re going to hit a wall on acquisition unless we figure out the deeper funnel challenges.” We hit the pause button on customer acquisition and spent four months documenting the funnel, testing like crazy everywhere within it, and gathering qualitative feedback.

When people abandoned at a step, why did they abandon? We used that information to drive a bunch of experiments. We got about a thousand percent increase in the percentage of people who signed up and actually used the product, in a four-month period. What that did externally is that it took the same set of channels that previously scaled to $10,000 a month.

When we went back and tested everything again, it scaled to a million dollars with a three-month payback on marketing investments. There’s a ton of interdependence there.

Alison:  Let’s hope everybody else can enjoy that same kind of success. Awesome.

Sean:  Yeah.

Alison:  Any other questions? Given that we are on the Tactical Stage, are there any specific tools, features, or products that you’ve been leveraging within your website, or others that you’ve relied on?

Sean:  Yeah, particularly around the testing itself. Back when I was at LogMeIn, that kind of work took four months to do, and a lot of it was pretty challenging, launching each test. Where we relied on a team of three or four people at LogMeIn, I can now do it myself with something like Optimizely. Unbounce, the same thing. Page tests can be much easier. Usually, those aren’t necessarily analytics products.

They’re testing facilitation products, if you think that way, with light analytics, but then being able to integrate into something like a Kissmetrics or Mixpanel that’s more of an analytics specialist, where you can get much better retention cohorts tied to that experiment over time.

Yeah, there’s a whole stack. The testing and the analytics are the two big areas and then all the stuff around email as well. MailChimp for a lot of A/B testing. Then even the marketing automation for being able to then automate and manage programs once you figure out what works and what doesn’t work.

Alison:  Maybe as a final question, for those people here who are going to institute this in their own company, what is an expected period of time that it might take to get it up and going, from identifying the growth master to actually starting to see some results?

Sean:  What I like to do, if you’re just getting started with it, is script the first 10 experiments. If you’re deliberate enough about what those first 10 experiments are, you maximize the likelihood that you get enough wins to get organizational buy-in, to get the CEO and everybody else saying that it was worth the effort.

Go in and do the user testing and gather the feedback: when people abandon at different points within your funnel, find out why, and find out how they discover a product like yours. Just get some of that data upfront so you can be really deliberate about your first 10 tests. Then, based on those first 10 tests, you’ll find that one of them is going to way outperform the others.

That’s going to be a vein of gold in which you want to keep doubling down and doing a lot of other tests like that. I can usually script and run the first 10 tests within 30 days for a business.

Alison:  That’s excellent. We are out of time. Thank you so much, Sean. A round of applause for…

[View the slides here.]
