GitHub, founded in 2008, is a leading platform for software development and version control, and it has made waves since 2021 with GitHub Copilot, its AI pair programmer. Microsoft acquired GitHub in 2018, and the platform has since grown to over 100 million developers and more than 420 million repositories, including at least 28 million public repositories, making it the world’s largest source code host.
At this year’s SaaStr AI Summit, GitHub CRO Elizabeth Pemmerl shared how to bring AI products to market at scale successfully. With 90% of the Fortune 100 on GitHub and GitHub Copilot accounting for over 40% of GitHub’s revenue growth this year, these real-world examples will also help you launch an AI-powered product at scale.
Be Customer Zero

The GitHub Copilot extension is an AI pair programmer that helps developers write code faster and smarter. As a code completion tool, it works as developers type: Copilot suggests the rest of the code based on the prompt. Since its launch less than five years ago, Copilot has grown to 77,000 organizations and almost 2 million users.
So what does Elizabeth mean by being customer zero? Copilot started from an internal interest at GitHub in helping its developers be more productive, be happier, and stay in their flow state longer. First and foremost, they wanted to build a product they would use themselves. As GitHub’s own developers fell in love with it, the broader team realized there was a SKU, and an opportunity, to launch a new product.
Once they brought it to market, they knew they needed better user admin features, onboarding capabilities, SSO integration, etc.
Elizabeth explained: “We continued doggedly poking at this problem of where else can Copilot help developers be more creative and more satisfied, and so we integrated Copilot then across the command line, across our pull request features, across issues and documentation, and that became the basis for Copilot Enterprise.”
Another point of differentiation for GitHub and Elizabeth was staying tightly focused on the problems they were trying to solve. That discipline became incredibly important as they scaled AI tools, given the sheer number of use cases they could tackle and the temptation to apply AI to every problem. GitHub stayed focused on the problems it set out to solve and didn’t get distracted.
Create Strong Feedback Listening Mechanisms

You need a strong feedback listening mechanism when you have a high volume of feedback in a dynamic space like AI. GitHub focused on three major components for its listening system.
- Ensuring the process was as seamless and friction-free as possible.
- Keeping the door as wide as possible to get feedback.
- Being disciplined about prioritizing that feedback and communication.
Let’s touch on the process component. Historically, the burden of capturing customer feedback fell on the solutions engineers and CS architects, who would face a long issue template with 47 fields to fill in with what the customer said. That captured only about 30% of the feedback.
GitHub blew that apart and created a simple chat command. Now, it’s seamlessly in their natural workflow. From there, it’s triaged in a central project board, where they can tag, aggregate, and add value weekly for better decision-making. Automating feedback and insights provides tremendous velocity and time back.
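GitHub doesn’t share the internals of its chat command, but the pattern is easy to sketch. Below is a hypothetical Python version, assuming a dedicated triage repository (`your-org/feedback-triage` and all function names are made up): one function shapes a raw chat message into a labeled issue payload, and another files it through GitHub’s public REST “create an issue” endpoint so it lands on a central board for triage.

```python
import json
import urllib.request

# Hypothetical sketch: GitHub has not published its internal feedback tooling.
# This models the pattern described above: a lightweight command captures
# feedback from a chat message and files it as a labeled issue, so triage can
# happen later on a central project board instead of a 47-field template.

FEEDBACK_REPO = "your-org/feedback-triage"  # assumed repo name

def build_feedback_issue(text: str, reporter: str, product_area: str) -> dict:
    """Turn a raw chat message into an issue payload ready for triage."""
    return {
        "title": text[:72],  # short title taken from the feedback text
        "body": f"**Reporter:** @{reporter}\n\n{text}",
        "labels": ["customer-feedback", f"area:{product_area}"],
    }

def post_issue(payload: dict, token: str) -> None:
    """File the issue via GitHub's public REST API (POST /repos/.../issues)."""
    req = urllib.request.Request(
        f"https://api.github.com/repos/{FEEDBACK_REPO}/issues",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)  # raises on non-2xx responses

# A chat-bot handler would call something like:
payload = build_feedback_issue(
    "Copilot suggestions lag in large monorepos", "sales-eng-1", "copilot"
)
print(payload["labels"])  # ['customer-feedback', 'area:copilot']
```

Keeping the payload builder separate from the HTTP call means the tagging and aggregation logic can be tested without network access, which matters once weekly triage depends on consistent labels.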
A New GTM Mental Model for AI Products
That feedback told GitHub that people who weren’t using the GitHub platform still wanted to use Copilot for Business. GitHub had to decide whether to offer this to non-GitHub users or not. They decided to move in this new direction where anyone could use it, and it created a mental model for thinking about AI products.
First, they had to decide whether this code completion capability was part of the native platform or could stand as an independent SKU. They made it its own SKU that could be opened up into new and adjacent markets. They ended up shipping Copilot for Business for non-GitHub Enterprise users, which was a commercial success.

Other places this feedback loop worked well were:
- Adoption loop metrics. Customers crave metrics about how teams are using all AI products.
- Deciding where you don’t want to spend your time.
- Figuring out how partners could fill the gaps based on customer feedback.
Adjusting Your Assumptions

Now you have strong product-market fit, you’re listening to your customers, and you’re customer zero. You have to listen on both the product front and the GTM front. Eighteen months ago, no playbook existed for taking an AI product to market.
GitHub assumed out of the gate that junior developers would love Copilot because it would quickly make early talent more productive. In reality, there was far more excitement about giving time back to senior developers, who could craft more sophisticated prompts and focus on even tougher problems for the company once they weren’t caught up in rote administrative tasks.
Sometimes, you have to change what you thought the initial value proposition was. Two of GitHub’s seven core company principles applied here.
- Ship to learn
- Embrace a growth mindset
Their core ethos is being iterative and a little uncomfortable. You want people who have stuck with it when the year was lean or went through a messy acquisition and came out on the other side. You need to be able to react quickly to what’s happening in the market and make adjustments as you go.
Other Learnings in the GTM Journey

GitHub has had to adjust to new sales stages and new buyer personas. Conversations centered on legal, privacy, and models proliferated, stretching these sales stages longer than the team had ever experienced.
You might have to take a new approach to feedback to keep up with demand. For example, you might send your legal team on a road show, hosting round tables in major cities worldwide. Have frank conversations with people about what AI is, what it isn’t, and what you should and shouldn’t be concerned about.
New buyer personas are emerging, like AI Transformation Officers or Chief AI Officers, so you need to keep an eye on where role responsibilities and budgets lie over the next couple of years.
Relentlessly Quantify Value
“I’ve never seen from customers such a voracious appetite to ensure ROI from a product,” Elizabeth says. They want to know what’s happening from day one. This is hard to do well, and you need data from the product and context from the customer to understand what they really care about, which is different for everyone.
Then, package that up into a narrative that makes sense. Don’t underestimate the investments you need to make here to be compelling to your customers.
Just as much as customers want to know ROI, they also want to tinker with Copilot for their use case and team. This is exciting but also taxing on your org, especially when you’re small.
You have to get opinionated about which POCs are worth supporting and how you will measure their impact. Sometimes you have to say no or push customers into a specific package, and the same is true post-sale for adoption. Customers say they’ll let interested developers use it, but you really need to get everyone going.
Be prescriptive about the adoption journey.
Driving Growth at Scale

If you’re lucky, this massive business you’re building is showing signs of strain in certain areas because of rapid growth. The next horizon is scaling. Your greatest asset is the hours of experts on your teams.
How do you make your sellers as productive as possible when they can’t possibly keep up with the demand? You have to put as much as possible into self-service channels, enable partner networks to help, record things, and drive a one-to-many approach.
In times of massive scale, it’s an opportunity to get more opinionated about your pipeline because you have a lot coming in. You want to get a baseline to measure and attribute it and identify the levers to pull around it.
Another big investment during rapid growth is digital moments. You need customers to do everything themselves, on demand, without human intervention. You can brute force these things initially, but it doesn’t work at scale. It collapses under process and administrative technical debt created over the years.
Using AI to Handle Rapid Growth
Internal processes and experiences like support can get overloaded by scale. GitHub turned to Copilot to help with the support case queue. Based on its success, Copilot is now deployed across many support surfaces, deflecting about 60% of tickets and saving significant hours and hires.
The support team can focus on more complex problems. The Copilot-for-support implementation has been a winner, with partner strategy close behind. You might arrive at a point where you can no longer support the weight of your own GTM, like sales and post-sales adoption motions.
You need to access new channels and geos, and partners can come in behind you to ensure customers are adopting the product and their ARR is increasing year-over-year. You want to invest time and energy here so that partners are ready to talk to customers.
Double Down on Fundamentals

There’s a lot of noise when you’re scaling fast. How do you help ensure you’re driving signal through all this noise and change? GitHub took all the GTM motions and conversations across the company and organized them through the lens of three revenue plays.
- Land
- Secure
- Accelerate
The revenue plays become the basis for enablement and PMM playbooks, the core of the product roadmap, how finance reports on the business, how MBRs and QBRs are run, and how pipeline is measured. The simple act of naming these plays creates clarity and alignment around them end-to-end across the business.
Key Takeaways
- Keep customers at the center of the journey of scaling an AI product to millions of users.
- Be nimble in your GTM strategy and pivot based on feedback from functional loops between customers, revenue, partners, engineering, product, and design.
- Invest in scale engines to make sure you’re ready to get to the next million users of your product.
