Should we care about AI infrastructure when building SaaS applications? J.D. Weinstein, Head of the Global VC Practice at Oracle, and Aaref Hilaly, Managing Partner at Bain Capital Ventures, take the stage at SaaStr Annual to answer this question and share what to expect from the future of AI.

What Is AI Infrastructure

Infrastructure is everything behind a ChatGPT-like experience. It starts with silicon chips, GPUs, and data centers. Then there are models like OpenAI's GPT-4 and Anthropic's Claude. After that, you have infrastructure around the models that helps you pick the right one or manage the data you feed into them. And finally, developer tools are built on top of the models.

At the application layer, things have evolved well. How?

  1. There isn’t just one model. There are several, so you can choose between them.
  2. OpenAI, Google, and Anthropic have billions of dollars to invest in training their models, so you get more powerful engines under the hood at no training cost to you.
  3. You do, however, pay for inference. Every time you use a model, there are electricity, GPU, and other hard-dollar costs.
  4. The price of inference has plummeted this year, so you can build more powerful models into your application, and they cost less every time you use them (a rough cost-comparison sketch follows this list).
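As a rough illustration of what that choice looks like in practice, here is a minimal sketch that compares the estimated cost of the same request across several model tiers. The model names and per-token prices are made-up placeholders, not real price sheets.

```python
# Minimal sketch: comparing estimated inference cost across model options.
# All model names and prices are illustrative placeholders, not real rates.

PRICE_PER_1K_TOKENS = {
    "small-fast-model": 0.0005,   # hypothetical $/1K tokens
    "mid-tier-model": 0.003,
    "frontier-model": 0.015,
}

def estimate_cost(model: str, prompt_tokens: int, output_tokens: int) -> float:
    """Estimate the dollar cost of one request for a given model."""
    total_tokens = prompt_tokens + output_tokens
    return PRICE_PER_1K_TOKENS[model] * total_tokens / 1000

if __name__ == "__main__":
    # Compare the same workload across the available models.
    for model in PRICE_PER_1K_TOKENS:
        cost = estimate_cost(model, prompt_tokens=1200, output_tokens=400)
        print(f"{model}: ~${cost:.4f} per request")
```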

We’re Very Early in the AI Cycle

We’re still in the very early stages of this AI platform shift. In previous cycles, like when the internet came about, infrastructure buildout typically preceded applications. For example, in the 90s, Cisco sold routers and switches like hotcakes as the internet backbone and infrastructure were being built out.

It wasn’t until years later that Workday, Salesforce, and a whole generation of SaaS companies came along to build on top of that infrastructure. Similarly, in the mobile shift, the iPhone was released in 2007, but breakout mobile-native apps like Uber and Snapchat didn’t arrive until a few years later.

We’re at a similar stage now: there is massive expenditure on everything from data centers to model training, but nowhere near the penetration we will eventually see at the application level. We’re still inside a two-to-three-year pocket where infrastructure is being built out, so we haven’t yet seen how the application layer will fully leverage this new AI infrastructure.

Infrastructure Buildout Isn’t Trivial

People are building novel and interesting applications, essentially playing with the technology and getting used to it. Let’s step back and look at how meaningful this compute buildout is.

Previously, we had enough data centers to meet most CPU computing needs. Then we suddenly hit a major friction point with chips in the supply chain. Today, it’s all about having enough raw physical power to run artificial intelligence.

J.D. Weinstein, Head of the Global VC Practice at Oracle, explains: “To give you a general rule of thumb, these AI infrastructure data centers are about 200 or 300 megawatts. One megawatt powers about a thousand homes, and a good rule of thumb is that it’s about $3 million to build out each megawatt. So you can do the math on how serious it is that we’re now scaling up to gigawatt data centers. Oracle just announced publicly that we’re working in Europe, where we have secured permits to use three small nuclear reactors to power a gigawatt data center across the pond.”
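To make that back-of-the-envelope math concrete, here is a quick sketch using only the rule-of-thumb figures quoted above; the per-megawatt cost and homes-per-megawatt numbers are the speaker's approximations, not official figures.

```python
# Back-of-the-envelope math using the rule-of-thumb figures from the quote above.

COST_PER_MEGAWATT = 3_000_000   # ~$3M to build out each megawatt (approximate)
HOMES_PER_MEGAWATT = 1_000      # one megawatt powers roughly a thousand homes

def buildout_cost(megawatts: float) -> float:
    """Rough buildout cost in dollars for a data center of a given size."""
    return megawatts * COST_PER_MEGAWATT

for label, mw in [("Typical AI data center (300 MW)", 300),
                  ("Gigawatt-scale data center (1,000 MW)", 1_000)]:
    print(f"{label}: ~${buildout_cost(mw) / 1e9:.1f}B buildout, "
          f"power for ~{mw * HOMES_PER_MEGAWATT:,} homes")
```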

To take advantage of everything AI makes possible, we still need the power to run it.

Ways to Build AI Into Your Business

The natural inclination is to tack AI onto existing products. Think of Notion: it now has an assistant that drafts and writes tasks and notes. Everyone will look to do something similar, and to some extent that’s useful, but what would an AI-native experience look like within your domain expertise?

AI products will ultimately look and feel radically different from products today. Think of the concept of a menu on a website. You click through to find what you’re looking for. Menus might be eliminated because an AI-powered application has the intelligence to know what you’re trying to do and then take you there.
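As a hedged sketch of what “no menus” could mean in practice, the snippet below routes a user’s natural-language request straight to an application view. The route names are hypothetical, and `classify_intent` is a stub standing in for whatever LLM you would actually call.

```python
# Sketch of AI-driven navigation: map a free-text request to an app view
# instead of making the user click through menus. The model call is a stub;
# a real implementation would replace it with your LLM provider of choice.

ROUTES = {
    "create_invoice": "/invoices/new",
    "view_reports": "/reports",
    "manage_team": "/settings/team",
}

def classify_intent(user_message: str) -> str:
    """Placeholder for an LLM call that returns one of the route keys."""
    # A real version would send ROUTES.keys() and user_message to a model.
    message = user_message.lower()
    if "invoice" in message:
        return "create_invoice"
    if "report" in message:
        return "view_reports"
    return "manage_team"

def navigate(user_message: str) -> str:
    intent = classify_intent(user_message)
    return ROUTES.get(intent, "/home")

print(navigate("Create an invoice for last month"))  # -> /invoices/new
```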

Pricing Will Move Away From Per Seat

You can’t just add AI to your product for free, because there’s a real dollar cost to adding it to your existing offering. You have to pay for inference, and because AI often replaces a seat rather than filling one, per-seat pricing may no longer make sense.

Because there’s no precedent or “standard” for AI pricing right now, whatever you choose will inevitably change. But you also can’t give away something that will crush your margins. So some kind of separate charge for AI features that require inference is needed.
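As a rough illustration of why inference can’t quietly be bundled into a flat seat price, here is a minimal margin sketch comparing free-bundled AI with a separate usage-based charge. Every number below is a made-up assumption, not a benchmark.

```python
# Illustrative margin math: flat per-seat price vs. a separate usage-based AI charge.
# All figures are made-up assumptions for illustration only.

SEAT_PRICE = 50.0               # $/seat/month
INFERENCE_COST_PER_CALL = 0.02  # hard-dollar cost per AI call
CALLS_PER_SEAT = 3_000          # heavy AI usage per seat per month

def margin_per_seat(ai_fee_per_call: float) -> float:
    """Gross margin per seat when AI usage is billed at the given fee per call."""
    revenue = SEAT_PRICE + ai_fee_per_call * CALLS_PER_SEAT
    cost = INFERENCE_COST_PER_CALL * CALLS_PER_SEAT
    return (revenue - cost) / revenue

print(f"AI bundled free into the seat: {margin_per_seat(0.0):.0%} gross margin")
print(f"Usage-based AI charge:         {margin_per_seat(0.03):.0%} gross margin")
```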

What Are We Most Underprepared For With AI

“I don’t know if everyone realizes how much better these models are going to get over the next couple of years,” Aaref shares. Five years ago, there was essentially nothing. Today, the models are relatively good. Expect an equally big jump a couple of years from now.

AI is good at summarizing information right now and is on the cusp of reasoning. In the next couple of years, AI might be able to break down tasks, make plans, and then execute on them for you. Software will move from helping humans be more productive to doing things for them entirely.
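As a hedged sketch of what “break down tasks, make plans, and execute” could look like, here is a minimal plan-and-execute loop. Both the planner and the executor are stubs standing in for real model and API calls.

```python
# Minimal plan-and-execute loop. The planner and executor are stubs: a real agent
# would call an LLM to produce the plan and real tools or APIs to carry out each step.

def plan(goal: str) -> list[str]:
    """Placeholder planner; a real one would ask a model to decompose the goal."""
    return [f"research: {goal}", f"draft: {goal}", f"review: {goal}"]

def execute(step: str) -> str:
    """Placeholder executor; a real one would invoke tools or APIs for each step."""
    return f"done -> {step}"

def run_agent(goal: str) -> None:
    for step in plan(goal):      # break the task down into steps
        result = execute(step)   # carry out each step
        print(result)            # report progress back to the user

run_agent("prepare a quarterly sales summary")
```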

To learn more about Oracle’s AI infrastructure, click here.
