AI is touted as the greatest thing since the invention of the wheel, but you can be forgiven if you don’t have a clue as to what it means or what to do with it. After all, the frenzied pace of AI-related news is dizzying, making it hard to filter signal from noise.
Every day sees a new large language model (LLM) released, some from companies (e.g., Moonshot AI) raising amounts that seem unhinged from reality (e.g., $1 billion). Every day a different LLM leapfrogs incumbents on performance or functionality. A few weeks ago it was Meta, but last week it was Google’s Gemini dunking on ChatGPT. Even completely non-AI-related things (like power chargers!) are getting AI labels slapped on them.
And yet, the reality is that most enterprises still aren’t doing meaningful things with AI.
That is not to say they won’t. But a big problem for AI is its torrid pace of innovation. It’s hard for even the savviest of observers to keep up with AI right now. I spoke to an experienced data scientist last week and asked her how she makes sense of all the AI noise. Answer? She doesn’t. Or can’t.
What should you do? To get grounded in our AI future, it’s worth looking back at how top companies made sense of the cloud, and, in particular, how AWS helped make it happen.
Cloud is key
The first step toward grokking AI is cloud because it allows you to tiptoe your way in (if you wish). Years ago, then AWS data science chief Matt Wood told me that the key to taming big data (the term we used before data science, which was the term we used before AI) was to tap into elastic infrastructure. As he put it, “Those that go out and buy expensive infrastructure find that the problem scope and domain shift really quickly. By the time they get around to answering the original question, the business has moved on.”
Sure, you’ll hear from people like 37signals co-founder David Heinemeier Hansson, who likes to criticize the cloud as expensive. This is nonsense. Cloud repatriation might work for a slow-growing company like 37signals with very predictable workloads, but it’s the absolute wrong strategy for a company where demand isn’t predictable, which is almost the dictionary definition of any AI-related workload. There’s nothing more expensive than infrastructure that constrains your ability to meet customer demand.
Back to Wood: “You need an environment that is flexible and allows you to quickly respond to changing big data requirements.” Again, this is particularly true for AI, where most workloads will be experimental in nature. According to Wood, “Your resource mix is continually evolving—if you buy infrastructure, it’s almost immediately irrelevant to your business because it’s frozen in time. It’s solving a problem you may not have or care about anymore.”
Again, the key to getting started with AI is to ensure you’re building with cloud, as it will enable the requisite flexibility to experiment your way toward success.
What comes next?
Cloud’s elastic infrastructure enables companies to place big bets without breaking the bank. As then AWS CEO (and current Amazon CEO) Andy Jassy noted in a 2019 interview, the companies that have the most success with cloud are those that “flip the switch” and go big, not incremental, in their approach. Translated to our AI era, the point is not to think small but rather to “take risks on new business ideas because the cost of trying a bunch of different iterations of it is so much lower…in the cloud,” as he suggests.
It’s fair to counter that AI is overhyped, but Jassy would likely still argue (as he did in the interview) that the cost of playing it conservative is to be displaced by a more nimble, AI-driven startup. As he says, “[Enterprises] have to think about what do their customers want and what’s the customer experience that’s going to be the one that’s demanded over time. And, usually, that requires a pretty big change or transformation.” This is certainly the case with AI.
In short, cloud enables enterprises to make big bets in an incremental way.
This brings us to the question of who should drive those big-but-incremental bets. For years developers were the locus of power, rapidly innovating with open source software and cloud infrastructure. That’s still true, but they need help, and that help needs to come from the CEO, Jassy stressed. “Most of the big initial challenges of transforming the cloud are not technical,” he says, but rather “about leadership—executive leadership.” Developers are amazing at figuring out how to get things done, but having a mandate from the CEO gives them license to innovate.
Make it easy for me
What about vendors? It strikes me that the big winner in AI will not be the company that creates the most sophisticated LLM or develops the most feature-rich vector database. No, it will be the company that makes it easiest to use AI.
This isn’t new. The big winner in cloud was AWS, because it made it easier for enterprises to use cloud services. The big winner early on in open source/Linux was Red Hat, because it removed the complexity associated with running Linux. Google wasn’t first to develop search capabilities, but it was first to remove the bother associated with it. GitHub wasn’t first to give developers a way to store and share code, but it was first to make it work for developers at scale. Etc.
We need this for AI. Yes, enterprises can feel their way to AI success through cloudy experimentation, but the big winner in AI is probably not going to be OpenAI or whoever is creating yet another LLM. My money is on the company that makes it simple for other companies to use AI productively. Game on.
Copyright © 2024 IDG Communications, Inc.