Amazon’s cloud unit said Thursday that it’s allocating $100 million for a center to help companies use generative artificial intelligence, the technology that’s taken off in the months since OpenAI unleashed its ChatGPT chatbot on the public.
It’s a small investment for a company with $64 billion in cash and half a trillion dollars a year in operating expenses. But the announcement shows that Amazon Web Services recognizes the significance of the current moment in generative AI and the importance of being in the conversation, alongside rivals Microsoft and Google.
“You ask yourself the question — where are the different runners three steps into a 10K race?” AWS CEO Adam Selipsky said in an interview this week with CNBC. “Does it really matter? The point is, you’re three steps in, and it’s a 10K race.”
As part of the latest announcement, Amazon said it will be adding data scientists, engineers and solutions architects to the payroll. AWS said the center is already working with Highspot, Twilio, Ryanair and Lonely Planet. The company told CNBC that it’s a “program” rather than a physical center.
Amazon, which beat Microsoft and Google to the business of renting out servers and data storage to companies and other organizations, enjoys a commanding lead in the cloud infrastructure market. However, those rivals have had splashier entrances into generative AI, even though Amazon has drawn broadly on AI for years to show shopping recommendations and operate its Alexa voice assistant.
Microsoft has been spending billions on a multilayered alliance with OpenAI, and Google is moving quickly to deploy AI tools it’s built in-house for consumers and businesses.
Amazon also lacks a breakout large language model of its own to power a chatbot or a tool for summarizing documents.
Selipsky said he isn’t concerned. He joined the company in 2005, a year before the launch of AWS’ core services for computing and storage. Echoing Amazon founder and longtime CEO Jeff Bezos, Selipsky said the company has succeeded by listening to customers.
“Amazon has had many examples in its history where it said, we’re going to focus on customers and have steadfast belief that we’re going to work with customers, we’re going to build what they want,” Selipsky said. “And if people want to perceive us in a certain way, we’re misunderstood, that’s OK, as long as customers understand where we’re going.”
One challenge Amazon currently faces is in meeting demand for AI chips. The company chose to start building chips to supplement graphics processing units from Nvidia, the leader in the space. Both companies are racing to get more supply on the market.
“I think the whole world has a shortage in the short term of compute capacity for doing generative AI and machine learning in general right now,” Selipsky said. People are impatient, and the situation will improve in the next few months, he added.
Selipsky is also reckoning with a slowdown in customer spending on cloud, as businesses prepare for ongoing economic uncertainty.
“A lot of customers are largely through their cost optimization, but there have been other customers who are still right in the middle of it,” he said. “It’s hard to predict exactly when that particular trend will be over. But we’re still in the middle of it.”
Still, the AI trend is real, he insists. For Amazon, that momentum applies to its Bedrock generative AI service and its Titan models as well as the new innovation center.
“AI is going to be this next wave of innovation in the cloud,” he said. “It’s going to be the next big thing that pushes even more customers to want to be in the cloud. Really, you need the cloud for generative AI.”
Selipsky also argues that AWS brings a credibility to generative AI that eludes others in the space.
“I can’t tell you how many Fortune 500 companies I’ve talked to who banned ChatGPT in the enterprise,” Selipsky said. “Because at least the initial versions of it just didn’t have that concept of enterprise security.”