The AI revolution is upon us, with companies worldwide eager to dive into the technology popularized by ChatGPT. To satisfy their generative AI needs, firms must secure cutting-edge software and vast computing power, leading them to the doors of the global cloud computing giants, the largest of which is Amazon’s AWS.
Those computing resources are housed in data centers scattered across the globe, and Prasad Kalyanaraman, Vice President of AWS Infrastructure, is the man responsible for keeping them operational. Spread across numerous regions, AWS data centers serve as a backbone of the online world, with Microsoft and Google as the company’s closest competitors.
With the generative AI revolution accelerating, Kalyanaraman must ensure that the data center battalions are prepared for the challenge. “It takes a significant amount of grit and innovation” to meet the current computing demands, Kalyanaraman told AFP during an interview at Amazon’s second headquarters near Washington. “Building the right technology, both in terms of consuming the least amount of power needed and optimizing all the way from the chip level to the data center level… requires a lot of innovation,” he said.
Kalyanaraman, a graduate of the prestigious Indian Institute of Technology and Queen’s University in Canada, has been at Amazon for nearly two decades. He initially worked on software before taking charge of the data centers. “Most users, unbeknownst to them, are using cloud computing today. If you go to a website, stream video, or check your financial transactions, you’re actually using some form of cloud computing,” he explained.
Amazon’s decision to venture into the cloud dates back to 2006, when the company realized that its partners and sellers didn’t want to build or buy expensive computing infrastructure. “We saw that it’s so hard for our customers to go through all the muck of building this infrastructure. So why not bring this to them,” much as utilities bring power to the home, he said. Nearly two decades later, AWS accounts for close to 20 percent of Amazon’s total revenue and about two-thirds of its total profit.
“It’s a pretty significant undertaking to actually construct a data center from scratch,” Kalyanaraman said. “First, obviously, we have to find enough land to deploy these data centers. Typically, we deploy further away from metropolitan locations” for both cost and environmental reasons, he added. Connectivity is also crucial, since most clients want the fast response times that come from being closer to their data. Then a power source must be secured, along with the transmission lines to deliver the electricity.
With success comes scrutiny, and in some communities around the globe, frustration with the proliferation of data centers, which can disrupt an area’s scenic beauty and place a massive burden on already fragile electricity grids. Even so, with the emergence of generative AI, Amazon has announced new data center projects worldwide.
Kalyanaraman acknowledged that “power will be a constrained resource in the world today, especially with generative AI and some of the other things that are required to run this amount of compute.” However, “it’s not something that you can actually change overnight,” he said. AWS has worked with power companies to manage the electricity supply, notably by turning to renewable energy.
AWS “is the largest purchaser of renewable energy in the world today. And that’s for four years in a row,” he said, with Amazon committed to becoming a net-zero carbon company by 2040. Ever the techno-optimist, Kalyanaraman remained confident that innovation could meet the generative AI challenge, with the industry looking to nuclear energy to help. “Every time we’ve actually had a constraint, we’ve all figured out a way of innovating,” he said. “I see (AI) as an opportunity.”
Source: AFP