The Rise of AI: A Driving Force Behind Data Center Expansion

by | May 1, 2024 | Energy & Sustainability, Industry Info

Artificial Intelligence (AI) has seen a meteoric rise in use and headlines. Behind all this buzz, the demand for data centers is also growing at an unprecedented rate. What role does AI play in this surge and why?

Data Center Market Growth

According to McKinsey, data center construction is expected to grow roughly 10 percent annually through 2030, and some experts predict even higher growth of about 20 percent annually for the hyperscale data center market. At the end of 2023, there were approximately 8,000 data centers worldwide, with roughly one-third of them in the United States. Before exploring the reasons behind these aggressive growth projections (note that there has been significant growth in recent years as well), let’s consider the purpose of a data center. Data centers are essentially facilities that house servers, routers, and other IT infrastructure; the equipment stores and processes data. A data center may occupy part of a building or stand as an independent structure.

AI’s Role in the Data Center Market

Experts point to AI as a driving factor behind the need for more data centers. Still, demand for data centers was outpacing supply well before AI became the elephant in the room (the field of AI itself dates back to the 1950s).

“Supply was already struggling to keep pace with surging demand for data centre space in major metropolitan areas, prior to the mass adoption of AI tools,” according to Schroder Investment Management. “Supply chain bottlenecks and a lack of distributable power have led to low vacancy rates and significant pricing power for data centre owners in these locations.” In other words, the data center market was primed for growth even without the AI boom. Still, much of the current wave of data center building and upgrading is tied directly to AI.

According to the International Energy Agency (IEA), data centers accounted for approximately 460 terawatt-hours (TWh) of electricity in 2022, or about two percent of worldwide consumption. The IEA projects that data centers could consume more than 1,000 TWh of electricity by 2026. For context, the IEA notes, “This demand is roughly equivalent to the electricity consumption of Japan.”
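For a sense of scale, here is a quick back-of-envelope calculation in Python that uses only the IEA figures quoted above; the script and its variable names are purely illustrative.

```python
# Rough check using only the IEA figures cited in this article.
DATA_CENTER_TWH_2022 = 460        # data center consumption in 2022 (TWh)
DATA_CENTER_SHARE_2022 = 0.02     # ~2% of worldwide electricity in 2022
PROJECTED_TWH_2026 = 1_000        # IEA projection for 2026 (TWh)

# What the ~2% share implies about total worldwide electricity use in 2022.
implied_world_twh = DATA_CENTER_TWH_2022 / DATA_CENTER_SHARE_2022

# How much data center demand would grow between 2022 and 2026.
growth_factor = PROJECTED_TWH_2026 / DATA_CENTER_TWH_2022

print(f"Implied worldwide electricity use in 2022: ~{implied_world_twh:,.0f} TWh")
print(f"Data center demand growth, 2022 to 2026: ~{growth_factor:.1f}x")
```

Taken together, the IEA figures imply data center electricity demand more than doubling in roughly four years.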

Some of the more common AI tools driving the need for data are ChatGPT and Google’s Bard (now Gemini). These hugely popular tools are built on large language models (LLMs).

Let’s turn to Amazon Web Services (AWS) to explain LLMs.

Large language models (LLM) are very large deep learning models that are pre-trained on vast amounts of data. The underlying transformer is a set of neural networks that consist of an encoder and a decoder with self-attention capabilities. The encoder and decoder extract meanings from a sequence of text and understand the relationships between words and phrases in it. … Large language models … can perform completely different tasks such as answering questions, summarizing documents, translating languages, and completing sentences. 

Note: Generative AI tools such as ChatGPT and Bard are built on LLMs, though not every AI system uses them.
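To make the AWS description a little more concrete, below is a minimal, illustrative sketch of the self-attention step it mentions, written in Python with NumPy. The tiny dimensions and random weight matrices are hypothetical stand-ins for the billions of learned parameters in a production LLM.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of token embeddings X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv            # project tokens to queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])     # how strongly each token attends to every other
    weights = softmax(scores, axis=-1)          # each row is a probability distribution
    return weights @ V                          # each output is a weighted mix of value vectors

# Tiny demo: 4 "tokens" with 8-dimensional embeddings and random (untrained) weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)      # -> (4, 8)
```

In a real transformer this operation is repeated across many attention heads and dozens of layers, and it is exactly this kind of dense matrix math, run over enormous training datasets, that keeps the GPU racks in AI data centers busy.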

AI is Power-Hungry

AI is used for many, many things; a quick Google search on ‘AI uses’ confirms the technology is ubiquitous. Nonetheless, the question that still needs answering is, ‘Why does AI require more data centers?’ AI models require massive amounts of data and computing power, both to train and to serve responses on demand, and that volume is changing the requirements of data centers. In the past, “Data centers … [were] built around CPU-powered racks to tackle traditional computing workloads,” according to Data Center Dynamics. “However, AI compute instead requires GPU-powered racks, which consume more power, emit more heat, and occupy more space than an equivalent CPU capacity.”

The changes due to AI are affecting the design of data centers, not just their quantity. “As computing power and chip designs advance, equipment racks double in power density every six to seven years. … Densification of AI server clusters requires a shift from air to liquid cooling, bringing challenges such as site constraints, obsolescence risks, installation complications, and limited sustainable fluid options,” notes the online publication T_HQ. “Specialized cooling methods like rear door heat exchangers also become necessary to maintain redundancy and efficiency.”

The Wall Street Journal compares the power density of the racks AI data centers require with that of racks in conventional data centers: 50 kilowatts or more per rack, versus roughly 7 kilowatts. “That means AI data centers need to be built with added infrastructure capable of supplying a much higher amount of power.” Other needs and challenges for new (or retrofitted) data centers include amplified requirements for power redundancy and resilience, along with added pressure to ensure robust connectivity and low latency.
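A rough, illustrative calculation shows what those per-rack figures mean for a facility. The 10 MW IT load below is a hypothetical example, not a figure from the article.

```python
# Back-of-envelope comparison using the per-rack power figures cited above.
CONVENTIONAL_KW_PER_RACK = 7      # typical conventional rack (WSJ figure)
AI_KW_PER_RACK = 50               # AI rack (WSJ figure, "50 kilowatts or more")
FACILITY_MW = 10                  # hypothetical critical IT load for one data hall

facility_kw = FACILITY_MW * 1_000
conventional_racks = facility_kw // CONVENTIONAL_KW_PER_RACK
ai_racks = facility_kw // AI_KW_PER_RACK

print(f"Per-rack power ratio: ~{AI_KW_PER_RACK / CONVENTIONAL_KW_PER_RACK:.1f}x")
print(f"A {FACILITY_MW} MW hall supports ~{conventional_racks} conventional racks,")
print(f"but only ~{ai_racks} AI racks at the same total power budget.")
```

Put differently, the same electrical capacity serves roughly one-seventh as many AI racks, which is why power supply, redundancy, and cooling dominate new designs.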

The new data centers are consuming enormous amounts of energy. Can the grid handle it? How does that square with the push to be more green? These and other issues remain. AI is already having a significant impact on many facets of people’s lives, and that impact continues to grow. Data centers are essential if AI is to continue its growth.

To learn more about Miller Electric’s Data Center Solutions and Services, CLICK HERE.
