
Why data center industry operators need to keep up with growing AI demand

Data center industry participants can develop new strategies to address the unique needs of AI as the technology goes mainstream and demand grows.


In brief:

  • AI-driven applications, including their underlying compute and storage requirements, are positioned to drive the next phase of data center industry growth.
  • Questions for operators remain around location priorities, technical facility requirements and eventual market structure.
  • Developing proactive, flexible strategies around expansion optionality, facility design and energy usage is essential for the data center industry.

While artificial intelligence (AI) has entered mainstream awareness, it is still in its infancy in terms of size, reach, and impact. This is creating pressure on the data center industry to stay ahead of demand by planning for multiple potential scenarios around performance expectations, government regulation, and the future shape of the AI industry.

AI image generators and natural language processors have captured the imaginations of millions of people around the world. These breakthroughs are the evolutionary product of decades of research and programming. However, several compelling products launched in the past several months, such as ChatGPT, have led to near-overnight adoption by millions of users.

Building on previous paradigm-shifting technologies such as mobile and cloud, AI represents the next era of potential demand growth and disruption for the data center industry. Already, some companies are looking to reconfigure their data centers to account for the additional computing power – and related energy consumption and cooling designs – that AI requires. All data center industry participants, however, need to be proactive in approaching this trend, building strategies to address the shifting landscape and staying ahead of future requirement evolutions.

Figure 1: The evolution of AI models

AI is the product of the evolution of increasingly dynamic and self-optimizing models that are made possible via the explosion in the amount of data available for ingestion, study, and AI model training.


AI capabilities are more democratized than previously thought

The democratized nature of AI developments and capabilities has surprised many industry participants and observers, expanding the pool of companies that may need access to AI-ready data centers beyond the largest, best-resourced technology companies. Startups are releasing models that are roughly on par, in performance and accuracy, with those that the largest technology companies have made publicly available. For example, Midjourney, the text-to-image generator, is now the largest Discord server in the world with over 10 million users and is one of the largest GPU users globally, according to its CEO. The company is reported to have fewer than 100 employees.

AI developments and benefits could still accrue to the largest players in the technology industry, but until that happens (if it does), this democratization adds another layer of uncertainty for data center operators and public cloud players around potential target customers and potential tenant bankability.

Data center industry planning for a world of increased demand volatility

While AI is already positioned to drive the next phase of data center industry growth, the challenges of data center capacity planning are likely to be more difficult than ever, starting with increased demand volatility.

For the past decade or so, public cloud providers, and by extension the entire data center industry, have had difficulty accurately planning for capacity needs in the ever-dynamic enterprise cloud ecosystem. Now, with AI, demand has the potential to be much more volatile, with applications seemingly able to take hold and capture millions of users almost overnight. The space remains in the very early stages of adoption, where growth rates and volatility are often the highest. Additionally, the consumer market may now be a major demand driver, and planning for consumer virality at global scale complicates capacity forecasting even further.

The AI market structure has a wide range of potential outcomes

How market power within the AI market evolves is another key variable. With the recent influx of AI capabilities that have very clear monetization potential, such as AI-driven productivity tools, personal assistants, and content generators, we are likely headed into the first cycle of consumer and commercial AI product investment, launch, competition, and consolidation. There are several key questions that could significantly affect the future structure of the data center industry:

  • Who will be the main players going forward, and will the players be the same names that are prominent in the public cloud space today?

EY view: We are currently in a period where technical capabilities are similar across market participants. This will likely lead to a competitive cycle where winners and losers emerge. In many cases, simple product/market fit and distribution will be key differentiators. Over time, it is unclear whether proprietary training data, modeling techniques, or performance advantages will emerge among market participants. And, if any of these dynamics materialize, how sustainable will they be?

  • Will market participants choose to build on the existing public clouds, will they choose to deploy on their own infrastructure, or will new cloud service providers emerge focused specifically on the AI market?

EY view: Existing public clouds provide significant benefits such as scalability, pre-packaged and integrated AI capabilities and tools, and global reach — but these come at a cost. These costs generally grow directly in line with product usage; in other words, every time a model is run, it requires compute resources that have direct costs. Further, these clouds have historically been designed to prioritize latency and redundancy. It is unclear the degree to which these elements are needed for AI. Without the benefits of operating leverage as demand grows, AI market participants will need to carefully consider the trade-offs of building on the public cloud.

  • How will user performance expectations evolve over time, and will users be willing to pay more for different levels of service?

EY view: Today, it is common for the most popular AI services to “run out of capacity,” at which point users are forced to wait hours, sometimes days, for results. For now, where services are free or low-cost, this may be acceptable, but it may not be for paid services in the future. Ultimately, the marriage of products and end markets will dictate the importance of real-time response rates for each AI-driven application. It is not a fait accompli that existing infrastructure is best positioned to serve, or even capable of handling, future needs.

  • How will AI capabilities be regulated and governed?

EY view: Issues around accuracy, bias, copyright infringement, citation, and many other topics are already being raised. In such situations, large, established companies generally have more to lose and less to gain than startups, which could provide an opportunity for competitive disruption. We expect a complicated debate around these issues to play out across the globe both in the courts and in the court of public opinion.

EY-Parthenon teams help data center industry participants reach their full potential by developing proactive and flexible strategies to address the rapidly evolving market landscape. We assist our clients from strategic development through execution and implementation.

Monitoring the relative rate of change for both demand and technological advancement

While the number of parameters and data sets going into training AI models will likely continue to grow, the ability to then “compress” the models, pruning them down to the parameters a specific workload needs for broader use (i.e., inference), also continues to advance. Some models have already been optimized to run on consumer hardware (e.g., laptops and even mobile devices). There may be further optimization at the chip level as more of the models can be “hardcoded” into the chip, similar to how ASICs were designed exclusively for bitcoin mining. Large technology players with silicon design capabilities may be at an advantage, as they can optimize their silicon to run their own models in their own data centers.
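The pruning idea described above can be illustrated with a minimal, hypothetical sketch (list-based for simplicity; real compression pipelines operate on weight tensors and typically fine-tune the model after pruning):

```python
def prune_weights(weights, sparsity):
    """Magnitude pruning: zero out the `sparsity` fraction of weights
    with the smallest absolute values, keeping the rest unchanged."""
    k = int(len(weights) * sparsity)          # number of weights to zero out
    order = sorted(range(len(weights)), key=lambda i: abs(weights[i]))
    keep = set(order[k:])                     # indices of surviving weights
    return [w if i in keep else 0.0 for i, w in enumerate(weights)]

# Half the weights are pruned; the largest-magnitude ones survive.
print(prune_weights([0.1, -2.0, 0.5, 3.0], sparsity=0.5))
# → [0.0, -2.0, 0.0, 3.0]
```

Zeroed weights can be skipped at inference time, which is one reason a pruned model can run on far less capable hardware than the one it was trained on.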

Given these evolving market dynamics, all participants within the broader data center ecosystem, including operators, investors, and service providers, should assess the potential impact of AI on the industry. To be successful in this environment of increased uncertainty, participants will need to continuously assess the rapidly evolving market landscape and build strategic frameworks that enable proactive decision-making. 

Four questions for the data center industry

Four questions that data center industry participants can ask as they prepare to address the growth in AI are:

  1. What are the unique data center requirements of AI workloads, and how do these impact downstream decision-making such as location selection, facility design, and overall capacity planning?
  2. What players will be driving AI-driven data center demand growth, and how will their data center capacity preferences evolve over time?
  3. Where can capacity be secured that can be positioned to serve the likely requirements of AI-related workloads and applications?
  4. What are the biggest industry unknowns, and what are our plans under different market evolutionary scenarios?



Summary

AI has moved squarely into the public consciousness in recent months. For the data center industry, the growth of AI will cause a surge in demand for computing resources. Industry leaders need to develop a strategy now for addressing that demand amid several possible scenarios.
