Sustainable AI infrastructure build-out
Addressing the energy, resource and infrastructure dimensions of AI's ramp-up will be essential. New GenAI workloads are expected to spur a tripling of hyperscale data center capacity over the next six years.8
Data centers and data transmission networks account for 2%-3% of global electricity consumption and about 1% of global GHG emissions.9 Despite exponential growth in workloads, these figures have grown only modestly, thanks to greening grids and the shift to hyperscale cloud providers, which have invested in renewables and achieved high levels of efficiency. Yet to stay on track for net zero, emissions must be halved by 2030, not grow.10
New, more efficient semiconductor architectures and cooling methods will be important contributors to bending the energy and emissions curve of AI. Innovations in the lab promise to sharply reduce the energy required to keep chips within their operating temperatures without sacrificing performance. Prototypes of neuromorphic chip architectures, which emulate the neurons and synapses of the human brain, are reported to have yielded a thousand-fold reduction in energy consumption.11 Data center operators are deploying a variety of strategies to reduce cooling energy, from locating in colder regions or using waste heat to warm residential districts, to using cooling liquids other than water. A few companies are even exploring putting data centers in space.
Decarbonizing GenAI will also depend on data efficiency. The bigger the large language model (LLM), the more energy used in training it. Training an LLM with 110m parameters emitted 0.64 tonnes of CO2, about 8% of the annual energy-related emissions of one US home. In contrast, another LLM, with 75b parameters, had a training footprint of 550 tonnes, equivalent to the emissions of 70 US homes in a year.12
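The home-equivalence figures above imply a per-home emissions factor that can be checked with a quick calculation. This is a back-of-the-envelope sketch using only the article's own numbers; the derived factor is not an official statistic.

```python
# Back-of-the-envelope check of the home-equivalence figures cited above.
# The tonnes-per-home factor is derived from the article's own numbers
# (550 t CO2 ~ 70 US homes per year), not an official statistic.

LARGE_MODEL_TRAINING_T = 550    # tonnes CO2 to train the 75b-parameter LLM
HOMES_EQUIVALENT = 70           # US homes' annual emissions, as cited

tonnes_per_home = LARGE_MODEL_TRAINING_T / HOMES_EQUIVALENT  # ~7.9 t/home/year

SMALL_MODEL_TRAINING_T = 0.64   # tonnes CO2 to train the 110m-parameter LLM
share_of_one_home = SMALL_MODEL_TRAINING_T / tonnes_per_home

print(f"Implied factor: {tonnes_per_home:.1f} t CO2 per US home per year")
print(f"Small-model training = {share_of_one_home:.0%} of one home's annual emissions")
```

Consistency of the two cited figures under one factor is what makes the comparison meaningful; the small model lands at roughly 8% of a single home's annual emissions.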
Still, 60%–90% of emissions are generated by inference: running the model on live data (e.g., responding to a GenAI prompt). In response, researchers are creating smaller models and optimizing the trade-offs between training speed and energy consumption.
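The 60%–90% inference share implies that lifetime emissions are a multiple of the training footprint. The sketch below is purely illustrative: it reuses the article's 550-tonne training example and treats the inference share as a free parameter; no model-specific lifetime data is assumed.

```python
# Illustrative lifetime-emissions estimate: if inference accounts for a
# given share of total emissions, training accounts for the remainder.
# The 550 t training figure is from the article; the shares are the
# article's cited 60%-90% range plus a midpoint.

TRAINING_T = 550.0  # tonnes CO2 for training (article's large-LLM example)

def lifetime_emissions(training_t: float, inference_share: float) -> float:
    """Total lifetime tonnes CO2 if inference is `inference_share` of the
    total, so training is (1 - inference_share) of the total."""
    return training_t / (1.0 - inference_share)

for share in (0.60, 0.75, 0.90):
    total = lifetime_emissions(TRAINING_T, share)
    print(f"inference share {share:.0%}: lifetime ~{total:,.0f} t CO2 "
          f"({total - TRAINING_T:,.0f} t from inference)")
```

At the top of the cited range, inference multiplies the training footprint tenfold, which is why smaller, inference-efficient models matter so much.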
Water, biodiversity, embodied carbon and other environmental challenges will become increasingly important concerns. The largest data centers can reach 90,000 m² in size and consume nearly two million liters of water daily for cooling.