The surging demand for computing power has driven a boom in data center construction across the United States. But this rapid expansion brings challenges of its own, particularly around power usage and water consumption.
The spike in power usage from these data centers raises the risk of cascading outages for homes and businesses on the same grid. Last summer, utility providers in Virginia had to respond when a cluster of facilities abruptly switched to backup generators as a safety precaution; the sudden drop in demand left excess power on the grid and put its infrastructure at risk.
The availability of power has also led operators to set up data centers in areas with significant water constraints. According to researchers at Lawrence Berkeley National Laboratory (LBNL), hyperscale and colocation sites in the US consumed 55 billion liters of water directly in 2023. The indirect water consumption tied to their energy use is far higher, at roughly 800 billion liters annually, equivalent to the water usage of almost 2 million US homes.
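The scale comparison above can be sanity-checked with quick arithmetic. The per-home figure below is derived from the article's numbers, not stated in the LBNL report itself:

```python
# Figures cited above: ~800 billion liters of indirect water use per year,
# described as equivalent to almost 2 million US homes.
indirect_water_liters = 800e9   # annual indirect water tied to energy use
homes_equivalent = 2e6          # "almost 2 million US homes"

liters_per_home_per_year = indirect_water_liters / homes_equivalent
liters_per_home_per_day = liters_per_home_per_year / 365

print(f"{liters_per_home_per_year:,.0f} liters per home per year")  # 400,000
print(f"{liters_per_home_per_day:,.0f} liters per home per day")    # ~1,096
```

Roughly 1,100 liters per household per day is in line with typical US household water use, so the comparison holds up.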
Major tech companies including Microsoft, Meta, Google, and Amazon have all disclosed that some share of their water usage comes from areas with water stress. This has raised concerns in drought-prone states like Arizona, Texas, and Georgia, where residents have complained about water shortages, rising municipal water costs, and damage to water wells near data center developments.
In response, some experts advocate more sustainable ways of training AI models. Sasha Luccioni, AI and climate lead at Hugging Face, points to techniques such as distillation and smaller models as ways to build powerful AI systems at a fraction of the computational cost. She argues that the relentless pursuit of greater computing power, without questioning its environmental impact, is misguided.
One of the key challenges facing data centers is the heat generated by increasing chip density. About 40% of the energy used by an AI data center goes to cooling chips and equipment, according to the consultancy McKinsey. As chips become more powerful, more advanced cooling is required to prevent malfunctions, driving significant investment in cooling innovations such as cold water pipes and cooling towers that dissipate heat more efficiently.
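To put the 40% figure in perspective, a back-of-the-envelope calculation shows what it implies for overall efficiency. This is a simplified sketch that assumes all non-cooling energy goes to IT equipment, which understates real-world overhead:

```python
# Simplified efficiency sketch based on the ~40% cooling share cited above.
# Assumption (not from the article): everything that is not cooling is IT load.
cooling_share = 0.40
it_share = 1.0 - cooling_share

# Power Usage Effectiveness (PUE) = total facility energy / IT energy.
# An ideal facility has a PUE of 1.0; here cooling alone pushes it to ~1.67.
implied_pue = 1.0 / it_share

print(f"Implied PUE: {implied_pue:.2f}")  # 1.67
```

Operators of efficient hyperscale facilities report PUE values closer to 1.1, which is why reducing the cooling share, the focus of the innovations described below, matters so much.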
Microsoft and other operators have adopted closed-loop cooling systems that use chillers to cool recirculated water, an approach that wastes far less water than evaporative cooling. These advances in cooling technology are crucial to keeping data centers reliable and sustainable in the face of rising power demands and environmental concerns.