The rapid growth of AI and cloud computing is driving a surge in data center power consumption, with individual campuses potentially using far more electricity than some U.S. cities and states.
CNBC reports that as AI and cloud computing continue to transform the global economy, the electricity consumption of the data centers housing the servers behind these technologies has exploded over the past decade. Data centers are now growing so large that, according to the companies developing them, finding enough power to run them and suitable land to site them is becoming increasingly difficult.
Data center developers warn that individual campuses could soon draw more than a gigawatt of power, roughly twice Pittsburgh's residential electricity consumption last year. To put this into perspective, a data center campus with a peak…