Inside the Dirty, Dystopian World of AI Data Centers

Photographs by Landon Speers

As we drove through southwest Memphis, KeShaun Pearson told me to keep my window down—our destination was best tasted, not viewed. Along the way, we passed an abandoned coal plant to our right, then an active power plant to our left, equipped with enormous natural-gas turbines. Pearson, who directs the nonprofit Memphis Community Against Pollution, was bringing me to his hometown’s latest industrial megaproject.

Already, the air smelled of soot, gasoline, and asphalt. Then I felt a tickle sliding up my nostrils and down into my throat, like I was getting a cold. As we approached, I heard the rumble of cranes and trucks, and then from behind a patch of trees emerged a forest of electrical towers. Finally, I saw it—a white-walled hangar, bigger than a dozen football fields, where Elon Musk intends to build a god.

This is Colossus: a data center that Musk’s artificial-intelligence company, xAI, is using as a training ground for Grok, one of the world’s most advanced generative-AI models. Training these models takes a staggering amount of energy; if run at full strength for a year, Colossus would use as much electricity as 200,000 American homes. When fully operational, Musk has written on X, this facility and two other xAI data centers nearby will require nearly two gigawatts of power. Annually, those facilities could consume roughly twice as much electricity as the city of Seattle.

To get Colossus up and running fast, xAI built its own power plant, setting up as many as 35 natural-gas turbines—railcar-size engines that can be major sources of smog—according to imagery obtained by the Southern Environmental Law Center. Pearson coughed as we drove by the facility. The scratch in my throat worsened, and I rolled up my window.

xAI’s rivals are all building similarly large data centers to develop their most powerful generative-AI models; a metropolis’s worth of electricity will surge through facilities that occupy a few city blocks.
These companies have primarily made their chatbots “smarter” not by writing niftier code but by making them bigger: ramming more data through more powerful computer chips that use more electricity. OpenAI has announced plans for facilities requiring more than 30 gigawatts of power in total—more than the largest recorded demand for all of New England. Since ChatGPT’s launch, in November 2022, the capital expenditures of Amazon, Microsoft, Meta, and Google have exceeded $600 billion, and much of that spending has gone…
THE ATLANTIC – Technology | Internet & Technology, Fri, March 13, 2026