+7
_journals/2024-07-07_2237.md
···
···+[Openvibe](https://openvibe.social/), a "Town square for open social media" that connects Nostr, Mastodon & Bluesky, with Threads coming soon. Shows you one timeline and lets you cross-post. Available on iOS and Android.
+27
_notes/How to Build an AI Data Center.md
···
···+> _This piece is the first in a new series called_ [Compute in America: Building the Next Generation of AI Infrastructure at Home](https://ifp.org/compute-in-america-building-the-next-generation-of-ai-infrastructure-at-home/)_. In this series, we examine the challenges of accelerating the American AI data center buildout. Future pieces will be shared_ [here](https://ifp.org/compute-in-america-building-the-next-generation-of-ai-infrastructure-at-home/).
+> We can divide the likely impact of AI on data centers into two separate questions: the impact on individual data centers and the regions where they’re built, and the impact of data centers overall on aggregate power consumption.
+> <mark>For individual data centers, AI will likely continue driving them to be larger and more power-intensive</mark>. As we noted earlier, training and running AI models requires an enormous amount of computation, and the specialized computers designed for AI consume enormous amounts of power. <mark>While a rack in a typical data center will consume on the order of [5 to 10 kilowatts of power](https://www.datacenterfrontier.com/design/article/55020771/data-center-world-experts-drill-down-for-ai-facility-design-and-construction-case-study), a rack in an Nvidia SuperPOD data center containing 32 H100s (special graphics processing units, or GPUs, designed for AI workloads that Nvidia is selling by the millions) can consume more than 40 kilowatts.</mark> And while Nvidia’s new GB200 NVL72 can train and run AI models more efficiently, it consumes much more power in an absolute sense, using an astonishing 120 kilowatts per rack. [Future AI-specific chips](https://www.youtube.com/watch?v=1WYJaFDTo4o) may have even higher power consumption. Even if future chips are more computationally efficient (and they likely will be), they will still consume much larger amounts of power.
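
A rough back-of-envelope on those per-rack figures. This is only a sketch: the rack numbers come from the quoted passage, but the ~700 W per H100 SXM figure and the idea that everything else in the rack is "overhead" are my assumptions, not claims from the article.

```python
# Back-of-envelope rack power math for the figures quoted above.
# Rack totals are from the quoted article; the per-GPU TDP is an assumption.

TYPICAL_RACK_KW = (5, 10)      # conventional data center rack (per the article)
H100_RACK_KW = 40              # SuperPOD-style rack with 32 H100s (per the article)
GB200_NVL72_RACK_KW = 120      # Nvidia GB200 NVL72 rack (per the article)

H100_TDP_KW = 0.7              # assumed ~700 W per H100 SXM (not from the article)

h100_gpus_only = 32 * H100_TDP_KW              # ~22.4 kW for the GPUs alone
h100_overhead = H100_RACK_KW - h100_gpus_only  # CPUs, switches, NICs, fans, etc.

print(f"32 H100s alone:         {h100_gpus_only:.1f} kW")
print(f"Rest of the rack:       {h100_overhead:.1f} kW")
print(f"H100 rack vs typical:   {H100_RACK_KW / TYPICAL_RACK_KW[1]:.0f}-"
      f"{H100_RACK_KW / TYPICAL_RACK_KW[0]:.0f}x")
print(f"GB200 NVL72 vs typical: {GB200_NVL72_RACK_KW / TYPICAL_RACK_KW[1]:.0f}-"
      f"{GB200_NVL72_RACK_KW / TYPICAL_RACK_KW[0]:.0f}x")
```

Under these assumptions the GPUs alone account for roughly half the SuperPOD rack's draw, and an AI rack lands somewhere between 4x and 24x a conventional rack's 5 to 10 kilowatts.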