The world needs more electricity—but don’t blame AI, Microsoft president Brad Smith says


Hello and welcome to Eye on AI. In this edition: Microsoft President Brad Smith on AI’s power demands; the U.S. imposes new export controls on chipmaking technology; a new milestone for distributed AI training; and are LLMs hitting a wall? It’s complicated.

I grew up watching reruns of the original Star Trek series. As anyone who’s ever seen that show will know, there was a moment in seemingly every episode when the starship Enterprise would get into some kind of jam and need to outrun a pursuing alien spacecraft, or escape the powerful gravitational field of some hostile planet, or elude the tractor beam of some wily foe.

In every case, Captain Kirk would get on the intercom to the engine room and tell Scotty, the chief engineer, “we need more power.” And, despite Scotty’s protestations that the engine plant was already producing as much power as it possibly could without melting down, and despite occasional malfunctions that would add dramatic tension, Scotty and the engineering team would inevitably pull through and deliver the power the Enterprise needed to escape danger.

AI, it seems, is entering its “Scotty, we need more power” phase. In an interview the Financial Times published this week, two senior OpenAI executives said the company planned to build its own data centers in the U.S. Midwest and Southwest. This comes amid reports of tensions with its largest investor and strategic partner, Microsoft, over access to enough computing resources to keep training its advanced AI models. It also comes amid reports that OpenAI wants to construct data centers so large that each would draw five gigawatts of power, more than the electricity demand of the entire city of Miami. (It’s unclear whether the data centers the two OpenAI execs mentioned to the FT would be of such extraordinary size.)

Microsoft has plans for large new data centers of its own, as do Google, Amazon’s AWS, Meta, Elon Musk’s xAI, and others. All this data center construction has raised serious concerns about where the electricity needed to power these massive AI supercomputing clusters will come from. In many cases, the answer seems to be nuclear power, including a new generation of largely unproven small nuclear reactors that might each be dedicated to powering a single data center, which carries risks of its own.

On the sidelines of Web Summit in Lisbon a few weeks ago, I sat down with Microsoft President Brad Smith. Our conversation covered a variety of topics, but much of it focused on AI’s seemingly insatiable demand for energy. As I mentioned in Eye on AI last Tuesday, some people at Web Summit, including Sarah Myers West of the AI Now Institute—who was among the panelists on a main stage session I moderated debating “Is the AI bubble about to burst?”—argued that the energy demands of today’s large language model-based AI systems are far too great. We’d all be better off as a planet, she argued, if AI were a bubble and it burst, and we moved on to a more energy-efficient kind of AI technology.


