AI is ‘an energy hog,’ but DeepSeek could change that

DeepSeek startled everyone last month with the claim that its AI model uses roughly one-tenth the computing power of Meta's Llama 3.1 model, upending an entire worldview of how much energy and resources it'll take to develop artificial intelligence.
Taken at face value, that claim could have tremendous implications for the environmental impact of AI. Tech giants are rushing to build out massive AI data centers, with plans for some to use as much electricity as small cities. Generating that much electricity creates pollution, raising fears about how the physical infrastructure undergirding new generative AI tools could exacerbate climate change and worsen air quality.
Reducing how much energy it takes to train and run generative AI models could alleviate much of that stress. But it's still too early to gauge whether DeepSeek will be a game-changer when it comes to AI's environmental footprint. Much will depend on how other major players respond to the Chinese startup's breakthroughs, especially considering plans to build new data centers.
“There's a choice in the matter.”
“It just shows that AI doesn't have to be an energy hog,” says Madalsa Singh, …