A new Chinese AI model called DeepSeek has gained attention for its supposed energy-efficiency improvements, but recent figures tell a different story. The model's chain-of-thought reasoning approach, which aims to produce sounder answers at the cost of additional computation, appears to be far more energy-intensive than initially thought.
According to tests, each response from DeepSeek requires around 41% more energy than an equivalent response from Meta's closest competing model, Llama 3.1. So while DeepSeek may excel in certain areas, widespread adoption could drive a significant increase in energy consumption.
Experts warn that the excitement around DeepSeek could prompt companies to rush chain-of-thought models into use even where they are not needed. AI researcher Sasha Luccioni notes that this could negate any efficiency gains and send energy usage skyrocketing.
As the industry's focus shifts from extractive AI to generative AI, concerns about energy consumption are growing. With access to OpenAI's o3 reasoning model set to expand, it remains to be seen whether the benefits of chain-of-thought models justify their energy costs.
Source: https://www.technologyreview.com/2025/01/31/1110776/deepseek-might-not-be-such-good-news-for-energy-after-all