You’ve probably seen the headline by now: China’s DeepSeek built a top-notch, high-performing artificial intelligence chatbot at a fraction of the cost of its biggest competitors. Less money, less energy, more AI power.
Worth keeping in mind: Data centers consumed about 4.4% of all US electricity in 2023. By 2028, that share is expected to climb as high as 12%. So energy and tech companies alike have been racing to build the data centers and grid upgrades needed to keep the lights on while powering this so-called next wave of AI.
The staggering efficiency of DeepSeek’s model, though, has cast doubt on the widely accepted assumption that the coming AI revolution will require massive amounts of energy, putting strain on an already overloaded grid. What if…all that AI could come to life with less energy? That question about future AI energy demand tanked some of the sector’s biggest names. On Monday…
- Vistra shares dropped 30%.
- Talen stock took a 20% hit.
- Constellation Energy shares fell 20%.
- Shares of nuclear tech company Oklo dipped 22%.
- And SMR maker NuScale shares were down 25%.
But…what if there were a but? What if we still do need more energy to power AI?
Enter: Jevons Paradox, in which increased efficiency (like the kind DeepSeek brings to the table) leads to even greater total consumption. Analysts at JPMorgan suggested Jevons Paradox could come into play re: AI energy demand. If AI becomes cheaper and easier to operate, more people could fold the tech into their everyday lives, and that broader adoption could push overall energy use up, not down.
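To see how the math of Jevons Paradox works, here's a quick back-of-envelope sketch. Every number in it is a made-up assumption for illustration, not a figure from DeepSeek, JPMorgan, or anyone else:

```python
# Hypothetical illustration of Jevons Paradox applied to AI energy use.
# All numbers are invented assumptions, purely for arithmetic illustration.

energy_per_query_old = 10.0   # energy units per AI query before efficiency gains (assumed)
energy_per_query_new = 1.0    # a model that's 10x more efficient (assumed)

queries_old = 1_000_000       # baseline daily query volume (assumed)
queries_new = 20_000_000      # 20x more queries as cheap AI spreads everywhere (assumed)

total_old = energy_per_query_old * queries_old   # 10,000,000 units
total_new = energy_per_query_new * queries_new   # 20,000,000 units

# Despite the 10x efficiency gain, total consumption doubles:
print(total_new / total_old)  # → 2.0
```

The point: if adoption grows faster than efficiency improves, total energy demand still rises, which is the scenario in which those data-center investments wouldn't be premature at all.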
What do you think? Are recent—and massive—investments into data centers and energy infrastructure to power the AI economy (looking at you, $500 billion Stargate project) premature? Or will we need those resources as AI becomes more popular? How will this seismic shift in AI development costs impact the energy sector?
Tell me what you think in the comments.