Cerebras Systems raised $5.55 billion in its initial public offering Wednesday, pricing shares at $185 each, well above the expected range, as the AI chipmaker capitalized on surging demand for alternatives to Nvidia, according to CNBC. At the IPO price, Cerebras is valued at $56.4 billion on a fully diluted basis, making it one of the largest U.S. IPOs ever, Kiplinger reported.
The offering comes as global electricity consumption by data centers is projected to roughly double to around 945 TWh by 2030, according to the International Energy Agency. A Lawrence Berkeley National Laboratory report finds that data centers consumed about 4.4% of total U.S. electricity in 2023 and are expected to consume approximately 6.7% to 12% by 2028, with total data center electricity usage climbing from 58 TWh in 2014 to 176 TWh in 2023 and estimated to reach between 325 and 580 TWh by 2028, the Department of Energy announced.
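To put the Berkeley Lab range in perspective, the implied annual growth rates can be worked out directly from the figures cited above; this is a back-of-envelope sketch, not part of the report itself:

```python
# Back-of-envelope check on the Berkeley Lab projections cited above.
# 2023 U.S. data center usage: 176 TWh; 2028 estimates: 325-580 TWh.
usage_2023 = 176.0                    # TWh
low_2028, high_2028 = 325.0, 580.0    # TWh

years = 2028 - 2023
cagr_low = (low_2028 / usage_2023) ** (1 / years) - 1
cagr_high = (high_2028 / usage_2023) ** (1 / years) - 1

print(f"Implied annual growth: {cagr_low:.0%} to {cagr_high:.0%}")
```

Even the low end of the range implies data center electricity use growing at roughly double digits per year for the rest of the decade.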
Can the Grid Keep Up With AI's Appetite?
The energy crunch is already reshaping corporate strategy. Cisco shares soared 17% in extended trading Wednesday after the networking company reported results and guidance that topped Wall Street's projections, even as it announced it is cutting fewer than 4,000 jobs this quarter, less than 5% of its workforce; revenue rose 12% to $15.84 billion in the quarter ended April 25, CNBC reported. Cisco has taken $5.3 billion in AI infrastructure orders from hyperscalers so far this fiscal year and raised its full-year order expectation to $9 billion from the previous $5 billion, according to Reuters.
Data centers are projected to account for 38% of net electricity consumption through 2037, driven by aggressive hyperscaler capital expenditure and the accelerating energy intensity of artificial intelligence workloads, the National Electrical Manufacturers Association said in a forecast published May 7. Data centers accounted for 17% of electricity demand growth worldwide last year, according to the IEA report, compared with around 50% in the U.S., Fortune reported.
The geographic concentration is stark. According to a Bloomberg News analysis cited by Consumer Reports, data centers accounted for almost 40% of Virginia's total electricity consumption in 2024. The same analysis found that areas with high concentrations of data centers saw electricity prices jump 267% over the past five years.
Are Efficiency Gains Real or Just Hype?
Researchers are racing to address the energy problem. Tufts University researchers have unveiled a far more efficient approach that could slash AI energy use by up to 100x while actually improving accuracy, according to ScienceDaily. By combining neural networks with human-like symbolic reasoning, their system helps robots reason more logically instead of relying on brute-force trial and error.
Energy efficiency in large language model inference has improved 100,000x over the past 10 years, demonstrating that accelerated computing is sustainable computing, NVIDIA stated. The company, which has a long history of driving performance and energy efficiency, says the number of tokens generated within the same power budget has increased by more than 1 million times from its Kepler GPU in 2012 to the Vera Rubin platform this year, it announced at CERAWeek.
But the efficiency gains may not be enough. AI-optimized servers are fueling the increase in data center power consumption, with their electricity usage set to rise nearly fivefold, from 93 TWh in 2025 to 432 TWh in 2030, Gartner reported. AI-optimized servers are projected to represent 21% of total data center power usage in 2025 and 44% by 2030, accounting for 64% of the incremental power demand for data centers over that period.
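The Gartner figures hang together arithmetically: treating the percentages as shares of total data center power recovers both the "nearly fivefold" growth and a number close to the 64% incremental-demand share (the small gap is rounding in the published figures). A quick sketch of that check, assuming the shares apply to total data center power:

```python
# Rough consistency check on the Gartner figures cited above.
# AI-optimized servers: 93 TWh in 2025 (21% of data center power)
# and 432 TWh in 2030 (44% of data center power).
ai_2025, ai_2030 = 93.0, 432.0        # TWh
share_2025, share_2030 = 0.21, 0.44   # AI share of total data center power

total_2025 = ai_2025 / share_2025     # implied total data center power, 2025
total_2030 = ai_2030 / share_2030     # implied total, 2030

growth_factor = ai_2030 / ai_2025     # the "nearly fivefold" increase
incremental_share = (ai_2030 - ai_2025) / (total_2030 - total_2025)

print(f"AI server growth: {growth_factor:.1f}x")
print(f"Share of incremental demand: {incremental_share:.0%}")
```

The computed share lands at roughly 63%, consistent with the reported 64% once rounding in the inputs is accounted for.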