Soaring demand for generative AI (genAI) tools is fueling a sharp rise in the use of power-hungry GPUs and TPUs in data centers, with some facilities expanding from tens of thousands of units to more than 100,000 per server farm.
As cloud computing and genAI take center stage, new data centers are growing dramatically in size. It is now common to see new facilities built with capacities of 100 to 1,000 megawatts, enough to power roughly 80,000 to 800,000 homes, according to the Electric Power Research Institute (EPRI).
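EPRI's equivalence implies an average per-household power draw; a quick back-of-the-envelope check on the figures above (a sketch using the article's estimates, not independently verified data):

```python
# Sanity check of EPRI's data-center-to-households equivalence cited above.
# Inputs are the article's estimates, not measurements.

FACILITY_MW = 100        # low end of the new-facility capacity range
HOMES_POWERED = 80_000   # EPRI's equivalent household count for 100 MW

# Convert megawatts to kilowatts, then divide by household count.
avg_home_kw = FACILITY_MW * 1000 / HOMES_POWERED
print(f"Implied average household demand: {avg_home_kw:.2f} kW")  # 1.25 kW
```

The implied 1.25 kW average is consistent with typical US residential consumption of roughly 10,000 kWh per year, which lends the equivalence some plausibility.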
Energy consumption linked to AI is projected to grow by roughly 45% annually over the next three years. OpenAI's ChatGPT chatbot alone is estimated to consume about 227 million kilowatt-hours of electricity per year while handling 78 billion user queries.
For context, the energy ChatGPT consumes in a year could power as many as 21,602 US households, according to research by BestBrokers. "Although this is just a fraction (0.02%) of the roughly 131 million US households, it is still a substantial figure, especially considering that the US ranks third globally in household count," BestBrokers noted in its latest report.
GenAI models typically have far higher energy demands than the data retrieval, streaming, and communications applications that have driven data center growth over the past two decades, according to EPRI.
At an estimated 2.9 watt-hours per ChatGPT request, AI queries are expected to consume about ten times the electricity of traditional Google searches, which use roughly 0.3 watt-hours each. Moreover, emerging computation-intensive workloads such as image processing and video generation have no historical precedent, EPRI notes.
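The per-request figure follows directly from the annual totals cited earlier; a short check of the arithmetic (all inputs are the article's estimates):

```python
# Verify that the per-query energy figure is consistent with the
# annual totals cited in the article (estimates, not measurements).

CHATGPT_ANNUAL_KWH = 227_000_000         # ~227 million kWh per year
CHATGPT_ANNUAL_QUERIES = 78_000_000_000  # ~78 billion queries per year
GOOGLE_WH_PER_SEARCH = 0.3               # ~0.3 Wh per traditional search

# Convert annual kWh to Wh, then divide by the annual query count.
wh_per_query = CHATGPT_ANNUAL_KWH * 1000 / CHATGPT_ANNUAL_QUERIES
print(f"ChatGPT: ~{wh_per_query:.1f} Wh per query")   # ~2.9 Wh
print(f"vs. Google search: ~{wh_per_query / GOOGLE_WH_PER_SEARCH:.0f}x")  # ~10x
```

The annual totals yield about 2.9 Wh per query, matching the cited figure, and roughly ten times the 0.3 Wh of a conventional search.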
2024-10-01 03:15:02
Article from www.computerworld.com