Report: ChatGPT Consumes Electricity Equivalent To 17,000 Average US Households Daily

Recent reports shed light on the staggering amount of electricity consumed by AI systems, raising questions about sustainability and resource management.

The increasing prevalence of artificial intelligence (AI) technologies has led to a surge in their energy consumption, sparking concerns about their environmental impact.

According to findings published by The New Yorker, OpenAI's widely used chatbot, ChatGPT, is estimated to consume over half a million kilowatt-hours of electricity daily to handle approximately 200 million requests. For perspective, the average American household uses around 29 kilowatt-hours per day.

Comparing these figures shows that ChatGPT's daily energy usage is more than 17,000 times that of an average household.
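A quick back-of-the-envelope check confirms the cited ratio, taking "over half a million kilowatt-hours" as a round 500,000 kWh (an assumption for the sake of arithmetic):

```python
# Sanity-check the figures reported by The New Yorker.
# Assumption: "over half a million kWh" taken as exactly 500,000 kWh/day.
chatgpt_daily_kwh = 500_000          # estimated ChatGPT daily consumption
household_daily_kwh = 29             # average US household, per day
daily_requests = 200_000_000         # approximate daily requests

ratio = chatgpt_daily_kwh / household_daily_kwh
per_request_wh = chatgpt_daily_kwh * 1_000 / daily_requests  # kWh -> Wh

print(f"ChatGPT uses roughly {ratio:,.0f}x an average household's daily electricity")
print(f"That works out to about {per_request_wh:.1f} Wh per request")
```

The division yields roughly 17,241, matching the article's "more than 17,000 times", and implies on the order of 2.5 watt-hours per request.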

The implications become even more alarming when considering potential future scenarios. If large tech companies like Google were to integrate generative AI technology into every search, it could lead to an annual electricity consumption of approximately 29 billion kilowatt-hours. This surpasses the yearly energy consumption of entire countries like Kenya, Guatemala, and Croatia.

As reported by Business Insider, data scientist Alex de Vries highlighted the energy-intensive nature of AI, noting that a single AI server can consume as much power as several households combined. De Vries estimated that by 2027 the AI sector as a whole could consume between 85 and 134 terawatt-hours annually, equivalent to about half a per cent of global electricity consumption.
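De Vries's "half a per cent" framing can be cross-checked against his terawatt-hour range. The global total below is an assumption on my part (roughly 25,000 TWh per year, a commonly cited ballpark), not a figure from the article:

```python
# Rough cross-check of de Vries's 2027 projection.
# Assumption: global electricity consumption of ~25,000 TWh/year
# (a commonly cited ballpark, not a figure from the article).
global_twh = 25_000
low_twh, high_twh = 85, 134          # de Vries's projected AI-sector range

low_share = low_twh / global_twh
high_share = high_twh / global_twh

print(f"{low_twh} TWh is about {low_share:.2%} of global consumption")
print(f"{high_twh} TWh is about {high_share:.2%} of global consumption")
```

Under that assumption, the range spans roughly 0.3 to 0.5 per cent, consistent with the "half a per cent" characterisation at the upper end.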

Despite the significant impact on energy consumption, accurately quantifying the electricity usage of the AI industry remains challenging. Variability in AI model operations and a lack of transparency from major tech companies contribute to this difficulty. However, estimates based on data from Nvidia, a leading chipmaker in the AI market, suggest a substantial increase in energy consumption within the sector in the coming years.

Comparing these projections with the energy usage of other high-consumption companies reveals the magnitude of the issue. For instance, Samsung uses close to 23 terawatt-hours annually, while tech giants like Google and Microsoft use slightly more than 12 and 10 terawatt-hours, respectively, to power their operations.

As discussions around environmental sustainability gain momentum, the burgeoning energy consumption of AI technologies underscores the urgent need for innovation and regulation to mitigate their impact on global resources.