So, if you do the math, every day ChatGPT is using enough energy to keep that light bulb running for more than 9,500 years. Google estimates the average text question to its chatbot Gemini uses slightly less, about 0.24 watt-hours. Part of the challenge is that AI systems demand a lot more resources than earlier computing models.
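The narrator's "do the math" claim can be sketched as a back-of-envelope calculation. The specific inputs below are assumptions, not figures from this segment: roughly 0.34 watt-hours per ChatGPT query, about 2.5 billion prompts per day, and a 10 W LED bulb.

```python
# Back-of-envelope check of the "9,500+ years of light bulb" claim.
# All three inputs are assumed values for illustration only.
WH_PER_QUERY = 0.34        # watt-hours per ChatGPT query (assumed)
QUERIES_PER_DAY = 2.5e9    # daily prompts worldwide (assumed)
BULB_WATTS = 10            # power draw of an LED bulb (assumed)

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY      # total daily energy in Wh
bulb_hours = daily_wh / BULB_WATTS             # hours one bulb could run
bulb_years = bulb_hours / (24 * 365.25)        # convert hours to years

print(f"Daily energy: {daily_wh / 1e6:.0f} MWh")
print(f"Bulb runtime: {bulb_years:,.0f} years")
```

Under these assumptions the daily total comes to about 850 MWh, enough to run the bulb for roughly 9,700 years, which is consistent with the "more than 9,500 years" figure in the narration.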
Sasha Luccioni, Climate and AI Researcher: According to the studies that I ran on open-source models, it's 30 times more energy for a generative model compared to an old-school, traditional model for a task like web search. We're still using Google, we're still using Bing. The fact that we're switching out tasks that were traditionally done in a much more efficient way with generative AI, multiplied by the number of people who use these tools every day, that's what really worries me, because the interfaces are the same, but the backend is so much more energy and resource intensive, and we don't see that.
CNN Narrator: So, for consumers out there who are trying to be mindful about the impact of their AI usage, how do you go about that?
Sasha Luccioni: A good rule of thumb: if a model does a single task, it's going to use a lot less energy. For a lot of people, it's become ChatGPT for just about anything. And that's where environmental costs add up. We should be using multiple platforms, multiple tools. And I think that that's a healthy practice to have as a user.








