Hugging Face engineer Julien Delavande has developed a tool to measure the electricity consumption associated with interacting with AI models. AI models run on energy-intensive GPUs and specialized chips, and their computational demands translate into significant power draw. While it is difficult to precisely determine the power usage of these models, experts anticipate that the increasing adoption of AI technologies will drive electricity demand higher over the next few years.
The rising energy needs of AI have prompted some companies to adopt strategies that are not environmentally friendly. Delavande’s tool aims to raise awareness of AI’s energy usage and encourage users to consider its environmental impact. Delavande and the tool’s other developers have stated that even small energy reductions, achieved through choices such as which model to use and how long its output is, can add up to a significant environmental difference when scaled across numerous queries.
Delavande’s tool integrates with Chat UI, an open-source platform compatible with AI models like Meta’s Llama 3.3 70B and Google’s Gemma 3. It provides real-time estimates of energy consumption for messages sent to and from an AI model, expressed in watt-hours or joules. The tool also compares the energy usage of models to that of common household appliances like microwaves and LEDs.
For example, the tool estimates that generating a typical email with Llama 3.3 70B consumes approximately 0.1841 watt-hours, akin to operating a microwave for 0.12 seconds or a toaster for 0.02 seconds. Although these estimates are not claimed to be exact, they serve to remind users that AI interactions come with energy costs.
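The appliance comparisons above boil down to simple arithmetic: convert the watt-hour estimate to joules (1 Wh = 3,600 J), then divide by an appliance's power rating to get an equivalent runtime. The sketch below illustrates that conversion; the wattages are illustrative assumptions, not the figures the tool actually uses, so the resulting runtimes will differ from the article's examples.

```python
# Sketch of the appliance-equivalence arithmetic behind comparisons like
# "X watt-hours ≈ running a microwave for Y seconds".
# The wattages below are illustrative assumptions, not the tool's own values.

ASSUMED_WATTAGE = {
    "microwave": 1100.0,  # watts (assumed typical rating)
    "toaster": 900.0,     # watts (assumed typical rating)
    "LED bulb": 10.0,     # watts (assumed typical rating)
}


def wh_to_joules(watt_hours: float) -> float:
    """1 watt-hour = 3600 joules."""
    return watt_hours * 3600.0


def equivalent_runtime_seconds(watt_hours: float, appliance: str) -> float:
    """Seconds the appliance could run on the same energy (t = E / P)."""
    return wh_to_joules(watt_hours) / ASSUMED_WATTAGE[appliance]


if __name__ == "__main__":
    energy_wh = 0.1841  # the article's example estimate for one email
    for name in ASSUMED_WATTAGE:
        secs = equivalent_runtime_seconds(energy_wh, name)
        print(f"{energy_wh} Wh ≈ running a {name} for {secs:.2f} s")
```

With these assumed ratings, 0.1841 Wh works out to roughly two-thirds of a second of microwave time; the tool's published comparisons depend on whatever wattages it assumes internally.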
Delavande and his colleagues emphasize the importance of transparency in the open-source community regarding AI’s energy footprint. They envision a future where energy usage is displayed as prominently as nutrition labels on food, pointing to projects like the AI Energy Score as steps toward that goal.