Reducing CO2 Emissions in AI – moving towards Green AI and a sustainable future

October 31, 2024

Artificial Intelligence (AI) is now an essential part of modern technology, driving everything from voice assistants to advanced data processing systems. While AI offers numerous benefits, it also carries an environmental burden in the form of carbon dioxide (CO2) emissions. Training and running AI models require substantial energy, particularly in data centers where vast amounts of computation are performed. As AI usage continues to grow, reducing its carbon footprint has become increasingly urgent.

While existing tools monitor real-time emissions during machine learning (ML) processes, they only provide insights after the model has already been trained. The goal of the FAME partner Jožef Stefan Institute (JSI) is to develop a system that predicts emissions before model training begins. By knowing the expected energy consumption and emissions upfront, developers can make environmentally conscious choices from the start, before any model is built, and thereby reduce the environmental impact of their ML models.
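
For context, the "measure after the fact" workflow of existing monitoring tools typically looks like the sketch below. CodeCarbon is named here purely as an illustrative example of such a tool; the article does not specify which monitoring tools the JSI system builds on.

```python
# Sketch of the after-the-fact measurement workflow that existing tools follow.
# CodeCarbon is used only as an illustrative example of this class of tools.
from codecarbon import EmissionsTracker

def train_model():
    # Placeholder workload standing in for an actual training loop.
    return sum(i * i for i in range(10_000_000))

tracker = EmissionsTracker(project_name="demo-training-run")
tracker.start()
train_model()
emissions_kg = tracker.stop()  # emissions are only known once the run is over
print(f"Measured emissions: {emissions_kg:.4f} kg CO2eq")
```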

The research involves generating synthetic datasets and using specialized tools to monitor emissions during both the training and testing stages of various ML models. To make the predictive tool more user-friendly, a web application has been developed that allows users to select different ML models and receive immediate estimates of their CO2 emissions. These estimates are based on factors such as dataset size, hardware specifications, and geographic location, since energy sources differ by region. For example, running a model in a country powered largely by renewable energy produces fewer emissions than running it in one that relies on fossil fuels.
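
The article does not detail the estimation formula behind the web application. A minimal sketch of how such an estimate could combine dataset size, hardware power draw, and regional grid carbon intensity is shown below; every name and number in it is an illustrative assumption, not the model used by the JSI tool.

```python
# Illustrative sketch: predicted energy use multiplied by the carbon intensity
# of the local grid. All names, coefficients, and values are assumptions for
# illustration, not the actual FAME/JSI estimation model.

# Rough, non-authoritative ballpark grid carbon intensities (g CO2 per kWh).
CARBON_INTENSITY_G_PER_KWH = {
    "renewables-heavy": 50,
    "eu-average": 250,
    "coal-heavy": 800,
}

def estimate_training_energy_kwh(num_samples: int, epochs: int,
                                 gpu_power_watts: float,
                                 samples_per_second: float) -> float:
    """Predicted runtime multiplied by hardware power draw gives energy in kWh."""
    runtime_hours = (num_samples * epochs / samples_per_second) / 3600
    return gpu_power_watts / 1000 * runtime_hours

def estimate_co2_grams(energy_kwh: float, region: str) -> float:
    """Convert estimated energy into CO2 using the region's grid carbon intensity."""
    return energy_kwh * CARBON_INTENSITY_G_PER_KWH[region]

if __name__ == "__main__":
    energy = estimate_training_energy_kwh(num_samples=1_000_000, epochs=10,
                                          gpu_power_watts=300,
                                          samples_per_second=500)
    for region in CARBON_INTENSITY_G_PER_KWH:
        print(f"{region}: ~{energy:.1f} kWh, ~{estimate_co2_grams(energy, region):.0f} g CO2")
```

Even this simplified calculation makes the regional effect concrete: the same training run produces an order of magnitude more CO2 on a coal-heavy grid than on one dominated by renewable energy.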

The approach goes beyond measuring energy usage and estimating CO2 emissions: it empowers developers to actively reduce them. By selecting more energy-efficient algorithms, optimizing hardware, and tweaking model settings, developers can significantly reduce the energy demands of AI projects. This could transform industries that rely heavily on AI by helping them balance high performance with sustainability.

This work extends beyond merely lowering energy consumption. It fits into a broader movement toward Green AI, which focuses on creating more sustainable AI practices without compromising performance. By using predictive tools, we could see AI development become more environmentally friendly, paving the way for a more sustainable future in technology. As the industry continues to prioritize performance, it’s time to place equal importance on sustainability.

The work is related to FAME WP5: Trusted and Energy Efficient Analytics. For more information, read the conference paper and watch the presentation below:

Hrib, I., Topal, O., Šturm, J., Škrjanc, M. Measuring and Modeling CO2 Emissions in Machine Learning Processes. Conference Information Society 2024, 7–11 October 2024, Ljubljana, Slovenia. DOI: https://doi.org/10.70314/is.2024.sikdd.23.

Video presentation. 

By Tanja Zdolšek Draksler, Oleksandra Topal, Ivo Hrib, Jan Šturm, Maja Škrjanc

More info:
IJS