
by Reinder Radersma (NWO-I Digital Competence Center)

Green coding – the practice of optimising code for minimal energy consumption – is a direct way for software engineers to reduce the carbon footprint of their work. This article reflects on current trends in energy-efficient computing and gives six recommendations for software engineers to implement in their daily practice.

The Dutch Research Council operates nine national research institutes in the Netherlands. To support their researchers with open and reproducible science, a Digital Competence Center (NWO-I DCC) has been set up, hosted by CWI. NWO-I DCC commits to software being open and available to others, but this comes with the obligation to also address any negative consequences, such as a larger carbon footprint. NWO-I DCC therefore promotes green coding to minimise the environmental impact of research. This article is a summary of insights gathered during a mini-symposium on green coding organised in July 2022.

Our current society is unimaginable without computation, and the need for computing power will only increase. Over the next two decades, computation capacity is predicted to grow a million-fold, and energy consumption is consequently expected to double every three years. Given the current energy and climate crisis, efforts to reduce the energy usage, and therefore the carbon footprint, of computing are imperative.

The Dutch Research Council (NWO) promotes green coding to minimise the environmental impact of research.

High performance computing
Energy accounts for a substantial part of the operating costs of High Performance Computing (HPC) facilities, so financial incentives, in addition to environmental considerations, have promoted energy efficiency. For instance, processors have become more energy efficient, delivering more GFLOPs (a measure of computer performance) per watt. However, this efficiency gain has not kept pace with the growing demand for computation power. Other initiatives to reduce the carbon footprint of HPC facilities include using waste heat to heat buildings.

Not only hardware but also software can lower the impact. Energy management software is used to tune HPC clusters for lower energy usage: reducing the clock speed of Central Processing Units (CPUs) costs only a few percent in speed but yields a comparable reduction in total energy usage. By changing the default settings to more energy-efficient values and giving users access to energy management software such as EAR, the carbon footprint can be reduced further [L1].
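The intuition behind clock-speed tuning can be sketched with the textbook dynamic-power model, where power scales with capacitance × voltage² × frequency, and a lower frequency typically permits a lower supply voltage. All numbers below are illustrative, not measurements of any real processor:

```python
# Simplified dynamic-power model behind frequency scaling (DVFS).
# Dynamic power ~ C * V^2 * f; lowering the frequency often allows a
# lower voltage too, and the V^2 term is where the energy saving comes
# from. All constants here are hypothetical, for illustration only.

def dynamic_power(capacitance, voltage, frequency):
    """Approximate dynamic power draw in watts."""
    return capacitance * voltage**2 * frequency

def energy_for_job(ops, capacitance, voltage, frequency):
    """Energy in joules to complete a fixed number of operations."""
    runtime = ops / frequency  # seconds; slower clock -> longer runtime
    return dynamic_power(capacitance, voltage, frequency) * runtime

OPS = 1e12   # operations in the job (hypothetical)
C = 1e-9     # effective switched capacitance (hypothetical)

# Full speed: 3.0 GHz at 1.2 V vs. ~7% slower: 2.8 GHz at 1.1 V
e_fast = energy_for_job(OPS, C, 1.2, 3.0e9)
e_slow = energy_for_job(OPS, C, 1.1, 2.8e9)

print(f"full speed: {e_fast:.0f} J, reduced: {e_slow:.0f} J")
print(f"energy saved: {100 * (1 - e_slow / e_fast):.0f}%")
```

In this toy model the job takes a few percent longer at the lower clock, yet total energy drops by roughly 16%, because the reduced voltage dominates.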

Local computing
For local computing (such as PCs, laptops and local servers) similar trends are visible. PC, laptop and server processors benefit from the same developments as HPC clusters. Particularly for laptops, efficient processors have been developed to reduce battery weight while extending time off the energy grid. Measuring energy efficiency is trickier, though. It can be done with wattmeters, but some CPUs can also measure their own energy usage (albeit ignoring consumption by memory, etc.), and processor-specific estimates are available as an alternative.
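On Linux, the self-measurement that some CPUs offer is exposed through the RAPL counters under `/sys/class/powercap`. The sketch below assumes an Intel-style path and sufficient read permissions; availability varies by platform, and the counter only covers the CPU package, not memory or peripherals:

```python
# Read a CPU's cumulative energy counter via Linux RAPL (powercap).
# The path below is the usual location for CPU package 0 on Intel-style
# systems; treat this as a platform-dependent sketch, not a portable API.
import time

RAPL_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj(path=RAPL_FILE):
    """Return the cumulative energy counter in microjoules, or None."""
    try:
        with open(path) as f:
            return int(f.read().strip())
    except (OSError, ValueError):
        return None  # no RAPL support, or insufficient permissions

start = read_energy_uj()
if start is not None:
    sum(i * i for i in range(1_000_000))  # some work to measure
    end = read_energy_uj()
    print(f"consumed roughly {(end - start) / 1e6:.3f} J")
else:
    print("RAPL counters not readable on this system")
```

Note that the counter wraps around periodically, so production tooling (CodeCarbon among others) samples it repeatedly rather than taking a single difference.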

Recommendations for software engineers
When developing and running software, there are many choices that will affect the energy usage of your software. Here are six recommendations for lowering the carbon footprint:

1. Choose a green language
Some programming languages are more energy efficient than others; it depends on the number of operations that underlie each command. Compiled languages (e.g., C, C++, Fortran, Ada) are typically more energy efficient than interpreted languages (e.g., Python, Perl, Ruby) [1], but compiled languages are not always practical.
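Even when a compiled language is impractical, the same principle applies within an interpreted one: push the heavy lifting into compiled code. A minimal sketch, using wall-clock time on a fixed machine as a crude proxy for energy:

```python
# Python's built-in sum() runs its loop in C, while the hand-written
# loop executes interpreted bytecode per iteration. Time is used here
# as a rough stand-in for energy on the same machine.
import time

N = 2_000_000

def py_sum(n):
    """Pure-Python accumulation loop."""
    total = 0
    for i in range(n):
        total += i
    return total

t0 = time.perf_counter()
slow = py_sum(N)
t1 = time.perf_counter()
fast = sum(range(N))  # same result, loop runs in compiled code
t2 = time.perf_counter()

assert slow == fast
print(f"pure-Python loop: {t1 - t0:.3f} s, built-in sum: {t2 - t1:.3f} s")
```

The same reasoning favours vectorised libraries such as NumPy over hand-written Python loops for numerical work.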

2. Monitor usage
To reduce the energy usage of code, a first step is to monitor the energy usage of the system and compare it between different versions of the code. For PCs and laptops, wattage can be measured directly with a wattmeter placed between the computer and the socket. One drawback is that this captures the consumption of all processes on the system, including processes unrelated to the code under scrutiny; for dedicated servers the method makes more sense. Tools have also been developed to measure the energy consumption of specific PCI cards, such as Graphics Processing Unit (GPU) boards [2]. Alternatively, software libraries can perform this task: for Python code, the CodeCarbon library estimates energy consumption based on output from the processors themselves, or on estimates when the processors do not monitor energy usage [L2].
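Comparing versions of code needs only a small harness. The sketch below uses CPU time as a stand-in for energy; in practice the measurement call could be replaced by an energy estimator such as CodeCarbon. The two routines `v1` and `v2` are hypothetical examples, not from the article:

```python
# Minimal harness for comparing two versions of a routine. CPU time is
# used as a proxy; swap in an energy measurement tool for real figures.
import time

def measure(fn, *args, repeats=5):
    """Return the best CPU time over several runs, in seconds."""
    best = float("inf")
    for _ in range(repeats):
        t0 = time.process_time()
        fn(*args)
        best = min(best, time.process_time() - t0)
    return best

def v1(n):  # original version: string concatenation in a loop
    s = ""
    for i in range(n):
        s += str(i)
    return s

def v2(n):  # candidate version: join a generator
    return "".join(str(i) for i in range(n))

assert v1(1000) == v2(1000)  # verify identical output before comparing cost
print(f"v1: {measure(v1, 100_000):.3f} s, v2: {measure(v2, 100_000):.3f} s")
```

Taking the best of several runs reduces noise from other processes, which matters on a shared machine for the same reason it complicates wattmeter readings.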

3. Use GPUs
GPUs use more energy than CPUs, but deliver more computational power and can therefore be more energy efficient per computation. To maximise efficiency, it is important to properly parallelise the tasks on a GPU. A library such as Kernel Tuner [L3] can take care of this job.

4. Recalculate rather than retrieve from storage
For simple calculations, recalculation can be more energy efficient than retrieving a previously stored value from local storage, RAM, or even cache [3]. The complexity of a calculation and the location of storage are important determinants of whether recalculation or retrieving from storage would be most efficient. Since storage location will differ per system and even depend on other tasks running simultaneously on the same machine, it is difficult to distill any general rules. When energy usage is monitored, it can pay off to compare recalculation versus retrieving from storage.
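Since no general rule exists, the trade-off has to be measured on the target system. A minimal sketch comparing the two options for a deliberately cheap calculation (the function `cheap` is hypothetical):

```python
# Compare recomputing a cheap value with retrieving it from a cache.
# For trivial calculations, the cache lookup itself (hashing the
# arguments, dict access) can cost as much as recomputing, so which
# side wins is machine- and workload-dependent: measure, don't assume.
import time
from functools import lru_cache

def cheap(x):
    return x * x + 1

cached_cheap = lru_cache(maxsize=None)(cheap)
cached_cheap(7)  # warm the cache so later calls are pure lookups

N = 500_000
t0 = time.perf_counter()
for _ in range(N):
    cheap(7)          # recalculate every time
t1 = time.perf_counter()
for _ in range(N):
    cached_cheap(7)   # retrieve from cache
t2 = time.perf_counter()

print(f"recompute: {t1 - t0:.3f} s, cached lookup: {t2 - t1:.3f} s")
```

For an expensive function the cache wins comfortably; the point is that the crossover depends on the cost of the calculation versus the cost of the retrieval path.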

5. Choose a green cluster
Some computing clusters are much more efficient than others. Even within the top ten of greenest HPC clusters there is a twofold difference in energy efficiency [L4]. Some countries or HPC facilities use energy from carbon neutral sources. Check the Green500 [L4] or use, for instance, the CodeCarbon library [L2] to find a green computing cluster.
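The effect of cluster choice can be estimated with simple arithmetic: the footprint of a job is its energy use multiplied by the carbon intensity of the grid feeding the cluster. The intensities below are illustrative round numbers, not current figures for any real grid:

```python
# Carbon footprint of a job = energy used x carbon intensity of the
# grid powering the cluster. All values are illustrative only.
JOB_KWH = 120.0  # hypothetical energy use of one HPC job, in kWh

grid_intensity = {  # grams CO2-equivalent per kWh (illustrative)
    "mostly fossil": 650,
    "mixed": 300,
    "mostly renewable/nuclear": 30,
}

for grid, g_per_kwh in grid_intensity.items():
    kg = JOB_KWH * g_per_kwh / 1000  # convert grams to kilograms
    print(f"{grid:>25}: {kg:6.1f} kg CO2e")
```

Under these assumptions the identical job differs by more than a factor of twenty in emissions, which is why the choice of cluster can matter more than most code-level optimisations.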

6. Optimise where it counts
Optimising software for energy efficiency can be very time consuming, given the large number of factors that affect energy usage. Your time pays off most when spent on computationally intensive projects and on software with potentially high usage.


[1] R. Pereira et al., “Energy Efficiency across Programming Languages: How Do Energy, Time, and Memory Relate?”, in Proc. of the 10th ACM SIGPLAN SLE, pp. 256–267, 2017.
[2] J.W. Romein and B. Veenboer, “PowerSensor 2: a fast power measurement tool”, in IEEE ISPASS, pp. 111–113, 2018.
[3] M. Pi Puig, L.C. DeGiusti and M. Naiouf, “Are GPUs non-green computing devices?”, JCS&T, vol. 18, pp. 153–159, 2018.

Please contact:
Reinder Radersma, NWO-I Digital Competence Center, CWI, The Netherlands
+31 20 5924049
