Aman Dhamija is a software engineer at Goldman Sachs who is passionate about green software development and a member of the Green Software Foundation. In this article, she discusses ways to achieve sustainability in software development.
Green Software Foundation
The Green Software Foundation is a non-profit formed under the Linux Foundation to build a trusted ecosystem of people, standards, tooling, and best practices. The foundation’s vision is to change the culture of how we build software so that sustainability becomes a core priority, just as important as performance, cost, or accessibility.
With net emissions increasing in the IT industry each year, we must put software at the center of our sustainability discussions.
There are two ways of looking at software: software as part of the climate problem and software as part of the climate solution. We want software to be part of the solution, so we focus on reducing the carbon emissions from software. There are only three ways of doing this: using fewer physical resources, using less energy, and using energy more intelligently.
Green software is carbon-efficient, meaning it emits the least amount of carbon possible.
Building green software
It’s important to understand the principles of green software and then put them into practice. The principles are energy efficiency, hardware efficiency, and carbon awareness.
One hour of streaming produces the same carbon emissions as charging seven smartphones. We want to extract the maximum value from every gram of carbon emitted into the atmosphere, so our goal should always be to be carbon efficient.
Energy is the ability to do work; it exists in many forms, and one form can be converted into another. In computing, we can think of energy as a measure of the electricity being used. Not all electricity is clean: most is produced by burning fossil fuels, which makes energy supply one of the biggest contributors to carbon emissions. So, if we want to be carbon efficient, we must be energy efficient.
One way of improving energy efficiency is to understand the concept of energy proportionality.
The more we utilize a device, the more efficient it becomes at converting electricity into practical computing operations. For example, suppose a device consumes 100 watts at 0% utilization, 180 watts at 50% utilization, and only 200 watts at 100% utilization. The relationship between the electricity a device consumes and the rate at which it does useful work is therefore not linear.
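The figures above can be turned into a quick efficiency check. This sketch uses the hypothetical wattages from the example and treats utilization as a proxy for useful work:

```python
# Hypothetical power draw at different utilization levels (from the
# example above): power is not proportional to utilization.
power_at_util = {0: 100, 50: 180, 100: 200}  # percent utilization -> watts

def efficiency(util_pct: int) -> float:
    """Useful work per watt, treating utilization % as a proxy for work."""
    return util_pct / power_at_util[util_pct]

print(f"50% utilization:  {efficiency(50):.2f} work units per watt")
print(f"100% utilization: {efficiency(100):.2f} work units per watt")
```

Doubling utilization from 50% to 100% nearly doubles efficiency (0.28 to 0.50 work units per watt), which is why consolidating load onto fewer, busier machines saves energy.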
So, to be energy efficient, we must ensure that we extract the maximum utilization possible from devices that are consuming electricity.
Depending on your location or the time during the day, electricity is produced from various sources with varying carbon emissions. Some sources, such as wind or solar, are renewable, whereas others, such as fossil fuels, can be quite polluting.
Carbon intensity measures the amount of carbon emissions in grams per kilowatt-hour of electricity consumed. To build carbon-efficient applications, we must use more of our electricity when the carbon intensity is lower.
Carbon intensity varies by location. This is primarily because certain regions have an energy mix with a greater proportion of clean energy. It also varies by time due to the variable nature of renewables.
So, compute more when the carbon intensity is lower. This is called carbon-aware computing.
This can be achieved by demand shifting, which is possible when you have flexibility in running your workloads. One way of achieving this is spatial shifting, in which we move our compute to a location where the carbon intensity is lower. Another is temporal shifting, in which we run our workload at the time of day when the carbon intensity is lowest. Doing this reduces our carbon emissions and helps accelerate the energy transition to lower-carbon sources in the future.
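A minimal sketch of temporal shifting: given an hourly carbon-intensity forecast (the values below are invented for illustration), schedule a flexible batch job at the cleanest hour.

```python
# Hypothetical carbon-intensity forecast: hour of day -> gCO2-eq/kWh.
forecast = {0: 420, 6: 380, 12: 210, 18: 350}

# Temporal shifting: run the flexible workload at the lowest-intensity hour.
best_hour = min(forecast, key=forecast.get)
print(f"Schedule the batch job at hour {best_hour} "
      f"({forecast[best_hour]} gCO2-eq/kWh)")
```

Spatial shifting works the same way, except the dictionary keys would be candidate regions rather than hours.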
Every device we use caused carbon pollution during its creation and will cause more at its disposal. This is called embodied carbon. It is a hidden cost we must account for when considering the carbon emissions of software. For most end-user devices, the embodied carbon emissions are much greater than the lifetime emissions from their electricity consumption.
Extending the lifespan of your devices or using cloud computing and increasing utilization are ways of improving hardware efficiency.
Measuring carbon emissions from software
The Green Software Foundation developed the software carbon intensity (SCI) specification, designed to score your application along the dimension of sustainability and encourage action towards eliminating carbon emissions.
It asks you to bucket the emissions from your software application into two categories. The first is operational emissions, which arise from running your application. The second is embodied emissions, which come from the hardware used to run the software.
SCI = ((E*I)+M) per R
E is the energy consumed by software in kWh
I is the carbon emitted per kWh of energy
M is the carbon emitted through the hardware that the software is running on
R is the functional unit, such as per user, per device, or per API call
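As a worked example, here is the SCI formula applied to made-up numbers; every value below is an assumption for illustration, not a measurement:

```python
# Toy SCI calculation: SCI = ((E * I) + M) per R. All numbers invented.
E = 120.0   # energy consumed by the software, kWh
I = 400.0   # carbon intensity of the electricity, gCO2-eq/kWh
M = 8000.0  # embodied hardware emissions amortized to this workload, gCO2-eq
R = 10_000  # functional unit: number of user requests served

sci = ((E * I) + M) / R
print(f"SCI = {sci:.2f} gCO2-eq per request")
```

Lowering any of E, I, or M, or serving more functional units R with the same emissions, improves the score.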
Applying principles to practice
From a carbon emissions perspective, it’s much better to cache static data, such as images or audio, than to transfer it over the network repeatedly. The shorter the distance data travels across the network, the less electricity is required to move it, and therefore the lower the carbon emissions.
Similarly, the embodied carbon cost also goes down, because your data travels across fewer servers.
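The caching idea can be sketched in a few lines: fetch an asset over the network once, then serve repeat requests from a local cache. Here `fake_fetch` is a stand-in for a real network call:

```python
# Minimal in-memory cache for static assets: the network transfer
# happens at most once per URL.
cache: dict[str, bytes] = {}

def get_asset(url: str, fetch) -> bytes:
    if url not in cache:
        cache[url] = fetch(url)  # only cache misses hit the network
    return cache[url]

calls = []
def fake_fetch(url: str) -> bytes:  # stand-in for a real HTTP GET
    calls.append(url)
    return b"image-bytes"

get_asset("https://example.com/logo.png", fake_fetch)
get_asset("https://example.com/logo.png", fake_fetch)
print(len(calls))  # prints 1: the asset crossed the network only once
```

In practice the same effect comes from browser caches and CDN edge caches, configured with HTTP `Cache-Control` headers rather than application code.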
When you want to deploy your application on the cloud, you should choose a region closest to where most of your users are. This is because the network traffic is reduced, and thereby, your carbon emissions are also reduced.
Let’s look at a case study in which UBS and Microsoft partnered to implement carbon-aware computing. UBS has a core risk platform on which it runs Azure batch workloads. They wanted to see whether shifting these workloads to a different time slot during the day could bring their carbon emissions down. They used the Green Software Foundation’s Carbon Aware SDK, an API and command-line tool that reports historical and forecast carbon intensity data for compute you run on Azure infrastructure, given a location and a time period. With it, they found an optimal time slot in which the forecasted carbon intensity was lower than their current figure. Thus, by measuring and shifting, UBS was able to reduce its carbon emissions.