There’s a keen eye, and a keen ear, trained on the impact of these technologies on this great Earth we live upon. Climate change has been on the world’s agenda for some time now.
We know we’ve been consuming more resources than the planet has to offer. We’re polluting it, warming it up, and consuming everything on it. By 2030, IT is predicted to account for 21% of all the energy consumed in the world, and 80% of the world’s electricity is still fossil-based. So we must ask: how much is your digital transformation costing future generations, and what do we do about it?
This isn’t necessarily about the steps we take at home or being a good green corporate citizen. This is about how we architect our systems, leverage the network, shuffle around the data that fuels all these digital outcomes, and get those digital jobs done, all while ensuring that we’re doing the best by dear Mother Earth. It’s been said for a very long time that change is the only constant, and it’s a very familiar concept in IT.
Digital Transformation is about ongoing, relentless change; radically changing how we build, deploy, embrace, and consume applications.
The IT landscape is rapidly changing, with organizations racing to the cloud, embracing SaaS, using APIs to create new business models, and innovating at a velocity deemed impossible only a few short years ago. This change continues at a breathless pace today, with every organization rapidly innovating in this digital space. Consumers are becoming more aware of the social impact of this change: 85% of consumers say they will shift their purchasing behavior to prioritize sustainability over the next five years, and more than a third of those consumers have stated that they will pay more for sustainable products. So as expectations around sustainability climb, companies will face significant pressure to prove their sustainability credentials and to make them part of their key value proposition.
Because of all this change in the IT landscape, application design is evolving to support it. Traditional monolithic applications are being broken down, with more decoupling and more distribution of where those application components are hosted, run, and leveraged. New engineering practices are emerging, designed to increase the pace of digital innovation and take advantage of all this change around us. APIs and microservices are leading the charge in enabling so much of this.
Climate change, unfortunately, also appears to be a constant. With everyone racing to the cloud, the pressure is now on for cloud providers to do better regarding sustainability and impact. The top 10 largest cloud providers accounted for 70% of all IT spending on cloud infrastructure. According to Gartner, cloud sustainability initiatives will begin with these cloud providers, who operate some of the world’s largest data centers and are critical to reducing the climate effects of all this growth in IT. While essentially all cloud providers already have sustainability initiatives in place, their progress in meeting carbon reduction goals and their strategies for achieving net zero carbon emissions vary quite widely.
We’re very rapidly creating and leveraging more APIs. I’m not going to claim that APIs prevent climate change; no single thing does. What I am going to claim is that they have a key role to play, and we need to be making our technical and architectural decisions with sustainability in mind.
Cisco estimates that global web traffic in 2021 exceeded 2.8 zettabytes. Now consider that 83% of this traffic flows through APIs. So APIs have a big role in helping us address these climate change and sustainability initiatives. Building better APIs isn’t just good for the planet and our conscience; it’s good for business too. The more we architect to reduce energy consumption, the more we reduce our costs and our impact.
Designing and Planning APIs
APIs must be well designed in the first place: useful to the business outcome and intentionally designed for discovery and reuse. We should also ensure that every API does its job efficiently.
For example, if an API sends 100 fields of data when consumers typically use only ten of them, we’re wasting resources: 90 unused and unhelpful fields travel across the wire every single time that API is called. So perhaps we should design two versions of the API, a simple customer API and a detailed customer API. We should also consider more modern protocols like GraphQL, gRPC, or other means of ensuring that only the relevant data moves across the wire, consuming energy resources, to be delivered into these digital outcomes.
Deployment of APIs
Deployment targets for these APIs also matter. Cloud vendors have significant R&D budgets to make their energy consumption as low as possible. However, the annual energy usage of Amazon, Google, Microsoft, Facebook, and Apple combined is more or less the same as the energy consumption of New Zealand as a whole. It’s not as simple as moving to the cloud and being done; just how renewable are their energy sources? The more cloud vendors see this as a factor in our evaluation and purchasing decisions, the more we will compel them to prioritize sustainability and efficiency.
Technologies for APIs
We must consider technologies like API gateways and service mesh to optimize and reduce network traffic. We should also think about how we leverage orchestration and containerized environments to optimize resource consumption: we can scale up when the traffic is there and we need to serve at a higher scale, and we can wind those services back down dynamically when the traffic subsides.
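The scale-up/wind-down behavior can be sketched as a proportional replica calculation, similar in spirit to the formula Kubernetes’ Horizontal Pod Autoscaler applies (the threshold values and bounds here are illustrative assumptions):

```python
import math

def desired_replicas(current: int, load_per_replica: float,
                     target_load: float = 0.6,
                     min_replicas: int = 1, max_replicas: int = 10) -> int:
    """Proportional autoscaling rule:
    replicas = ceil(current * observed_load / target_load),
    clamped to [min_replicas, max_replicas] so we neither burn
    energy on idle capacity nor fall over under a traffic spike."""
    if current <= 0:
        return min_replicas
    desired = math.ceil(current * load_per_replica / target_load)
    return max(min_replicas, min(max_replicas, desired))

# Traffic spike: replicas grow to absorb the load.
print(desired_replicas(current=2, load_per_replica=0.9))  # -> 3
# Traffic subsides: idle replicas wind down, saving energy.
print(desired_replicas(current=4, load_per_replica=0.1))  # -> 1
```

The point is that the decision is continuous and automatic: capacity tracks demand instead of being provisioned for the worst case and left running.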
Manage and Observe
Policies and plugins such as rate limiting and caching exist to control how many requests a client makes in any given period. Observability gives us control over what’s running and how it’s running, along with the insights needed to optimize and right-size our API architecture, hopefully dynamically. Should we redeploy some APIs? Are APIs sitting idle and unused, and should we decommission them? If there is a performance bottleneck, investigate the cause and, if appropriate, consider refactoring the API implementation to be more efficient. With these green initiatives in mind, API management is one of the necessary tools for reducing the carbon footprint of your API estate. Its sole purpose is to give you visibility and control over those APIs and how they’re consumed, putting you in a much better position to make these choices, architecturally, implementation-wise, and in where you deploy your services, with sustainability in mind.
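As a sketch of the rate-limiting policy mentioned above (not any particular gateway’s implementation; the class and parameters are illustrative), a token bucket caps how many requests a client can make in a given period while still allowing short bursts:

```python
import time

class TokenBucket:
    """Allow bursts up to `capacity` requests, refilled at `rate`
    tokens per second. Requests beyond the budget are rejected at
    the edge instead of consuming backend compute."""

    def __init__(self, capacity: int, rate: float):
        self.capacity = capacity
        self.rate = rate
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        # Refill tokens for the time elapsed since the last check.
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=3, rate=1.0)  # 3-request burst, 1 req/s sustained
print([bucket.allow() for _ in range(5)])   # first 3 pass, the rest throttled
```

A gateway plugin doing this sits in front of every API, which is exactly the visibility-and-control point API management gives you.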
There are areas across the full API lifecycle that, when carefully considered, can result in overall API success (which is what you want for your business), benefits in terms of sustainability, and, ultimately, judicious and careful use of finite planetary resources. Beyond the APIs we build ourselves, more companies are offering APIs that can help you make your products and processes greener. Through the API economy, you can embed these APIs broadly in your workflows and in the services and products you produce.
One such example is Patch, which offers an API-first carbon removal service: an API that enables businesses to programmatically calculate their emissions and immediately offset them by investing in a range of vetted projects. Shopify, as another example, is an e-commerce platform that offers carbon offsets: when you purchase a product through Shopify, you can opt in to offset your carbon emissions, and Shopify will invest that money in initiatives that help sustain our planet.
It’s important to note that we need to do more than just compensate for our own emissions, though it’s a better starting point than nothing. Offsetting emissions means countering in one place the damage we do elsewhere, resulting in a net zero ecosystem in which we don’t make things worse overall. It’s not the same as reducing our emissions in the first place and making things better.
Then there are the APIs you build to power your business, hopefully in a sustainable fashion. How you design them, where you deploy them, and how you manage them matters to the planet we live on. We have all the green building blocks ready for us to innovate on top of; the missing ingredient is us.
Whether you’re motivated by your conscience or by the fact that more efficiency means higher profits and a better outcome for your business, the planet asks two things of you: acknowledge and accept that there’s a problem, not just that global warming exists but that IT is a large and growing part of the problem; then change and take action. We need to think about how we build and use technologies with that same awareness. The next time you build or version an API, build it following green engineering principles. The next time you break down a monolithic application into microservices, remove unnecessary network hops. Bring this up in design sessions, discussions, and architecture review boards; educate your colleagues; and do more research yourselves.
It’s time to make this a default behavior. It’s time to accept the problem, to brainstorm, and to challenge each other: as technologists, perhaps there are better ways to build applications and connect systems than we previously considered.