Dev Dojo
04.10.2025

A Gentle Introduction to Green Coding: Writing Software That Breathes Better

Green coding isn’t an ethical luxury — it’s the natural evolution of good engineering: efficient software is sustainable software. From language and algorithm choices to data handling and workload timing, every technical decision carries an energy footprint. Writing code that “breathes better” isn’t just about speed or elegance — it’s about building a digital world that wastes less and lasts longer.

Written by:
Francesco Di Gennaro

Senior Backend Developer

Writing code that consumes (and pollutes) less

Given that you're here, I assume you've already read several articles about green computing and its environmental benefits.

If you haven't, be aware that the entire Information and Communication Technology (ICT) sector consumes a significant amount of electrical energy and generates substantial greenhouse gases. According to a 2024 study, the sector accounted for 4% of global electricity consumption and generated 1.4% of worldwide greenhouse gas emissions in 2020 alone. Meanwhile, a 2018 study projects that by 2040 the sector's share of global greenhouse gas emissions could rise to 14%.

Green computing is at once a movement, a field of study, and a practice aimed at mitigating those numbers and achieving, at the very least, carbon neutrality in the ICT sector before it's too late.

Much like ogres, the ICT sector comprises many layers, each of which requires its own strategies to become greener (still like ogres, I suppose). Green coding is green computing at the code layer: in other words, how to write code that consumes less energy and produces fewer greenhouse gases, which is what this article is about.

General Green Languages

Now, suppose you're starting a new work project or the umpteenth personal one. You might have the chance to choose which language or languages to implement it in, and if you do, you can already practise green coding by selecting a more energy-efficient one.

First in 2017 and then again, more in-depth, in 2020, a team of computer scientists published studies comparing the energy expenditure, execution time, and peak and total memory allocation of 27 of the (at the time) most popular and widely used programming languages on 10 computationally intensive problems, yielding both obvious and enlightening results.

Among other things, both studies showed a clear but not definitive correlation between energy expenditure and execution time: almost all languages ranked in similar positions on both scoreboards. In other words, languages with low execution times also have low energy expenditure, which shouldn't be surprising, since energy is power multiplied by time, so lowering the time will likely lower the overall energy expenditure. The correlation isn't definitive because the factor besides time, namely power, may not be constant across measurements, which makes it finicky.
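
In formula form, with made-up but plausible numbers:

    E = P \times t
    60\,\mathrm{W} \times 10\,\mathrm{s} = 600\,\mathrm{J} \qquad \text{vs.} \qquad 60\,\mathrm{W} \times 4\,\mathrm{s} = 240\,\mathrm{J}

Halve the runtime at the same average power draw and you halve the joules; the catch is that the average power draw rarely stays exactly the same.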

Regarding memory usage, however, there is practically no correlation between energy consumption and peak memory usage. In contrast, there is a correlation between the former and total memory usage, that is, the cumulative amount of memory allocated by the running program. I suspect that this is because memory operations themselves are energy-intensive; however, I have yet to find evidence corroborating this suspicion.

If you're curious and don't want to read the studies yourself, the three most energy-efficient and fastest languages in both studies were C, C++, and Rust, all three compiled languages that lack garbage collection.

Java, a virtual-machine-powered language with a garbage collector, scored fifth in both energy expenditure and execution time in both studies. The score is surprisingly high, in my opinion, which is comforting, considering there are billions of devices running this language daily, as Oracle continuously reminds us.

(Apparently, they no longer do so; I wouldn't know, as I usually install other vendors' JDKs, and from a Linux terminal.)

Meanwhile, JavaScript and TypeScript scored pretty low, ranking 17th and 19th, respectively, in energy expenditure, and 16th and 23rd, respectively, in execution time; all four rankings are consistent in both studies. Sorry, front-end colleagues.

Still, note that, at the time I'm writing this, five years have passed since the most recent of the two studies. Although I don't expect interpreted languages like JavaScript to surpass something like C, the results may differ if this experiment were to be repeated today.

Big Green Notation

Also note that these studies primarily aimed to find a statistical correlation between execution time and energy expenditure, rather than to establish a causal relationship between a chosen language and the expected energy expenditure of any software implemented with it.

In other words, choosing the "best" language isn't a silver bullet, the be-all and end-all solution that makes your code green: first, because there isn't really one; second, because different environments and problems often require different languages; and third, because a single computational problem, e.g., sorting a set of data, can be solved by dozens of different algorithms with wildly different execution times and, thus, energy expenditures.

Luckily, people smarter than me have long since come up with a way to label and classify algorithms based on how much more time (and space, as in allocated memory, but that's not as important) they require as the input grows: whether increasing the input tenfold will increase the execution time at all, and if so, whether it will increase it tenfold too, less so, or more so.

I'm talking about the Big-O notation.

Big O

Now, the Big-O notation is more a measure of how an algorithm's execution time (and space, I keep forgetting that) scales with arbitrarily large inputs, namely in its worst-case scenario, than of how fast it generally is with a plausible input for a given domain. For that, only common sense, experience, and benchmarking can help.

Still, this notation provides software developers with a sufficient indicator to decide which solution for a given problem to choose, and you aren't even required to know how to calculate it; you only need to know how to read it.

Here's an essential primer, in order from best to worst scaling (a small code sketch follows the list):

  1. O(1) - Constant execution time, regardless of changes in the input's size. Array access by index is constant.
  2. O(log n) - Logarithmic execution time, which means that an exponential increase in the input size causes only a linear increase in the execution time. Binary search is logarithmic.
  3. O(n) - Linear execution time, which means that a linear increase in the input size causes a linear increase in the execution time as well. Linear search is, as its name suggests, a linear algorithm.
  4. O(n log n) - Quasilinear execution time, which scales worse than O(n), but it's the best computer science has come up with for comparison-based sorting algorithms. Merge sort is quasilinear, and so is quicksort on average.
  5. O(n^x) - Polynomial execution time, usually due to nested loops like those in simpler sorting algorithms. Usually called quadratic if x equals two.
  6. O(x^n) - Exponential execution time, which means that a linear increase in input size causes an exponential increase in execution time. Brute-forcing a password is usually exponential.
  7. O(n!) - Factorial execution time, which means that a linear increase of input size causes a factorial increase in execution time. Solving the travelling salesman problem through brute force is factorial, and it is also ill-advised.
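
To see two of those classes side by side, here's a small, self-contained Java sketch contrasting a linear O(n) scan with a logarithmic O(log n) binary search over the same sorted array; the array contents and the target value are arbitrary illustration data.

    import java.util.Arrays;

    public class SearchComplexity {

        // O(n): in the worst case, every element is inspected.
        static boolean linearSearch(int[] sorted, int target) {
            for (int value : sorted) {
                if (value == target) return true;
            }
            return false;
        }

        // O(log n): each comparison halves the remaining range.
        static boolean binarySearch(int[] sorted, int target) {
            int lo = 0, hi = sorted.length - 1;
            while (lo <= hi) {
                int mid = (lo + hi) >>> 1;   // unsigned shift avoids int overflow
                if (sorted[mid] == target) return true;
                if (sorted[mid] < target) lo = mid + 1;
                else hi = mid - 1;
            }
            return false;
        }

        public static void main(String[] args) {
            int[] sorted = new int[1_000_000];
            Arrays.setAll(sorted, i -> i * 2);   // 0, 2, 4, ... every even number

            // Same answer, wildly different amount of work:
            System.out.println(linearSearch(sorted, 1_999_999));   // scans all 1,000,000 slots -> false
            System.out.println(binarySearch(sorted, 1_999_999));   // at most ~20 comparisons -> false
        }
    }

On a million elements, the linear scan may inspect every single slot, while the binary search needs at most around twenty comparisons to reach the same answer.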

You may be wondering when you'll ever have to implement, or even choose, a sorting algorithm instead of just calling the sort function of whatever language you're using, and you'd be right to wonder. Indeed, the choice of which algorithm to use rarely falls to the average software developer in their day-to-day job; it's mostly up to those who maintain the libraries and the languages themselves, who usually (hopefully) choose the best algorithm anyway.

Instead, the Big-O-related choice you'll most likely and most often face is which data structure to use for a given problem, because the Big-O notation also describes the execution times of common operations on organised data, like insertion, deletion, and lookup. Choosing the right data structure for the right job can not only prevent future, hard-to-spot bugs but also reduce the overall execution time of the system as a whole and, thus, its energy expenditure, for greener code.

For instance, you may want to choose a data structure with slower insertion and deletion but lightning-fast lookup when your code seldom performs the first two operations and very often the last.
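
To put some (made-up) numbers behind that intuition, here's a self-contained Java sketch comparing membership checks on an ArrayList, whose contains() scans the list in O(n), against a HashSet, whose contains() is O(1) on average; the collection size and the number of lookups are arbitrary.

    import java.util.ArrayList;
    import java.util.HashSet;
    import java.util.List;
    import java.util.Set;
    import java.util.function.IntPredicate;

    public class LookupHeavyWorkload {

        public static void main(String[] args) {
            int size = 1_000_000;

            // ArrayList: cheap insertion at the end, but contains() scans the list -> O(n) lookup.
            List<Integer> list = new ArrayList<>(size);
            // HashSet: insertion pays for hashing, but contains() is O(1) on average.
            Set<Integer> set = new HashSet<>();
            for (int i = 0; i < size; i++) {
                list.add(i);
                set.add(i);
            }

            // A lookup-heavy phase: a thousand membership checks against values near the end.
            System.out.println("list lookups: " + timeLookups(i -> list.contains(i), size) + " ms");
            System.out.println("set lookups:  " + timeLookups(i -> set.contains(i), size) + " ms");
        }

        // Runs 1,000 membership checks and returns the elapsed wall-clock time in milliseconds.
        static long timeLookups(IntPredicate lookup, int size) {
            long start = System.nanoTime();
            for (int i = 0; i < 1_000; i++) {
                lookup.test(size - 1 - i);   // values near the end: close to the worst case for the list
            }
            return (System.nanoTime() - start) / 1_000_000;
        }
    }

All those needless scans are CPU cycles, and CPU cycles are joules.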

Green Practices

But what's next? Once you have chosen a language suitable for your project and the best-suited data structures for your logic, you get to write the code that ties it all together, and you can do so in many ways, some better than others: from implementing something that "just works" but is an unmaintainable tangle of spaghetti code, to something that not only works but is also sound, maintainable, and not prone to self-combustion.

Thankfully, other people, probably different from the ones who developed the Big-O notation but smarter than me nonetheless, have developed and catalogued practices for consistently producing the latter kind of code instead of the former, like using standard naming conventions and avoiding null references at all costs: practices for writing the best code, aptly dubbed best practices.
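
By way of illustration, here's a tiny Java sketch of the "avoid null references" practice using Optional; the findUser method and its little in-memory map are made-up stand-ins for whatever lookup your code actually performs.

    import java.util.Map;
    import java.util.Optional;

    public class UserLookup {

        // Hypothetical in-memory "repository", purely for the example.
        private static final Map<String, String> USERS = Map.of("ada", "Ada");

        // Returning Optional makes the "no result" case explicit in the type,
        // instead of handing the caller a null that may blow up much later.
        static Optional<String> findUser(String id) {
            return Optional.ofNullable(USERS.get(id));
        }

        public static void main(String[] args) {
            String greeting = findUser("nobody")
                    .map(name -> "Hello, " + name)
                    .orElse("Hello, stranger");   // the absent case is handled right here
            System.out.println(greeting);         // prints "Hello, stranger"
        }
    }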

Wouldn't you agree that 'how green a given piece of code is' is just another metric alongside soundness, maintainability, and resistance to self-combustion, and thus deserving of dedicated best practices?

If your answer is yes, you're not alone, for (at least) three separate repositories of best practices about green coding, and green computing at large, exist. All three are, for some reason, natively in French, and only two have official English translations. For a sneak peek, these best practices include limiting HTTP calls, storing static data locally, and compressing data before transferring it. You should skim through them if you're curious.
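
As a flavour of what such a practice looks like in code, here's a minimal Java sketch of "compress data before transferring it" using the JDK's built-in GZIP support; the payload is a made-up string, and in real life you'd also weigh the CPU cost of compressing against the bytes saved on the wire.

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import java.nio.charset.StandardCharsets;
    import java.util.zip.GZIPOutputStream;

    public class CompressBeforeTransfer {

        static byte[] gzip(byte[] raw) throws IOException {
            ByteArrayOutputStream buffer = new ByteArrayOutputStream();
            try (GZIPOutputStream gzip = new GZIPOutputStream(buffer)) {
                gzip.write(raw);
            }
            return buffer.toByteArray();
        }

        public static void main(String[] args) throws IOException {
            // Made-up, repetitive payload: exactly the kind of data that compresses well.
            byte[] payload = "{\"status\":\"ok\"}".repeat(1_000).getBytes(StandardCharsets.UTF_8);
            byte[] compressed = gzip(payload);
            System.out.printf("before: %d bytes, after: %d bytes%n", payload.length, compressed.length);
            // The compressed bytes are what would actually travel over the network
            // (with a Content-Encoding: gzip header, if it's HTTP).
        }
    }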

More importantly, if you use SonarQube, the popular static code analysis tool, you may be interested in the Creedengo (formerly ecoCode) family of plugins: they add rules based on the green-computing best practices from the repositories mentioned above, and their maintainers curate a list of those practices and of the languages each one applies to.

Still, always remember that no best practice can wholly substitute for testing your application. In the same way, if you really want to know which part of your code is the most energy-intensive and, thus, the most in need of optimisation, the only way is to benchmark your code thoroughly and comprehensively, just as you'd do for testing, using tools like Intel RAPL or PowerAPI to measure the energy expenditure of those benchmarks.
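
As an example of what such a measurement can look like, here's a rough Java sketch that reads the RAPL energy counter exposed by Linux's powercap interface before and after a benchmarked block. It assumes a Linux machine where /sys/class/powercap/intel-rapl:0/energy_uj exists and is readable, the workload is a made-up stand-in, and the counter's periodic wrap-around is ignored.

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class EnergyBenchmark {

        // RAPL "package 0" energy counter, in microjoules, via the Linux powercap interface.
        // Assumes the file exists and the process has permission to read it.
        private static final Path ENERGY_COUNTER =
                Path.of("/sys/class/powercap/intel-rapl:0/energy_uj");

        static long readMicroJoules() throws IOException {
            return Long.parseLong(Files.readString(ENERGY_COUNTER).trim());
        }

        public static void main(String[] args) throws IOException {
            long before = readMicroJoules();

            // Made-up workload standing in for the code you actually want to profile.
            double acc = 0;
            for (int i = 1; i < 50_000_000; i++) {
                acc += Math.sqrt(i);
            }

            long after = readMicroJoules();
            // Note: the counter wraps around periodically; a real harness must handle that.
            System.out.printf("workload result %.1f, ~%.3f J consumed by the CPU package%n",
                    acc, (after - before) / 1_000_000.0);
        }
    }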

Scheduled at Green o'Clock

Last but not least, multiple studies have shown that the carbon intensity of electricity, that is, the amount of CO2 emitted per unit of electricity generated, varies significantly across regions and, more pertinently to this article, over time, following predictable or, better yet, forecastable patterns. For instance, photovoltaic power plants produce more electricity during the day than at night; thus, consuming electricity during the day may result in lower CO2 emissions than consuming it at night.

We can leverage this fact to schedule the heaviest computational loads to run when the carbon intensity on the grid is at its lowest, especially since there are SDKs and other resources to help us in this endeavour.
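
To give the idea some shape, here's a hedged Java sketch that delays a heavy job until the grid's carbon intensity drops below a threshold. The https://example.org/carbon-intensity endpoint, its plain-number response, and the threshold are made-up placeholders; in practice you'd query a real forecast provider or an SDK built for this purpose.

    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.time.Duration;

    public class CarbonAwareScheduler {

        // Hypothetical endpoint returning the grid's current carbon intensity in gCO2/kWh
        // as a plain number; a real setup would use an actual forecast provider or SDK.
        private static final URI CARBON_API = URI.create("https://example.org/carbon-intensity");
        private static final double THRESHOLD_G_PER_KWH = 200.0;   // made-up threshold

        static double currentCarbonIntensity(HttpClient client) throws Exception {
            HttpRequest request = HttpRequest.newBuilder(CARBON_API).GET().build();
            HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
            return Double.parseDouble(response.body().trim());
        }

        public static void main(String[] args) throws Exception {
            HttpClient client = HttpClient.newHttpClient();

            // Poll (e.g. hourly) and only start the heavy batch when the grid is "green enough".
            while (currentCarbonIntensity(client) > THRESHOLD_G_PER_KWH) {
                Thread.sleep(Duration.ofHours(1).toMillis());
            }

            runHeavyBatchJob();   // placeholder for the actual energy-hungry workload
        }

        static void runHeavyBatchJob() {
            System.out.println("Crunching numbers while the sun shines on the solar panels…");
        }
    }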

If you want to go bananas (and if we're talking about web applications), you can even distribute your workload across different data centres in various regions, as the CASPER framework proposes.

Conclusions

Green computing isn't a fad, nor is it the next buzzword tech bros use for grifting, but it is what we, as the ICT workforce, nay, as computer nerds, must strive for: a future where software doesn't pollute the atmosphere and kill us all, a greener and more sustainable future.

So, the next time you're about to commit some code, ask yourself: Is it as green as you can make it?

(Credits and thanks to the curators at the awesome green software repository that gave me many resources used to write this introduction.)

💬 Curious about making your stack or development process greener? Let’s talk. Sensei helps teams and companies write software that’s more efficient, sustainable, and high-quality. 👉 Get in touch.
