The covid-19 pandemic has prompted a great deal of reflection on what we humans are doing to our lives, our jobs, our society and the environment. In the early months of the pandemic, stories about ‘nature healing itself’ spread like wildfire in the news and on social media. Fish made their way back to newly unpolluted waterways and wild animals repopulated national parks. We saw images of sheep walking on city streets, greenhouse gas emissions going down, and major cities with the cleanest air in decades. All of this fed the sliver of hope that, if the pandemic was keeping humans locked down, nature could take the time to fix what humans had broken.
Unfortunately, most of that didn’t happen. Yes, there were temporary improvements in air quality and some emissions did go down, but at the same time people fortunate enough to be able to lock down were ordering more things online (which drove up the use of plastic and packaging) and using far more energy at home. Now, two years later, we’re back to essentially the same levels of greenhouse gas emissions as before covid struck.
In our personal lives, most of us recycle in the hope that our small contribution might slow anthropogenic climate change. Some of us have also bought hybrid or fully electric vehicles, installed solar panels, learned to switch off lights in unoccupied rooms, reduced our use of plastics, and cut back on meat – a product whose industry consumes staggering amounts of water. At work, at least in our industry, agencies have pledged to be more environmentally friendly: less plastic and packaging, less air travel, and a smaller carbon footprint overall. Clients have made similar commitments, even asking agencies to track the carbon costs of the work we do for them. But how many of us consider the effect of technology itself on the environment?
Industry sources estimate that information technology accounts for as much as 3.8% of global carbon emissions – more than the aviation industry, which is notorious for its carbon footprint. The Association for Computing Machinery (ACM) estimates that data centres – which hold our data in the cloud, run our apps, and so on – use twice as much electricity today as they did 10 years ago, and most of that electricity is still generated from fossil fuels.
This dramatic increase in electricity consumption has a few obvious causes: we’re sending more emails than ever and spending more time on social media, and the resulting data are collected on cloud servers that sit in data centres. But we are also asking those data centres to run more power-hungry processes – artificial intelligence applications, bitcoin mining, and blockchain transactions that consume electricity with every calculation and keystroke. The NFTs you’re considering buying? The avatar you’re developing for the metaverse? All involve massive amounts of data stored, for the most part, on data centre computers that need electricity to run both the servers and the extensive cooling systems that keep those servers working.
There are also ‘hidden’ carbon emissions that no one thinks about when it comes to the technology we consume. When we launch a streaming video on our TV, we may think about the energy required to run the television, but do we also think about the energy needed to power the servers housing that video, or to run the myriad background applications that combine to serve as the platform for our favourite movie or show? When we pay for a purchase using cryptocurrency, do we think about the computing energy required to execute that transaction? Has our morning cappuccino just caused a cavalcade of computers in one or more data centres to execute millions of lines of programming to record and process the purchase? And if those data centres are dispersed worldwide, do we consider the carbon costs of the transmission lines that connect them all, of the power regulators that prevent surges and spikes that could corrupt the data, or of the security systems needed to keep all our data safely guarded? A ‘simple’ program or transaction executed from the palm of our hand on a smartphone or tablet can end up involving hundreds of computers housed in dozens of data centres, each demanding ever more energy, a great deal of which is still produced using fossil fuels.
There is some hope that the future of technology will be greener. Many ‘green-tech’ startups are using innovative technologies to tackle today’s carbon emissions problems. Large companies such as Microsoft and Google are also looking for ways to reduce or offset their carbon footprints, with goals such as reaching carbon neutrality in the coming decades. In the meantime, each of us can do our part by being aware of the environmental costs of technology and by seeking to conserve energy through smarter use of it. We can also actively support – with our votes and our purchasing power – government and corporate programs that seek to slow the effects of anthropogenic climate change.
Veronica Millan is global chief information officer at MullenLowe Group