By Nick Dale, EVP Business Development, Verne Global
The pandemic has had an unforeseen effect on ESG requirements, and in particular on sustainable planning. Instead of tightening money belts, business leaders are looking to improve the sustainable and societal impact of their operations. Why? The pandemic exposed risk that was completely unanticipated, shocking investors and executives who had failed to future-proof their portfolios and businesses. It was the first real proof point for sustainability, and with decision-makers unwilling to be caught out like this again, the economic uncertainty born of the crisis is widely being used as an opportunity to realign around sustainable planning. A focus on ESG is no longer just lip service to social convention; it is now good financial policy.
The way forward is clear: because ESG gives planning models financial as well as moral relevance, a business making strides towards carbon neutrality is a safer investment than one that isn't. But even as business leaders strive to make greener choices, many are unaware of the hidden environmental costs of their technologies.
The environmental cost of high intensity compute
Even before the pandemic, enterprises were turning to artificial intelligence to help predict market conditions and business outcomes. Indeed, in January 2020, some 85 percent of financial institutions reported they were using some form of AI (University of Cambridge and World Economic Forum survey). AI adoption has become even more widespread across industry sectors in recent months, as businesses have leaned more heavily on data to provide facts during periods of uncertainty. So even as the focus on sustainable technology practices has increased, access to high intensity compute has, in tandem, become an increasingly integral component of business operations across the board.
Unfortunately, this is not helpful for companies trying to reduce their carbon footprints. Because of the vast and complex data sets needed to train models that can identify patterns and generate predictions, the supercomputers required are extremely power-hungry. Moreover, for insights to stay relevant, compute often needs to run constantly, with minimal disruption to uptime. As an example of how power-intensive high performance computing can be, the carbon emitted each day by running the ICON weather forecasting algorithm is roughly equivalent to that of flying from New York to San Francisco four times (per 2020 University of Cambridge research).
The hidden emissions adding to your carbon footprint
The good news for enterprises looking to implement sustainable technology practices is that there is a standardised method of gauging carbon emissions against which they can measure their efforts. The most widely recognised international accounting and reporting standard for greenhouse gases is supplied by the Greenhouse Gas Protocol (GHGP). It clearly divides the emissions generated by enterprise operations that contribute to a carbon footprint into three 'scopes'.
However, the 'bad' news is that while companies may already have been aware of their Scope 1 emissions (those from directly owned or controlled sources) and perhaps their Scope 2 emissions (those indirectly generated from energy purchased by the company), many are only beginning to take note of their Scope 3 emissions, which cover all other indirect emissions. This means companies that truly want to commit to sustainable technology practices have to scrutinise activities across the entire value chain.
Over 80 percent of compute doesn't need to be located near the end user, so many enterprises choose to locate it externally, for example somewhere power costs less. However, under the GHGP's rules, when a business's compute is handed off to an external data centre, the emissions it generates still count towards that business's Scope 3 total.
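To make the scope accounting above concrete, here is a minimal sketch in Python. It is not an official GHGP tool, and every line item and tonnage figure is a hypothetical placeholder; the point it illustrates is simply that outsourced data-centre compute still lands in the company's Scope 3 inventory.

```python
# Illustrative sketch of GHGP-style scope accounting.
# All inventory items and figures below are hypothetical (tonnes CO2e).

SCOPES = {
    "scope1": "direct emissions from owned or controlled sources",
    "scope2": "indirect emissions from purchased energy",
    "scope3": "all other indirect emissions across the value chain",
}

# Hypothetical inventory for a firm that outsources its AI compute.
inventory = [
    ("company vehicle fleet", "scope1", 120.0),
    ("on-site gas heating", "scope1", 80.0),
    ("purchased electricity (offices)", "scope2", 300.0),
    ("outsourced data-centre compute", "scope3", 950.0),  # still counted
    ("business travel", "scope3", 210.0),
]

def footprint_by_scope(items):
    """Sum tonnes CO2e per GHGP scope."""
    totals = {scope: 0.0 for scope in SCOPES}
    for _name, scope, tonnes in items:
        totals[scope] += tonnes
    return totals

totals = footprint_by_scope(inventory)
for scope, tonnes in sorted(totals.items()):
    print(f"{scope}: {tonnes:.0f} t CO2e")
```

Note how the largest single line item here is the outsourced compute: moving it off-premises moves it out of Scope 1 and 2, but not off the books.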
Relocating compute is better for business
With the environmental cost of AI being so high, enterprises need to think even harder about where to locate their high intensity compute. It’s becoming clear that the best way to offset the high energy consumption of high intensity compute is to locate it in a data centre that is connected to a power grid fuelled by renewable energy sources. For example, tech giants such as Google are investing in data centres in the Nordics specifically for the renewable energy and increased energy efficiency that can be found there.
What's more, this kind of sustainable investment doesn't have to come with a financial penalty. In Iceland, for example, where the grid is powered by abundant geothermal and hydroelectric renewable energy sources, data centre providers can offer companies using high intensity compute services a lower cost electricity tariff fixed for ten years or more. Rather than forcing a choice between sustainability and profitability, these fixed costs also allow for long-term financial planning around sustainable technology practices.
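The emissions argument for relocation comes down to simple arithmetic: the same workload's footprint scales with the carbon intensity of the grid it runs on. The sketch below works through this for a hypothetical always-on cluster; the load and the grid intensity figures are illustrative assumptions, not values quoted in this article (real intensities vary by country, year, and methodology).

```python
# Back-of-envelope comparison: one workload, two grids.
# All figures are illustrative assumptions, not sourced data.

def annual_emissions_tonnes(load_kw, grid_kg_co2_per_kwh, hours=8760):
    """Annual emissions for a constantly running electrical load,
    converted from kg to tonnes CO2e (8760 hours in a year)."""
    return load_kw * hours * grid_kg_co2_per_kwh / 1000.0

LOAD_KW = 500.0  # hypothetical HPC cluster drawing 500 kW around the clock

# Assumed grid carbon intensities (kg CO2e per kWh), for illustration only.
FOSSIL_HEAVY_GRID = 0.40
RENEWABLE_GRID = 0.03  # e.g. a largely hydro/geothermal grid

dirty = annual_emissions_tonnes(LOAD_KW, FOSSIL_HEAVY_GRID)
clean = annual_emissions_tonnes(LOAD_KW, RENEWABLE_GRID)
print(f"fossil-heavy grid: {dirty:.0f} t CO2e/year")
print(f"renewable grid:    {clean:.0f} t CO2e/year")
```

Under these assumed numbers the relocated workload emits more than an order of magnitude less, without the workload itself changing at all, which is the crux of the case for siting compute on renewable grids.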
Focusing on sustainability is no longer a box-ticking exercise; it is simply smarter business. Redesigning operations as the effects of the pandemic begin to subside is not about rebuilding what you had, it's about building something better. With the option to locate energy intensive parts of the value chain somewhere they can be powered by reliable, low-cost renewable energy, future-proofing a company doesn't have to come at a cost to its bottom line.