Sustainable AI must be a priority
The business case for the use of artificial intelligence (AI) and machine learning across industries has been well-proven, as organisations leverage AI-driven capabilities to increase efficiency and efficacy and push the boundaries of innovation.
AI for good is already protecting our personal data from cyber breaches, making healthcare more accessible and effective, and forecasting weather and predicting disasters. In research fields, AI is also changing the way scientists look at climate change and renewable energy. According to a 2018 report by Intel, 74% of 200 environmental sustainability professionals surveyed believe AI will help solve long-standing environmental challenges. For example, research by PwC found that applying AI across four sectors (agriculture, transport, energy and water) could reduce global greenhouse gas emissions by 4% in 2030, an amount that equates to the projected 2030 yearly emissions of Japan, Canada and Australia combined.
With all of AI’s many benefits to business and society, it may then come as a surprise that AI itself is a significant generator of carbon emissions. University of Massachusetts research conducted in 2019 found that the carbon emissions from training a single large AI model can be up to five times those produced over the entire lifetime of a car, or roughly the same as 300 round-trip flights between New York and San Francisco.
By its very nature, AI is power hungry. It takes a huge amount of computational power to do things such as train deep learning applications and run analytics on massive datasets. Since 2012, the quantity of computing power used for deep learning research has been doubling every 3.4 months, according to OpenAI researchers Dario Amodei and Danny Hernandez; that compounds to a roughly 300,000-fold increase between 2012 and 2018. All of this computational power equates to markedly high energy consumption.
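The scale involved can be made concrete with some back-of-the-envelope arithmetic. The sketch below is illustrative only: the GPU count, power draw, training time, PUE and grid carbon intensity are all assumed figures, not measurements from any real training run.

```python
# Back-of-the-envelope estimate of the carbon footprint of a training run.
# All numbers below are illustrative assumptions, not measured values.

def training_emissions_kg(gpu_count, gpu_power_kw, hours, pue, grid_kg_co2_per_kwh):
    """Estimate CO2 emissions (kg) for a single training run.

    gpu_count            -- number of accelerators used
    gpu_power_kw         -- average draw per accelerator, in kW
    hours                -- wall-clock training time
    pue                  -- data-centre Power Usage Effectiveness (>= 1.0)
    grid_kg_co2_per_kwh  -- carbon intensity of the local electricity grid
    """
    it_energy_kwh = gpu_count * gpu_power_kw * hours
    total_energy_kwh = it_energy_kwh * pue  # facility overhead (cooling etc.)
    return total_energy_kwh * grid_kg_co2_per_kwh

# Hypothetical run: 64 GPUs drawing 0.3 kW each for two weeks (336 h),
# in a facility with a PUE of 1.5, on a grid emitting 0.4 kg CO2 per kWh.
emissions = training_emissions_kg(64, 0.3, 336, 1.5, 0.4)
print(f"Estimated emissions: {emissions:.0f} kg CO2")
```

Even with these modest assumed numbers the result runs to several tonnes of CO2, and researchers routinely retrain models many times while tuning hyperparameters, which multiplies the footprint accordingly.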
Although mainstream demand for data analytics and machine learning applications is leading to rapid progress in the field, at the same time this also means there is a real risk of AI having a runaway negative effect on the environment. With this in mind, a collective effort needs to be made across all industries to use AI in a way that helps us to change the world for the better without sacrificing the environment.
And there is little logic in using AI to model complex systems, forecast weather patterns or alleviate the impacts of climate change if training those very models comes at the expense of the environment they are supposed to help us save.
When it comes to AI, there needs to be a focus on efficiency rather than simply progress. Fortunately, for companies wishing to use AI in an environmentally-conscious way, there are a host of things they can consider in order to ensure their high performance computing (HPC) is working efficiently and to minimise the carbon footprint of their AI. The first step is to ensure that the HPC that powers their innovation is housed in data centres that can efficiently handle the high-density compute involved.
A crucial idea to bear in mind when considering how to make sustainable decisions is that over 80% of compute hardware does not actually need to be located near the end user. This means organisations can strategically locate their HPC solutions in data centres where they can be powered by renewable energy sources; countries like Iceland, where 100% of electricity is generated from hydroelectric and geothermal sources.
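The effect of siting alone is easy to illustrate. The comparison below uses rough, assumed grid carbon intensities and a hypothetical 1 GWh annual HPC workload; the point is the relative difference, not the specific figures, which should be checked against current data.

```python
# Illustrative comparison: the same annual HPC workload run on different grids.
# Carbon-intensity values are rough assumptions for illustration only.

ANNUAL_ENERGY_KWH = 1_000_000  # hypothetical 1 GWh/year HPC footprint

grid_intensity_kg_per_kwh = {
    "fossil-heavy grid": 0.7,
    "average European grid": 0.25,
    "Iceland (hydro/geothermal)": 0.0,  # near-zero-carbon generation
}

for grid, intensity in grid_intensity_kg_per_kwh.items():
    tonnes = ANNUAL_ENERGY_KWH * intensity / 1000  # kg -> tonnes
    print(f"{grid}: {tonnes:,.0f} t CO2/year")
```

Because the workload itself is unchanged, the difference in emissions comes entirely from where the electricity is generated, which is why relocating HPC is one of the highest-leverage choices available.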
As well as this, there are routes to energy optimisation within the data centres themselves, such as how the hardware is cooled. On average, cooling IT equipment accounts for around 40% of a data centre's total energy consumption, according to cooling energy consumption research by Zhang et al. There are alternative methods of cooling which can be used to reduce the carbon cost of preventing hardware from overheating, such as water cooling. Such methods are becoming more popular and necessary in data centres as demand grows for ever-increasing compute power.
Additionally, tech giants such as Google are also investing in data centres in countries with perennially cooler climates, such as Iceland and the Nordic countries, which can offer 'green free-air cooling'. In other words, the cooler the natural climate of a country, the less energy needed to keep the equipment cool. Organisations locating their HPC in a data centre that combines these things will drastically reduce the carbon cost of their AI.
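The standard way to express this overhead is Power Usage Effectiveness (PUE): total facility energy divided by the energy delivered to IT equipment. The breakdown below is a sketch under assumed percentages (the 40% cooling share cited above for a conventional facility, and a hypothetical free-air-cooled one), not measured data from any real site.

```python
# Sketch of how cooling overhead drives Power Usage Effectiveness (PUE).
# PUE = total facility energy / IT equipment energy; a PUE of 1.0 would
# mean every unit of energy goes to compute. Percentages are assumptions.

def pue(it_energy, cooling_energy, other_overhead_energy):
    """Compute PUE from an energy breakdown (all values in the same units)."""
    total = it_energy + cooling_energy + other_overhead_energy
    return total / it_energy

# Conventional facility: cooling at 40% of total consumption, small
# other overhead, IT load making up the remainder (per 100 units).
conventional = pue(it_energy=55, cooling_energy=40, other_overhead_energy=5)

# Hypothetical free-air-cooled facility: cooling cut to 5% of total.
free_air = pue(it_energy=90, cooling_energy=5, other_overhead_energy=5)

print(f"conventional PUE ~ {conventional:.2f}")  # ~1.82
print(f"free-air PUE     ~ {free_air:.2f}")      # ~1.11
```

Under these assumptions, moving from conventional to free-air cooling cuts the energy billed per unit of useful compute by well over a third, before any change to the workload itself.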
Making smarter choices for sustainable AI does not mean companies must sacrifice growth; rather, economic growth can be decoupled from carbon emissions. Of course, the priority for organisations growing their HPC is that the applications and infrastructure supporting it have the ability to scale as the data does. Green-minded organisations should therefore look to house their HPC inside a data centre that can offer them the flexibility to manage the development and growth of their compute along with the infrastructure capable of doing so, whilst the centre itself operates in the most environmentally conscious manner possible.
AI is the epitome of human progress, but we must employ and develop it consciously to create a positive and lasting impact on the planet.