As organisations in Australia scale their use of AI, they are also conscious of minimising the associated carbon emissions. A new study commissioned by Amazon Web Services (AWS) and conducted by Accenture shows that an effective way to reduce carbon emissions is to move certain IT workloads, such as compute-heavy AI workloads, from on-premises infrastructure to the AWS cloud. The study is based on migrating identical workloads from simulated on-premises data centres to AWS.
Migrating compute-heavy workloads, including AI workloads, in Australia to AWS can deliver a carbon emissions reduction of up to 94% compared to simulated on-premises data centres. This reduction is attributed to AWS’s use of more efficient hardware (24%), improvements in power and cooling efficiency (31%), and additional carbon-free energy procurement (39%). The study also shows that optimising these migrated workloads on AWS by leveraging AWS’s purpose-built silicon can increase the total carbon reduction potential to up to 99%.
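To see how the headline figure comes together, here is a minimal arithmetic sketch based on the attribution figures above. The additive composition, and the additional five percentage points attributed to silicon optimisation, are shown for illustration only and are not part of the study’s methodology.

```python
# Illustrative arithmetic only: composing the study's reported attribution
# figures for compute-heavy workloads migrated to AWS in Australia.
hardware_efficiency = 0.24   # more efficient AWS hardware
power_and_cooling   = 0.31   # data centre power and cooling efficiency
carbon_free_energy  = 0.39   # additional carbon-free energy procurement

migration_reduction = hardware_efficiency + power_and_cooling + carbon_free_energy
print(f"Reduction from migration alone: {migration_reduction:.0%}")  # 94%

# Optimising the migrated workloads on AWS purpose-built silicon is reported
# to lift the total reduction potential to up to 99%.
silicon_optimisation = 0.05
print(f"Total reduction potential: {migration_reduction + silicon_optimisation:.0%}")  # 99%
```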
Accenture was guided by the International Organization for Standardization (ISO) Software Carbon Intensity (SCI) standard to analyse the carbon footprint of representative storage-heavy and compute-heavy (AI) workloads, and then went beyond that by considering the effect of carbon-free energy when running workloads both on-premises and on AWS. The purpose of the SCI score is to increase awareness and transparency of an application’s sustainability credentials. This is one of the first times a cloud provider has used the SCI specification to perform an analysis of this type.
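In broad terms, the SCI specification expresses a score as operational emissions plus embodied hardware emissions, normalised by a functional unit:

$$\mathrm{SCI} = \frac{(E \times I) + M}{R}$$

where E is the energy consumed by the software, I is the carbon intensity of that energy, M is the embodied emissions of the hardware used, and R is the functional unit the score is reported against (for example, per inference or per API request). The worked figures in the study are Accenture’s; the formula is shown here only as context for how an SCI score is constructed.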
“Our research shows by leveraging AWS's focus on efficiency across hardware, cooling, carbon-free energy, purpose-built silicon, and optimised storage, organisations can significantly reduce the carbon footprint of their AI and machine learning workloads,” said Sanjay Podder, Global Lead for Technology, Sustainability, and Innovation at Accenture. “As the demand for AI continues to grow, AWS's sustainability efforts will play a crucial role in helping businesses meet their environmental goals while driving innovation.”
Purpose-built silicon
One of the most visible ways AWS is innovating to improve energy efficiency is through an ongoing investment in AWS chips. Running generative AI applications in a more sustainable way requires innovation at the silicon level with energy-efficient hardware. To optimise performance and energy consumption, AWS developed purpose-built silicon like the AWS Inferentia chip to achieve significantly higher throughput than comparable accelerated compute instances.
AWS Inferentia is AWS’s most power-efficient machine learning inference chip. Our Inferentia2 machine learning inference chip provides up to 50% more performance per watt and can reduce costs by up to 50% relative to comparable instances. According to Accenture, these purpose-built chips enable AWS to execute AI models efficiently at scale, reducing the infrastructure footprint for similar workloads and improving performance per watt of power consumed.
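As an illustration of what running a model on Inferentia2 can look like in practice, here is a minimal sketch of compiling a PyTorch model with the AWS Neuron SDK (torch-neuronx) on an Amazon EC2 Inf2 instance. The model, input shape, and file name are placeholders, and the snippet is a sketch rather than part of the study.

```python
# Minimal sketch: compiling a PyTorch model for AWS Inferentia2 with the
# AWS Neuron SDK (torch-neuronx) on an Amazon EC2 Inf2 instance.
# The model, input shape, and file name are placeholders for illustration.
import torch
import torch_neuronx  # part of the AWS Neuron SDK

# Stand-in for a real model; any traceable torch.nn.Module works the same way.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 64),
    torch.nn.ReLU(),
    torch.nn.Linear(64, 10),
).eval()

example_input = torch.rand(1, 128)  # sample input used to trace the model

# Compile the model ahead of time for the NeuronCores on Inferentia2.
neuron_model = torch_neuronx.trace(model, example_input)
torch.jit.save(neuron_model, "model_neuron.pt")

# The compiled artefact loads and runs like a normal TorchScript module.
loaded = torch.jit.load("model_neuron.pt")
print(loaded(example_input).shape)  # expected: torch.Size([1, 10])
```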
Relentless focus on energy-efficient infrastructure
Through innovations in engineering, from rack layouts to electrical distribution and cooling techniques, AWS focuses on high energy efficiency across its infrastructure. AWS optimises resource utilisation to minimise idle capacity and continuously improves efficiency.
After the energy used to power server equipment, cooling is one of the largest sources of energy use in AWS data centres. To increase efficiency, AWS uses different cooling techniques, including free air cooling depending on the location and time of year, as well as real-time data to adapt to weather conditions. Implementing these innovative cooling strategies is more challenging at a smaller scale on-premises.
AWS’s latest data centre design seamlessly integrates optimised air-cooling solutions alongside liquid cooling capabilities for the most powerful AI chipsets, like the NVIDIA Grace Blackwell Superchips. This flexible, multimodal cooling design allows AWS to extract maximum performance and efficiency whether running traditional workloads or AI models.
Decarbonising with carbon-free energy
According to the study, AWS’s additional procurement of carbon-free energy in Australia contributes 39% towards the carbon emissions reduction for compute-heavy workloads. Aligning with Amazon’s global commitment to achieving net-zero carbon emissions across all operations by 2040, AWS is rapidly transitioning its global infrastructure to match electricity use with 100% carbon-free energy. Amazon met its 100% renewable energy goal seven years ahead of schedule in 2023, and continues to be the world’s largest corporate purchaser of renewable energy.
In Australia, Amazon has invested in seven renewable energy projects, which are helping power Amazon’s Australian operations, including Amazon data centres and fulfilment centres. From 2020 to 2022, Amazon invested an estimated AU$467 million in renewable energy projects in Australia.
These projects include two operational solar farms at Suntop and Gunnedah in New South Wales, a wind farm in Hawkesdale, Victoria, a solar project in Wandoan, Queensland, and three rooftop solar projects on Amazon facilities in Melbourne and Sydney. Altogether, once operational, the projects are estimated to generate more than 1,000 gigawatt-hours (GWh) of clean energy, enough to power more than 175,000 Australian homes.
Considering that an estimated 85% of organisations’ global IT spend remains on-premises, a carbon reduction of up to 94% for the same workloads migrated to AWS represents a meaningful sustainability opportunity for Australian organisations. They can use AWS services knowing that AWS is continuously innovating to make the underlying infrastructure more energy efficient.
Australian organisations can also leverage these digital technologies and AI capabilities to drive additional sustainability innovations, helping them use AI to meet climate goals at the speed, scale, and urgency our planet requires. This includes using data and cloud to monitor and reduce food waste, and to dynamically optimise battery connections into the grid, along with a vast range of other cloud-enabled sustainability use cases.
Climate change is one of the world’s greatest challenges, and at AWS we know we have to move fast, constantly innovate, and invest in future technologies and solutions in order to meet our goal of net-zero carbon by 2040. We also know that we can’t do this alone, and we look forward to continuing to collaborate with Australian organisations, businesses, and government for a sustainable AI future in Australia.
* In the study, AWS data was modelled against simulated on-premises data.
* On-premises describes IT infrastructure hardware and software applications that exist on-site, for example locally within an organisation’s own physical office or space, in contrast to assets that are hosted off-site in the cloud.