The rapid growth of AI has transformed many industries and produced remarkable new technology, but it comes with a serious problem: a steep rise in energy use. This is not just a technical issue; it is an environmental concern that everyone in the AI field needs to address. As AI continues to develop, the goal cannot only be smarter models that solve more users' problems; it must also be ensuring that these advancements are sustainable for the planet.
The current landscape
Gartner predicts that without sustainable AI practices, by 2025, AI will consume more energy than the human workforce, significantly offsetting carbon-zero gains.
According to a recent report from the Federal Energy Regulatory Commission, data center demand in the US is expected to reach 35 gigawatts by 2030, the equivalent of powering about 26 million homes. (For context, 1 GW is enough energy to power about 750,000 homes.)
In regions like Salt Lake City, where major energy consumers including Meta and Google are building data centers, there has been a noticeable shift back to coal as more data centers are needed to support AI workloads. Plans to retire coal-fired power plants early are being abandoned, with closure dates pushed as far back as 2042 and clean energy investments scaled back.
This is a concerning shift that underscores the complex trade-offs between technological advancement and sustainability, especially as AI is on its way to driving a 160% increase in data center power demand by 2030.
While some tech giants like Google, Amazon, and Microsoft have committed to powering their data centers with 100% renewable energy by 2030, the current landscape still sees significant carbon footprints from AI operations.
Based on public data from Meta, one of its data centers in Iowa uses as much power annually as 7 million laptops running eight hours a day.
According to a study from Hugging Face and Carnegie Mellon University, creating an image using generative AI takes as much energy as fully charging your smartphone.
A ChatGPT query consumes nearly 10 times as much electricity as a Google search. For a startup, training its AI models in the US emits roughly 1,000 tons of CO2 in a year, the equivalent of 1,000 Paris-to-New-York flights.
AI needs an energy breakthrough. The industry is exploring solutions like nuclear fusion to speed up the energy transition away from fossil fuels, but until this breakthrough happens, people and businesses in AI need to take individual steps toward change.
Why the AI industry has been slow to adopt sustainable practices
AI-led businesses face challenges in the areas of technology, financial investment, and stakeholder engagement when trying to adopt sustainable AI practices.
Transitioning to sustainable AI solutions often requires substantial upfront investment in energy-efficient technologies and renewable energy sources. According to an IBM sustainability study, while the majority of executives (76%) agree that sustainability is central to their business, nearly half (47%) struggle to fund sustainability investments.
Furthermore, only 31% of organizations report integrating sustainability data extensively into their operational improvements, indicating a gap between sustainability goals and actionable steps.
The shift toward green data centers and sustainable hardware requires not only capital but also a strategic overhaul of existing infrastructures. Companies building AI have complex decisions to make about upgrading to more efficient systems while managing ongoing operational costs.
This, plus the rapid pace of technological change, can make it difficult for businesses to keep up. Many AI companies in early development stages may deprioritize sustainability due to the immediate pressures of competition, technological development, and finding product-market fit.
But as the demand for AI grows, it's becoming essential for businesses to integrate sustainability into their decision-making processes to achieve environmental goals and drive innovation.
How we can all shape the future of AI sustainability
The entire industry plays a part in influencing a more sustainable future for AI. The readiness, adoption, and development of green AI practices depend on the maturity of the market and stakeholder involvement.
Venture capitalists can evaluate the environmental impact of their portfolio, request impact statements from companies, and share sustainability best practices to inspire more businesses to take action.
Enterprise companies and SMBs using AI can request environmental impact statements from providers to evaluate sustainability efforts and commitment.
Companies developing AI products can be selective about what type of AI model they use. Recent studies show that specialized AI models consume less energy than general-purpose AI models. The more frugal the model, the faster it can execute, improving user experience while reducing energy consumption.
Companies developing AI models can partner with green data centers like Genesis Cloud to leverage renewable energy sources and minimize environmental impact. They can ask cloud providers for the Power Usage Effectiveness (PUE) scores of their data centers and even use an open source tool to measure their cloud carbon footprint (a simple illustration of this kind of estimate is sketched after this list). Internally, they can develop more frugal specialist AI models to lower carbon emissions. Externally, they can publish their models' CO2 emissions, as Meta did for Llama 3.1.
Cloud providers like Amazon Web Services (AWS), Google Cloud, Scaleway, and Genesis can help reduce the carbon footprint of AI by developing infrastructures that maximize energy efficiency and by being transparent. This involves sharing their PUE scores (which capture overhead such as the energy consumed to cool the data center), disclosing the CO2 emitted in building the data center, and potentially offering green pricing options. Data centers can also relay the demand for energy-efficient chips to hardware providers.
Hardware providers can develop energy-efficient chips, as NVIDIA has done with its new “superchip,” which it claims can boost performance for generative AI tasks by 30 times while consuming 25 times less energy, aided by new chip cooling techniques.
Public funding bodies can participate by integrating carbon footprint assessments into their decision-making processes.
Regulators can evolve toward holding all players in the AI ecosystem accountable.
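To make the carbon-footprint and PUE points above concrete, here is a minimal, illustrative Python sketch of how a team might estimate the operational emissions of a cloud workload from three inputs: the energy its hardware draws, the data center's PUE, and the carbon intensity of the local grid. The function and every number below are hypothetical placeholders rather than figures from any provider mentioned in this article, and dedicated open source tools measure these inputs far more rigorously.

# Minimal sketch: estimating the CO2 footprint of a cloud AI workload.
# All figures are illustrative placeholders, not measurements from any
# specific provider; substitute the PUE and grid carbon intensity your
# cloud provider actually reports.

def estimate_emissions_kg(it_energy_kwh: float, pue: float,
                          grid_intensity_kg_per_kwh: float) -> float:
    """Estimate operational CO2 emissions for a workload.

    it_energy_kwh             -- energy drawn by the servers (GPUs, CPUs, RAM)
    pue                       -- Power Usage Effectiveness of the data center
                                 (total facility energy / IT energy; 1.0 is
                                 ideal, cooling and other overhead push it up)
    grid_intensity_kg_per_kwh -- kg of CO2 emitted per kWh on the local grid
    """
    facility_energy_kwh = it_energy_kwh * pue  # add cooling and other overhead
    return facility_energy_kwh * grid_intensity_kg_per_kwh

if __name__ == "__main__":
    # Hypothetical training run: 8 GPUs at ~0.4 kW each for 72 hours.
    it_energy = 8 * 0.4 * 72  # roughly 230 kWh of IT energy

    # The same workload in two hypothetical data centers.
    coal_heavy = estimate_emissions_kg(it_energy, pue=1.6,
                                       grid_intensity_kg_per_kwh=0.8)
    renewable = estimate_emissions_kg(it_energy, pue=1.1,
                                      grid_intensity_kg_per_kwh=0.05)

    print(f"Coal-heavy grid, inefficient facility: {coal_heavy:.0f} kg CO2")
    print(f"Renewable grid, efficient facility:    {renewable:.0f} kg CO2")

Even this back-of-the-envelope model shows why the choice of data center and grid matters: under these assumed numbers, the same workload differs by more than an order of magnitude in emissions.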
As an industry, we need to take a systemic approach of shared responsibility to reduce the environmental impact of AI at all levels of the ecosystem.
Change can happen now
Keeping up with today’s fast-paced AI innovation is crucial for businesses to stay competitive, but as the market matures, sustainability should play a bigger part in the decision-making process.
Take the first step by choosing an AI provider that’s already taking action to reduce its energy consumption: ask for its environmental impact statements, or ask whether it measures its company’s carbon footprint.
It’s up to all of us to lead the way and advocate for a more sustainable future for AI.