Canada isn’t taking the escalating environmental impact of AI seriously

The US is considering stringent data centre rules to keep AI's impact on water, land and carbon in check. But the recent Liberal budget was silent on the environmental elephant in the room.

It’s a big bet on an industry that’s ballooning like a supernova.

In the 2024 federal budget introduced on April 16, Canada’s Liberal government earmarked $2.4 billion in new supports for the artificial intelligence sector – everything from upgrades to infrastructure that can handle all the new computing power, to assists for start-ups and money for workers displaced by AI.

Conspicuously absent from the fiscal plan, however, was any reference to measures to mitigate the environmental impact of AI, which, as Morgan Stanley predicted in early April, is triggering tipping-point growth in the construction of the massive energy- and water-intensive data centres needed to deliver all this new computing heft.

That connection, however, was not lost on U.S. Senator Ed Markey, a veteran Democrat and one of the co-authors of the Green New Deal. He introduced legislation earlier this year proposing stringent environmental rules designed to check the escalating carbon, water and land impact of the data centres needed to handle the vastly more intensive processing requirements of applications powered by generative AI and large language models – everything from customer chatbots and AI co-pilots to AI-powered image engines, like DALL-E.

Markey’s bill, which has yet to be passed, proposes that the Environmental Protection Agency undertake a detailed assessment of the full range of environmental impacts of AI, including energy, water, and e-waste from servers, and also recommends the development of a voluntary disclosure/reporting system for firms that use AI.

The tech sector’s reliance on data centres is not a new story. Until recently, these facilities were estimated to consume about 1% of the world’s daily diet of electricity, and the United States is home to about a third of all data centres. The International Energy Agency estimates that data centres and the associated transmission infrastructure emitted about 330 megatonnes of carbon dioxide in 2020, or about 1% of energy-related greenhouse gases, because much of the electricity comes from fossil-fuel power generation. What’s more, data centres also require huge quantities of water as coolant, and all that cooling triggers even more electricity use. (In data centres, water is piped through chillers, heat exchangers and humidification systems to carry away the heat generated by high-powered servers; the spent coolant then needs to be purified before being returned to the environment.)

AI, however, has turbocharged these impacts. The rapid uptake of applications like OpenAI’s ChatGPT has triggered spikes in the need for new data centres fitted out with the super-powered processors and servers manufactured by Nvidia, the Silicon Valley chipmaker that has emerged as a major enabling player in the AI revolution. Training large language models on seemingly endless supplies of online data is a highly energy-intensive process, but so too are all the searches, prompts and questions enabled by these applications.

“This accelerated development raises concerns about the electricity consumption and potential environmental impact of AI and data centers,” warned Alex de Vries, the founder of Digiconomist, in a 2023 paper published in Joule, an energy journal.

Environmentalists are concerned that the AI-driven surge in electricity and water consumption creates an unwelcome source of competition for renewable power at a time when record-high demand has placed enormous pressure on the existing system.

“In a moment when we need to be electrifying our transportation and heating and cooling in all sorts of industries, we’re adding on more and more use from AI and big data centres,” says Erik Kojola, a senior climate research specialist at Greenpeace USA. “For us, this is a major concern because we need to be driving increased renewables towards the things that we really need to electrify. If we’re just increasing demand and load at a faster pace than renewables, then we’re not decarbonizing the grid. Our worry is that this is going to offset any improvements that we’re getting in renewable generation and [create] a setback in our efforts to decarbonize.”

The numbers are bracing. A 2023 research paper published by a University of California, Riverside, team estimated that “global AI demand may be accountable for 4.2 – 6.6 billion cubic meters of water withdrawal in 2027, which is more than the total annual water withdrawal of . . . half of the United Kingdom.” De Vries’s findings, in turn, indicate that an AI-powered internet search – which is becoming increasingly commonplace – uses 10 times as much power as a conventional one. Google is now responding to some searches with AI tools, and the various versions of ChatGPT quickly attracted more than 100 million users.

Of course, the impact of AI on the data-centre industry has a precursor: cryptocurrency, and the extraordinarily energy-intensive process of bitcoin (or equivalent) mining. Another de Vries study, published in Joule in early 2024, estimated that the annual water footprint of U.S.-based bitcoin mining was equivalent to the usage of a city the size of Washington, D.C. Moreover, much of the growth in crypto-related water consumption is taking place in Russia and Kazakhstan, which are not known for their environmental controls.

It’s worth pointing out that the largest developers of data centres – giants like Alphabet, Facebook and Amazon – aren’t newcomers to hard questions about their environmental footprints. In some cases, they’ve located these facilities in areas with ready access to water. However, data centres have also been situated in regions where they’re competing with other users for reserves in depleted water tables. Giant tech firms have also established power-purchase agreements with solar and wind farms as a means of decarbonizing their electricity consumption habits and managing the Scope 1 and 2 emissions associated with data-centre operations.

* * *

The accelerating growth pattern in just the past few years suggests that the environmental footprint of AI – together with those of more conventional data-storage services – could rapidly balloon into a carbon nightmare virtually unchecked by conventional constraints on consumer behaviour; after all, the end users don’t pay for much of this processing power. Kojola argues that governments need to step in rather than simply assume that the data-centre sector will sort out these issues using technology improvements and voluntary reporting measures.

“We’re kind of in the earlier stages of this technology, and now’s the time to get in and put in regulations to help direct it in a way that is socially beneficial,” he says. “There’s a whole host of social and political questions about AI. But on the climate front and energy front, [we need] to put some guard rails on it so that [the technology] is developing in a way that takes into account the energy use and water use and its other environmental impacts.”

John Lorinc is a Toronto-based journalist and author specializing in urban issues, business and culture. His most recent book is Dream States: Smart Cities, Technology, and the Pursuit of Urban Utopia. 
