
Green Tech or Eco Risk? The True Environmental Impact of Generative AI

As AI reshapes our digital landscape, a hidden environmental cost lurks beneath its innovative surface. Can the promise of a tech utopia withstand the ecological price tag? And is this price tag even real?


In the world of generative AI, where machines craft symphonies and paint digital masterpieces, lies an untold story of environmental impact. Training these advanced AIs isn't just a feat of engineering. It's a power-hungry process, guzzling energy at an alarming rate. From the humming data centers to the intricate algorithms, the carbon footprint of AI is becoming impossible to ignore. So let's unravel the complex tapestry of these innovations and their environmental impact.

Generative AI is the new rock star of the tech world. But it's shredding energy resources like there's no tomorrow. This isn't your grandma's pocket calculator. This is AI on steroids, pumping out everything from slick marketing messages to full-blown blockbuster movies. Gartner predicted that by 2025, about 30% of outbound marketing messages will be crafted by AI, and by 2030, we're talking about AI being the mastermind behind 90% of major films.

Sounds cool. But it comes at a cost: a hefty energy bill. In energy terms, training something like GPT-3 is like flying a fully loaded passenger jet across the continent. To put it in numbers, training the GPT-3 model gobbled up 1,287 megawatt-hours of electricity, producing roughly the emissions of 123 gas-guzzling cars driven for a year.
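For the curious, here's a rough sketch of how a figure like that translates from electricity into "cars driven for a year". The grid emission factor and per-car emissions below are ballpark assumptions on my part, not numbers from the sources above, so treat the output as an order-of-magnitude check rather than a precise result.

```python
# Rough, hedged conversion from training electricity to "cars per year".
# The emission factor and per-car figure are ballpark assumptions (roughly
# a US-grid average and a typical-passenger-car average), not exact values.

training_mwh = 1287                  # GPT-3 training energy cited above
grid_tco2_per_mwh = 0.43             # assumed grid emission factor, tCO2e per MWh
car_tco2_per_year = 4.6              # assumed annual emissions of one gasoline car

training_tco2 = training_mwh * grid_tco2_per_mwh
cars_equivalent = training_tco2 / car_tco2_per_year
print(f"~{training_tco2:.0f} tCO2e, or roughly {cars_equivalent:.0f} cars for a year")
```

With these assumptions the result lands in the same ballpark as the 123-car figure quoted above.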

And it's not just about the energy. These AI brainiacs are thirsty, too. Training GPT-3 in a state-of-the-art data center could chug 700,000 liters of clean water. That’s enough H2O to make 370 BMWs or 320 Teslas.

Now, with AI adoption skyrocketing (a whopping 56% of professionals are using it for work), we're staring down the barrel of a future where AI servers might need as much electricity as entire countries. By 2027, AI could be guzzling 85 to 134 terawatt-hours annually – roughly what countries like Argentina, the Netherlands, or Sweden each use in a year.

So, what's the deal? Are we trading our planet's health for smarter chatbots and cooler videos? It's a question worth pondering as generative AI takes the main stage in our digital concert.

Can one AI be better than the other?

In the AI world, it's not all created equal — some models are like gas-guzzling SUVs, while others are more like eco-friendly bicycles. So let's talk power consumption. Picture this: training a fancy AI model with 110 million parameters? In CO2 terms, that's akin to one passenger taking a round-trip transcontinental flight.

But wait, there's more. Making images is the real energy hog in the room. Generating 1,000 images with something like Stable Diffusion XL? That's about as much CO2 as driving 4.1 miles in a gasoline car.

Now, consider the BLOOM model. Similar in size to GPT-3, it's a bit more “diet-conscious”, consuming just 433 MWh and emitting around 30 tons of CO2eq. That's like swapping a Hummer for a hybrid in terms of energy diet.

So, what's the takeaway? Size matters, but so does efficiency. A smarter model architecture and greener data centers can cut that carbon footprint by a factor of up to 1,000. It's like choosing between a monster truck show and a bike ride in the park. Both get you places, but one leaves a much greener trail.

Ultimately, the type of AI model – be it for video, music, or images – and its specific task significantly influences its energy and data requirements. The key to a greener AI future might lie in optimizing these models for efficiency, not just capability.

Is it just that all tech companies are bad for the environment?

Just a few years ago, everyone was up in arms about how every Google search was supposedly choking Mother Earth. Fast forward to today, and there's a new villain in town – generative AI. But let's not jump on the bandwagon just yet.

Recently, the clever folks at r/aipromptprogramming crunched some numbers (numbers that had already been crunched by someone else and then redistributed to major media outlets, so it might all be speculation if you ask me) and concluded that a single GPT query devours 1,567% of the energy of a Google search – roughly 15 times more. Sounds dramatic, right? But hold on to your hats, because I found there's a twist.

Let's do the math and compare the monthly query volumes of Google and ChatGPT, the most popular Generative AI on the planet.

According to Similarweb, in December 2023 chat.openai.com clocked in at 1.55 billion visits. If we play with the idea that each visit involved an average of 5 questions, that adds up to about 7.75 billion monthly queries for ChatGPT.

On the other hand, Google, the search engine giant, saw 84.31 billion visits in the same period. Assuming an average of 3 to 4 searches per visit (call it 3.5), this translates to roughly 295 billion Google searches per month.

Do the math, and it turns out there are about 38 times more Google searches than ChatGPT queries each month. This gap in query volume brings a new perspective to the highly-publicized energy consumption debate. While a single GPT query might be more energy-intensive than a Google search, the sheer volume of Google searches could potentially make its overall energy consumption significantly higher.
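If you want to sanity-check that comparison yourself, here's a minimal back-of-the-envelope script. The visit counts are the Similarweb figures above; the per-visit query rates and the 15x per-query energy multiplier are the same assumptions used in the text, not measured values.

```python
# Back-of-the-envelope check of the query-volume comparison above.
# All inputs are assumptions from the article, not measured data.

chatgpt_visits = 1.55e9            # Similarweb, December 2023
queries_per_chatgpt_visit = 5      # assumed average

google_visits = 84.31e9            # Similarweb, December 2023
searches_per_google_visit = 3.5    # assumed average (the 3-4 range above)

chatgpt_queries = chatgpt_visits * queries_per_chatgpt_visit   # ~7.75 billion
google_searches = google_visits * searches_per_google_visit    # ~295 billion

print(f"Google-to-ChatGPT query ratio: {google_searches / chatgpt_queries:.0f}x")

# Even granting the claimed ~15x energy cost per GPT query, aggregate energy
# still comes out below Google's total under these assumptions:
relative_total = 15 * chatgpt_queries / google_searches
print(f"ChatGPT total energy as a share of Google search's: {relative_total:.0%}")
```

Under these (admittedly crude) assumptions, ChatGPT's aggregate query energy works out to roughly 40% of Google search's, which is the point: per-query drama doesn't tell the whole story.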

And here's another wrinkle: that infamous number about Google searches consuming 0.3 watt-hours of electricity? That's old news from 2011. Fast forward to today, and Google has been dabbling with generative AI models in its searches. According to Vijay Gadepally, a research scientist at MIT Lincoln Laboratory, that means the energy usage of a Google search today could be significantly higher.

So, where does this leave us? Are generative AI models like ChatGPT really the new energy-guzzling monsters of the tech world? Or is this just another case of sensationalist finger-pointing, like the Google search debacle?

Truth is, all of big tech is in the same boat. Google's searches, ChatGPT's witty banter – they all come at a cost to the environment. But, like any good investigative story, it's not as black and white as it seems. The key here is to look beyond the hype and understand the nuances of tech's environmental footprint.

How to make AI greener?

The energy demands of training generative AI models are substantial, but the tech industry isn't sitting back. They're actively seeking ways to reduce AI's carbon footprint. Let's explore what's been done and what the future holds.

Rethinking AI's energy consumption

Generative AI, powered by GPUs, is known for its high energy consumption, particularly during training phases. These GPUs consume 10-15 times more energy than traditional CPUs, making the training of large generative models notably energy-intensive. However, the process of inference, where models respond to user prompts, is relatively less demanding, offering some energy-saving grace.
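As a rough illustration of how such numbers get estimated, here's a minimal sketch of the usual accounting: accelerator count times average power times training hours, scaled by the data center's PUE (power usage effectiveness). The example values are placeholders I made up, not figures from the article.

```python
# Hedged back-of-the-envelope estimate of training energy.
# energy = accelerators x average power x hours x PUE (all placeholder values).

def training_energy_mwh(n_gpus: int, avg_watts_per_gpu: float,
                        hours: float, pue: float = 1.1) -> float:
    """Estimate total facility energy for a training run, in MWh."""
    return n_gpus * avg_watts_per_gpu * hours * pue / 1e6

# A hypothetical run: 1,000 GPUs drawing ~300 W each for 30 days.
print(f"{training_energy_mwh(1000, 300, 30 * 24):.0f} MWh")
```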

Efficiency in AI design

Max Hilsdorf, a prominent data scientist and AI educator, highlights an innovative approach to reducing AI's energy use. He suggests using Mixture of Experts (MoE) models: "Large general-purpose models often outperform smaller task-specific models due to their ability to handle atypical inputs by leveraging cross-domain knowledge. MoE models represent a promising approach to blending performance and energy efficiency. Although trained as one large network, MoE models like Mixtral activate only the most relevant neurons for each request, thereby significantly reducing computational resources while maintaining high accuracy."  This approach aligns with the findings of researchers from Google and UC Berkeley, who demonstrate that the carbon footprint of large language models can be reduced by 100 to 1,000 times with efficient algorithms, hardware, and cloud data centers.
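To make the routing idea concrete, here's a toy top-k MoE layer in PyTorch. It's a sketch of the general mechanism Hilsdorf describes, not Mixtral's actual implementation; every size, name, and hyperparameter below is invented for illustration.

```python
# Toy Mixture-of-Experts layer: each token only runs through its top-k experts,
# so most of the layer's parameters stay idle on any given request.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    def __init__(self, d_model=64, d_hidden=256, n_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, n_experts)        # scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.ReLU(),
                          nn.Linear(d_hidden, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):                                  # x: (tokens, d_model)
        scores = self.router(x)                            # (tokens, n_experts)
        weights, chosen = torch.topk(scores, self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)               # normalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e                # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot:slot + 1] * expert(x[mask])
        return out

tokens = torch.randn(10, 64)
print(ToyMoELayer()(tokens).shape)   # each token touched only 2 of the 8 experts
```

The energy argument is that compute per request scales with the 2 active experts, not all 8, while the model as a whole keeps the capacity of the full expert pool.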

Geographical optimization for AI training

The location of AI training can immensely impact its carbon footprint. For instance, training AI in regions with a higher fraction of carbon-free energy can greatly reduce CO2 emissions. Research shows that strategic scheduling of machine learning workloads can result in up to 10 times less CO2e emissions, depending on the geographic location.
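In code, the simplest version of this is just picking the region with the cleanest grid before a job launches. Here's an illustrative sketch; the region names and carbon intensities are placeholders, not real measurements, and in practice they would come from a live grid-carbon data source.

```python
# Illustrative carbon-aware placement: schedule training where the grid is cleanest.
# All values below are made-up placeholders, not real grid data.

def pick_greenest_region(carbon_intensity: dict[str, float]) -> str:
    """Return the region with the lowest gCO2e per kWh."""
    return min(carbon_intensity, key=carbon_intensity.get)

snapshot = {
    "us-east": 420.0,          # hypothetical gCO2e/kWh
    "europe-north": 45.0,      # hypothetical, mostly hydro/wind
    "asia-southeast": 510.0,   # hypothetical
}

best = pick_greenest_region(snapshot)
print(f"Schedule the run in {best}")
print(f"~{max(snapshot.values()) / snapshot[best]:.0f}x lower CO2e per kWh "
      "than the dirtiest option in this snapshot")
```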

Reusing and fine-tuning AI models

Instead of building new models from the ground up, fine-tuning pre-existing models for new tasks can lead to significant energy savings. For example, Stability AI’s StableLM suite offers open-source models adaptable for various uses, reducing the need for new, energy-intensive training sessions. According to an experiment conducted by Accenture, training a smaller "student" model that was only 6% the size of the original "teacher" model resulted in the same level of accuracy (99%) while consuming 2.7 times less energy.
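The "teacher-student" idea is essentially knowledge distillation: a small model is trained to mimic the soft predictions of a much larger one. Here's a toy sketch of that loop; the model sizes, temperature, and random data are placeholders and have nothing to do with the Accenture setup itself.

```python
# Toy knowledge distillation: a small "student" learns from a large "teacher".
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(32, 512), nn.ReLU(), nn.Linear(512, 10))
student = nn.Sequential(nn.Linear(32, 32), nn.ReLU(), nn.Linear(32, 10))  # far smaller

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 2.0
x = torch.randn(256, 32)                        # stand-in for a real dataset

with torch.no_grad():                           # teacher's soft targets, computed once
    teacher_probs = F.softmax(teacher(x) / temperature, dim=-1)

for _ in range(100):
    student_log_probs = F.log_softmax(student(x) / temperature, dim=-1)
    loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final distillation loss: {loss.item():.4f}")
```

Once trained, only the small student is deployed, which is where the energy savings show up at inference time.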

AI in renewable energy optimization

Beyond optimizing its own energy use, generative AI can aid the renewable energy sector. By predicting energy demands or optimizing renewable energy generation, AI can contribute to overall greenhouse gas reduction. This application of AI extends its role from being just an energy user to a facilitator of energy efficiency.

AI's sustainable future

The trajectory of AI's sustainability is multifaceted. From adopting energy-conserving computational methods like TinyML to optimizing content quality for better efficiency, the industry is geared towards a greener future. The focus is not just on minimizing AI's environmental impact but also on leveraging its capabilities to promote broader sustainability goals.

All in all, making AI tech greener is a complex yet achievable goal involving a combination of technological innovation, strategic planning, and a shift in operational paradigms. And many players in the space understand the importance of the task.

So how much of an ecocide is generative AI right now?

Generative AI, much like the rest of big tech, finds itself in a delicate balance between innovation and environmental impact. The world’s wealthiest, including tech giants, often face criticism for their carbon footprint. Yet, they also offer counterarguments focusing not just on their efforts toward sustainability but also on the immense economic value they bring. For instance, the top 1% of emitters contribute over 1000 times more CO2 than the bottom 1%, but they often justify this through substantial investments in climate innovation and economic growth.

Take Bill Gates: despite his high personal carbon footprint, he has argued that his investments in climate innovation offset his impact. This narrative is echoed by major tech companies, who argue that their strides in efficiency and renewable energy use mitigate their overall environmental impact.

Similarly, while generative AI's impact on the environment is significant, so is its potential economic contribution. McKinsey's research suggests that generative AI could add between $2.6 trillion and $4.4 trillion annually to the global economy, a figure that cannot be overlooked.

Not to mention, the most widely used generative AI product in the world, ChatGPT, developed by OpenAI, leverages Microsoft Azure's infrastructure for inference, training, and development. And Azure has been carbon neutral since 2012. Furthermore, Microsoft’s substantial investments in data center infrastructure, as OpenAI’s exclusive cloud provider, not only facilitate these sustainable practices but also scale them as demand for generative AI grows.

Andrew Van Noy, CEO of DeepPower, encapsulates this balance: "Generative AI's energy consumption is indeed significant, but context matters. Both generative AI and Google's search engine, for instance, have substantial energy footprints but also drive immense value and advancements." Van Noy acknowledges Microsoft Azure's role in mitigating these impacts, emphasizing that such corporate initiatives towards carbon neutrality are vital for sustainable tech development. "The focus should be on balancing their utility with environmental concerns," he adds.

So, is generative AI as bad for the environment as it's made out to be? The evidence suggests a more nuanced picture. With sustainable data centers and a commitment to net-zero goals, generative AI's current environmental impact may not be as dire as some fear. But here’s another brain teaser: as we embrace generative AI, are we truly safeguarding our planet, or are we merely witnessing a grand illusion where environmental debts are conveniently erased with the stroke of tech money? Makes you wonder.
