What is the energy footprint of training a large language model?
Did you know that generating a single image with AI can use almost as much energy as charging your smartphone?
In fact, generating an image with a powerful AI model takes as much energy as fully charging your smartphone, according to a new study by researchers at the AI startup Hugging Face and Carnegie Mellon University. Using an AI model to generate text, however, is significantly less energy-intensive: generating text 1,000 times uses only about 16% of a full smartphone charge.
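To get a feel for what that 16% figure means per query, here is a quick back-of-the-envelope sketch. The smartphone battery capacity (~0.012 kWh) is our own assumption for illustration, not a number from the study:

```python
# Back-of-the-envelope energy per AI text generation.
# ASSUMPTION: a typical smartphone battery holds ~0.012 kWh of energy;
# this is an illustrative estimate, not a figure from the study.
PHONE_CHARGE_KWH = 0.012     # assumed energy for one full smartphone charge
FRACTION_OF_CHARGE = 0.16    # 1,000 generations ~= 16% of a charge (study figure)

kwh_per_1000 = PHONE_CHARGE_KWH * FRACTION_OF_CHARGE
wh_per_1000 = kwh_per_1000 * 1000    # convert kWh to watt-hours
wh_per_query = wh_per_1000 / 1000    # split across the 1,000 generations

print(f"1,000 text generations: {kwh_per_1000:.5f} kWh")
print(f"Per generation: {wh_per_query:.5f} Wh")
```

Under that assumption, a single text generation works out to only a couple of thousandths of a watt-hour, which is why text is so much cheaper, energy-wise, than images.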
The study is the first time researchers have calculated the carbon emissions caused by using an AI model for different tasks, says Sasha Luccioni, an AI researcher at Hugging Face who led the work. She hopes understanding these emissions could help us make informed decisions about how to use AI in a more planet-friendly way.
Luccioni and her team looked at the emissions associated with 10 popular AI tasks on the Hugging Face platform, such as question answering, text generation, image classification, captioning, and image generation. They ran the experiments on 88 different models. For each task, such as text generation, Luccioni ran 1,000 prompts and measured the energy used with a tool she developed called CodeCarbon, which makes these calculations by tracking the energy the computer consumes while running the model. The team also calculated the emissions generated by doing these tasks with eight multipurpose generative models, each trained to handle a variety of tasks.
Generating images was by far the most energy- and carbon-intensive AI-based task. Generating 1,000 images with a powerful AI model, such as Stable Diffusion XL, is responsible for roughly as much carbon dioxide as driving the equivalent of 4.1 miles in an average gasoline-powered car. In contrast, the least carbon-intensive text generation model they examined was responsible for as much CO2 as driving 0.0006 miles in a similar vehicle.
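To make those driving comparisons concrete, this minimal sketch converts them into grams of CO2. The conversion factor of roughly 404 g CO2 per mile for an average gasoline car is an EPA estimate we are assuming here; it is not a number from the study:

```python
# Convert the study's driving-distance comparisons into CO2 mass.
# ASSUMPTION: ~404 g CO2 per mile for an average gasoline car (EPA estimate).
G_CO2_PER_MILE = 404

miles_per_1000_images = 4.1     # Stable Diffusion XL, per the study
miles_per_1000_texts = 0.0006   # least carbon-intensive text model examined

co2_images_g = miles_per_1000_images * G_CO2_PER_MILE  # grams per 1,000 images
co2_texts_g = miles_per_1000_texts * G_CO2_PER_MILE    # grams per 1,000 generations

print(f"1,000 images: ~{co2_images_g:.0f} g CO2 (~{co2_images_g / 1000:.2f} g per image)")
print(f"1,000 texts:  ~{co2_texts_g:.2f} g CO2")
print(f"Images are ~{co2_images_g / co2_texts_g:,.0f}x more carbon-intensive per query")
```

Under that assumption, 1,000 images come out to roughly 1.7 kg of CO2, while 1,000 text generations from the most efficient model are a fraction of a gram.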
Using a model is only half the story, though. Training one, in particular, is extremely energy-intensive, consuming much more electricity than traditional data center activities.
Imagine a giant, super-powered brain constantly learning and making connections. That’s kind of what a large language model (LLM) is like. To train these powerhouses, we feed them massive amounts of data, like text and code, and they use complex algorithms to learn from it. It’s like teaching a baby with a million flashcards at once, and it takes a lot of energy to keep that little (or not so little) brain humming.
Training a large language model like GPT-3, for example, is estimated to use just under 1,300 megawatt-hours (MWh) of electricity, about as much power as 130 US homes consume in a year. To put that in context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. That means you’d have to watch 1,625,000 hours of Netflix to consume the same amount of power it takes to train GPT-3.
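The Netflix comparison follows directly from those two numbers; here is the arithmetic spelled out:

```python
# Sanity-check the streaming comparison: how many hours of Netflix
# (at ~0.8 kWh per hour) equal the ~1,300 MWh estimated for training GPT-3?
TRAINING_MWH = 1300
NETFLIX_KWH_PER_HOUR = 0.8

training_kwh = TRAINING_MWH * 1000                     # 1 MWh = 1,000 kWh
hours_of_netflix = training_kwh / NETFLIX_KWH_PER_HOUR

print(f"Training energy: {training_kwh:,} kWh")
print(f"Equivalent Netflix streaming: {hours_of_netflix:,.0f} hours")
```

That is 1,625,000 hours, or more than 185 years of nonstop streaming.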
But it’s difficult to say how a figure like this applies to current state-of-the-art systems. The energy consumption could be higher, because AI models have been steadily trending upward in size for years, and bigger models require more energy. On the other hand, companies might be using some of the proven methods to make these systems more energy-efficient, which would dampen the upward trend of energy costs.
But before we all go unplug our Alexa, let’s take a step back. It’s important to remember that AI is still a young field, and researchers are constantly working on ways to make it more energy-efficient. Some cool solutions include:
- Using specialized hardware: Just like using the right tool for the job, creating special chips designed for AI tasks can significantly reduce energy consumption.
- Optimizing algorithms: Think of it like being more fuel-efficient with your car. By fine-tuning the algorithms used in LLMs, we can get the same results with less power.
- Using renewable energy sources: Powering AI systems with solar panels, wind turbines, and other clean sources can significantly reduce their environmental impact.
So, while AI does have an energy footprint, it’s not all doom and gloom. Just like any technology, it’s important to be aware of its impact and work towards making it more sustainable. But hey, at least AI isn’t leaving the lights on in every room like your teenager! Remember, it’s all about finding the balance between innovation and responsibility, and AI is no different!