
The "AI has an Energy Problem" Edition

Marco Andre
May 28, 2024

Yeah, it's not only good news. I listened to this podcast with Sasha Luccioni, and she brilliantly explained the challenges and some possible solutions.

Starting with the problems:

- Training a Large Language Model (like ChatGPT) emits as much carbon as 5 cars over their entire lifetimes, and consumes as much energy as 130 American homes use in a year.

- Running a prompt on ChatGPT requires 10x more energy than a Google search.
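That 10x gap adds up fast at scale. Here's a back-of-envelope sketch: the ~0.3 Wh-per-search figure is my assumption (an often-cited Google estimate), not from the podcast; only the 10x multiplier comes from the claim above.

```python
# Back-of-envelope energy comparison: web search vs. LLM prompt.
# ASSUMPTION: ~0.3 Wh per Google search (an often-cited estimate).
# The 10x multiplier for an LLM prompt is the figure quoted above.
WH_PER_SEARCH = 0.3
LLM_MULTIPLIER = 10

wh_per_prompt = WH_PER_SEARCH * LLM_MULTIPLIER

# Scale both up to one million queries, converted to kWh.
queries = 1_000_000
search_kwh = WH_PER_SEARCH * queries / 1000  # ~300 kWh
prompt_kwh = wh_per_prompt * queries / 1000  # ~3,000 kWh

print(f"1M searches: ~{search_kwh:.0f} kWh")
print(f"1M LLM prompts: ~{prompt_kwh:.0f} kWh")
```

Under these (rough) assumptions, a million prompts cost a few megawatt-hours, roughly what a couple of homes use in a year. The absolute numbers are debatable; the 10x ratio is the point.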

And now the possible solutions:

- Development of Small Language Models, which are more energy-efficient to train and run (I will write about this in one of the upcoming newsletters).

- Being more aware of the impact of AI models. Sasha created a tool called Code Carbon, which gives developers and leaders a sense of the environmental impact of each model they choose. She's also working on an Energy Star rating (like the one on your fridge) for Large Language Models, which you or I could use to decide which tech to adopt.

- Thinking about when we really need the most recent tech, versus using a simpler tool that's fit for purpose (for instance, Excel for calculations). Think of using a fan instead of turning on the air-con when it's really hot.

Why is this relevant? It's not intended to be yet another argument against AI.

But as I always mention, we need to be aware of the good, the bad, and the ugly.

And as my friend Claudio Truzzi, MBA, PhD mentioned, there are also energy-efficiency gains from doing things faster. For example, certain human and machine tasks take far less time with AI, which can lead to net energy savings.

We need to continue asking the right questions.

So we can make AI work for us.

See you next week.