There’s been a ton of talk about how cool artificial intelligence (AI) is and the awesome stuff it’s doing, from making healthcare better to creating video games that are seriously next level. But have you ever wondered about the other side of AI – the not-so-great stuff? Just like your laptop or phone, AI needs power to run. The kicker is that it uses a heck of a lot more power than your average gadget. In this article, we’ll look at how much power AI uses, why that’s a problem, and what we can do to make it better.
Before we jump into the nitty-gritty, we need to talk about cryptocurrency, like Bitcoin. Why? Well, AI and cryptocurrency mining both rely on graphics processing units (GPUs) to do their heavy lifting. These GPUs are power-hungry, and crypto mining became infamous for drawing so much electricity that it was like adding millions of extra houses to the grid. It also helped drive the global chip shortage that ran into 2022. Keep that in mind as we start talking about AI.
Okay, so AI uses the same GPUs as cryptocurrency mining – often even more of them. Big AI systems like ChatGPT and Google Bard need a whole lot of power to process the huge amounts of information they work with. Sasha Luccioni, a researcher who studies ethical and sustainable AI, thinks we need to talk about this more. If we’re not careful, AI could end up like cutting down forests to make room for the technology that monitors deforestation.
Here’s the thing though: it’s tough to figure out exactly how much power AI uses. AI companies and chip manufacturers don’t exactly advertise their energy usage, and AI’s impact isn’t as obvious as something like car exhaust. Because AI runs in the background, people often don’t realize the scale of power it uses, which makes it easier to ignore.
Now let’s talk numbers. One study estimated that training GPT-3 used up about 3.5 million liters of water – and with less efficient data centers, that could climb to 5 million liters. As for electricity, estimates suggest that training GPT-3 consumed 1,287 MWh of energy and produced as much carbon as flying from New York to San Francisco 550 times. And with the arrival of even bigger AI systems like GPT-4, those numbers could go up even more.
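To get a feel for what those figures mean, here’s a rough back-of-the-envelope sketch in Python. The training energy, flight count, and water volume are the numbers quoted above; the household, per-flight, and swimming-pool values are loose assumptions plugged in purely for scale, so treat the output as a ballpark, not a measurement.

```python
# Quick back-of-the-envelope context for the GPT-3 training figures above.
# The 1,287 MWh, "550 NY-SF flights" and 3.5 million liters come from the
# article; the household, per-flight and pool values are rough assumptions
# used only for scale, not measured data.

TRAINING_ENERGY_MWH = 1_287        # reported estimate for training GPT-3
HOUSEHOLD_KWH_PER_YEAR = 10_500    # assumed average US household usage
FLIGHT_TCO2_PER_PASSENGER = 0.9    # assumed NY-SF flight, per passenger
OLYMPIC_POOL_LITERS = 2_500_000    # rough volume of an Olympic swimming pool

household_years = TRAINING_ENERGY_MWH * 1_000 / HOUSEHOLD_KWH_PER_YEAR
implied_carbon_tonnes = 550 * FLIGHT_TCO2_PER_PASSENGER
pools = 3_500_000 / OLYMPIC_POOL_LITERS

print(f"Training energy ~ {household_years:.0f} US household-years of electricity")
print(f"Implied carbon  ~ {implied_carbon_tonnes:.0f} tonnes of CO2e")
print(f"Cooling water   ~ {pools:.1f} Olympic swimming pools")
```

Run it and you get roughly 120 household-years of electricity, around 500 tonnes of CO2e, and about a pool and a half of water – handy mental anchors even if the exact inputs are debatable.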
So here’s the dilemma: we want AI to keep getting better, but we don’t want to fry the planet. Some researchers have proposed ways to make AI greener, but those usually come at the cost of making the AI less capable. Finding the right balance will be key. If we don’t make some changes, AI could consume more energy than the entire human workforce by 2025. In fact, machine learning and data storage could account for 3.5% of the world’s electricity by 2030.
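For a sense of what 3.5% of the world’s electricity would actually look like, here’s another quick sketch. The 3.5% share is the projection quoted above; the global demand figure is just an assumed round number for 2030, not a forecast of ours.

```python
# Rough scale check on the "3.5% of world electricity by 2030" projection.
# The 3.5% share comes from the article; annual global demand below is an
# assumed round number, used only to express the percentage in TWh terms.

GLOBAL_ELECTRICITY_TWH = 30_000   # assumed ballpark for world demand in 2030
AI_SHARE = 0.035                  # article's projection for ML + data storage

ai_twh = GLOBAL_ELECTRICITY_TWH * AI_SHARE
print(f"3.5% of ~{GLOBAL_ELECTRICITY_TWH:,} TWh is about {ai_twh:,.0f} TWh a year")
```

That works out to on the order of a thousand terawatt-hours a year under that assumption – more electricity than many countries use in total.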
We need to treat AI’s impact on the environment like we treat cryptocurrency’s impact. We need to think about how to use AI responsibly without making our planet worse off. More transparency from AI companies about their energy use would help. And experts need to work together to come up with smart solutions that make AI better without using more power.
AI is doing a lot of great stuff. But just like with anything else, we need to think about the cost. If we’re not careful, AI could be a serious drain on our resources. It’s important we figure out exactly how much power AI uses and find ways to make it more sustainable. We can make the most of AI, but we need to do it in a way that won’t wreck our planet.
1. What is the connection between AI and cryptocurrency?
Both AI and cryptocurrency mining use graphics processing units (GPUs), which require a significant amount of power. Just as cryptocurrency mining led to high electricity consumption and contributed to a global chip shortage, AI’s extensive use of GPUs can also have substantial environmental implications.
2. How much energy does AI use?
Exact figures are difficult to provide due to a lack of transparency from AI companies and chip manufacturers. However, some studies suggest that training large AI models consumes massive amounts of power. For instance, training GPT-3 is estimated to have used 1,287 MWh of energy. With newer, larger AI systems like GPT-4, this consumption is expected to rise even further.
3. Why is it difficult to measure AI’s environmental impact?
Measuring AI’s environmental impact is challenging because its operations are intangible and often invisible to end users, unlike more evident forms of technology such as vehicles. Additionally, AI companies and chip manufacturers have been reluctant to share specific energy usage details, making AI’s environmental footprint hard to quantify.
4. What can we do to reduce the environmental impact of AI?
We can reduce AI’s environmental impact by demanding greater transparency from AI companies and chip manufacturers regarding their energy usage. This would provide a more accurate assessment of AI’s environmental footprint. In addition, we can encourage researchers and technology experts to devise solutions that optimize AI’s intelligence while minimizing its energy consumption.
5. What does the future look like if we don’t manage AI’s environmental impact?
If we don’t make significant changes, the energy consumption of AI could exceed that of the entire human workforce by 2025. Machine learning training and data storage alone could account for 3.5% of global electricity consumption by 2030. This excessive use of resources emphasizes the importance of making AI more sustainable.
6. Does AI’s water usage also pose an environmental problem?
Yes, AI’s water usage is a significant environmental concern. The cooling systems for the data centers that host AI operations require substantial amounts of water. For example, training GPT-3 reportedly used around 3.5 million liters of water.
Source: The Guardian