Address
33-17, Q Sentral.
2A, Jalan Stesen Sentral 2, Kuala Lumpur Sentral,
50470 Federal Territory of Kuala Lumpur
Contact
+603-2701-3606
[email protected]
Artificial intelligence (AI) technology is growing rapidly, but so is its appetite for energy. Newer, larger AI models, like those behind chatbots and recommendation systems, require enormous amounts of memory and computing power. This heavy demand drives up electricity use, often supplied by sources that aren’t great for the environment. Data centers, which house all this technology, consume large quantities of electricity and water, raising concerns about AI’s impact on our planet.
As AI technology continues to grow, it’s important to make sure it does so in a way that doesn’t hurt our environment. Companies are starting to realize this and are looking for ways to make AI more energy-efficient.
AMD, a big player in the tech industry, has set a goal to make its computing systems 30 times more energy-efficient by 2025. This plan, known as “30×25,” is AMD’s commitment to better energy use. As of 2024, they had already made their systems 13.5 times more efficient, thanks to new technology like AMD Instinct MI300A APUs and 4th Gen EPYC “Genoa” CPUs. They’ve come a long way, but they still have more to do to hit that 30x target.
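As a rough sanity check on how much ground remains, here is a quick calculation (assuming the 13.5x and 30x figures are multipliers measured against the same baseline year):

```python
# AMD's public "30x25" figures: a 30x efficiency goal by 2025,
# with a reported 13.5x improvement already achieved as of 2024.
goal = 30.0      # target efficiency multiplier
achieved = 13.5  # reported multiplier as of 2024

# Efficiency gains compound multiplicatively, so the remaining
# improvement is the ratio of the goal to what's been achieved.
remaining = goal / achieved
print(f"Remaining improvement needed: {remaining:.2f}x")  # about 2.22x
```

In other words, AMD needs roughly another 2.2x gain on top of current hardware to close out the target, which is why the final stretch of such goals is often the hardest.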
Past successes back this up: with their “25×20” initiative, AMD exceeded its goal by making mobile processors 31.7 times more efficient by 2020, showing they can hit tough targets. This gives us hope that they’ll reach their new efficiency goals too.
AMD’s strategy goes beyond just making better hardware. They are looking at the whole setup—improving how CPUs, GPUs (graphics processing units), and other parts work together and tweaking the software that runs on them. They focus a lot on reducing the energy it takes to shuffle data around inside these systems because as AI models get bigger, moving data uses up a lot of power.
AMD’s work has not gone unnoticed. Their technology powers some of the world’s most energy-efficient supercomputers, earning them spots on the GREEN500 list. For example, the Frontier Test and Development System at Oak Ridge National Laboratory, which uses AMD tech, was once the top-ranked system on this list, highlighting AMD’s ability to balance high performance with energy efficiency.
AMD is working on new EPYC server processors and MI325X accelerators that will likely push the boundaries of what’s possible in energy efficiency. These upcoming technologies, along with better software, are expected to help AMD meet—and possibly exceed—their 30×25 efficiency goal.
As AI continues to evolve, we might see more AI processing being done on local devices like smartphones and smart appliances, instead of in large data centers. This approach, known as edge computing, could help reduce the energy used by AI systems.
AMD is preparing for this future by making sure their technology works well in these smaller, low-energy devices. Their focus on energy efficiency will be key as AI starts moving from big data centers to the edge, closer to where it’s needed.
Learn about AMD’s 30×25 initiative and how they are working to make AI less power-hungry, benefiting both our tech and our planet.
1. What is AMD’s 30×25 initiative?
AMD’s 30×25 initiative is a commitment to increase the energy efficiency of their high-performance computing platforms by 30 times by the year 2025. This goal focuses on reducing the environmental impact of AI technologies and improving the overall energy use of computing systems that power AI applications.
2. How has AMD progressed towards their energy efficiency goals?
As of 2024, AMD has achieved a 13.5x improvement in energy efficiency. This progress is partly due to advancements in their hardware, such as the AMD Instinct MI300A APUs and 4th Gen EPYC “Genoa” CPUs. AMD’s past success in surpassing their “25×20” goal, where they improved the efficiency of mobile processors by 31.7 times, also showcases their ability to meet challenging targets.
3. Why is reducing data movement important in AI systems?
In AI systems, a significant amount of energy is used to move data between different components like GPUs, memory, and processors. As AI models grow in size, this data movement consumes more energy. By reducing the need for data to travel long distances within the system, AMD can decrease the overall energy consumption, making the systems more efficient and environmentally friendly.
Source: Forbes