AI systems, particularly those involving generative AI and large language models, are highly energy-intensive:
– Training Large Language Models: Training models such as Generative Pre-trained Transformer 3 (GPT-3) and GPT-4 requires substantial energy. Training GPT-3 is estimated to have used nearly 1,300 megawatt-hours (MWh) of electricity, equivalent to the annual power consumption of about 130 US homes. GPT-4, being more advanced, is estimated to use approximately 50 times more electricity than GPT-3.
– Operational Energy Demand: Running AI models such as ChatGPT also consumes significant energy. A single ChatGPT request uses about 2.9 watt-hours of electricity, roughly 10 times more than a standard Google search (a quick back-of-envelope check follows this list).
– Data Centre Energy Use: Growing demand for AI is driving the expansion of data centres, which are major consumers of electricity. Data centres, cryptocurrencies, and AI together are projected to account for almost 2% of global electricity consumption by 2027, comparable to the annual consumption of a small country such as the Netherlands.
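A quick back-of-envelope check makes these ratios concrete. The per-home and per-search baselines below are common published estimates rather than figures from the sources above, so treat them as assumptions:

```python
# Sanity-check of the figures quoted above (illustrative only).
TRAINING_MWH = 1300             # estimated GPT-3 training energy, from the text
US_HOME_MWH_PER_YEAR = 10       # ~10 MWh/year per average US home (assumption)
print(TRAINING_MWH / US_HOME_MWH_PER_YEAR)          # -> 130.0 homes for a year

CHATGPT_WH_PER_QUERY = 2.9      # per-request figure from the text
GOOGLE_WH_PER_SEARCH = 0.3      # widely cited Google estimate (assumption)
print(CHATGPT_WH_PER_QUERY / GOOGLE_WH_PER_SEARCH)  # -> ~9.7, i.e. roughly 10x
```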
Despite the high energy costs, several strategies are being implemented to optimise AI processes for energy efficiency:
Reducing Computational Load
Advanced video compression techniques can reduce the computational load of processing and analysing video data, lowering energy consumption without compromising output quality. More efficient codecs, for example, significantly reduce the amount of data that needs to be moved and processed, and with it the energy required; a rough illustration follows.
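As a sketch of the effect, the snippet below compares how much data a pipeline would ingest for ten minutes of 1080p video under different codecs. The bitrates are ballpark assumptions, not measurements:

```python
# Approximate data volume to ingest for 10 minutes of 1080p video.
# Bitrates are rough, assumed averages; real figures vary with content.
BITRATES_MBPS = {
    "raw (uncompressed, ~30 fps)": 1500,
    "H.264": 8,
    "H.265/HEVC": 4,
    "AV1": 3,
}

MINUTES = 10
for codec, mbps in BITRATES_MBPS.items():
    gigabytes = mbps * MINUTES * 60 / 8 / 1000   # Mbit/s over 10 min -> GB
    print(f"{codec:>28}: {gigabytes:7.1f} GB")
```

To a first approximation, less data moved and stored means less energy spent before analysis even begins.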
Scaling Transformer Architectures
– Optical Neural-Network Accelerators: One route to scaling transformer architectures efficiently is optical neural-network accelerators, which promise a significant energy-efficiency advantage over conventional digital-electronic processors. Studies suggest a 100× to 8,000× advantage when running large transformer models, depending on the scale and design of the hardware (a toy illustration of this range follows the list).
– Model Optimisation: Optimising transformer models through techniques such as model compression, quantisation, and learnable weighted feature-attention matrices can also reduce energy consumption. These methods make more efficient use of computational resources, lowering energy demand without sacrificing performance (a minimal quantisation sketch also follows the list).
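To make the optical-accelerator range tangible, here is a toy calculation applying the quoted 100× to 8,000× factor to the 2.9 Wh per-request figure from earlier. This conflates raw transformer compute with a full serving stack, so it is purely illustrative:

```python
# Toy application of the quoted 100x-8000x energy-efficiency range.
QUERY_WH = 2.9                   # per-request figure from earlier in the text
for advantage in (100, 8_000):
    print(f"{advantage:>5}x -> {QUERY_WH / advantage * 1000:6.2f} mWh per query")
```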
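And as a minimal sketch of the quantisation technique mentioned above, the PyTorch snippet below applies post-training dynamic quantisation to a stand-in model. The layer sizes are arbitrary, and the actual energy saving depends on the deployment hardware:

```python
import torch
import torch.nn as nn

# A small stand-in for a transformer feed-forward block (arbitrary sizes).
model = nn.Sequential(
    nn.Linear(512, 2048),
    nn.ReLU(),
    nn.Linear(2048, 512),
)

# Dynamic quantisation stores Linear weights as 8-bit integers and
# quantises activations on the fly at inference time, shrinking memory
# traffic and arithmetic cost.
quantised = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    out = quantised(torch.randn(1, 512))
print(out.shape)  # torch.Size([1, 512])
```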
The energy costs associated with running AI systems are substantial, but ongoing efforts to optimise these processes are promising. By leveraging advanced video compression techniques, scaling transformer architectures effectively, and using AI for energy management, it is possible to reduce the energy consumption of AI systems while maintaining their performance. These strategies are crucial as the demand for AI continues to grow, ensuring that the benefits of AI are realised without exacerbating environmental and energy challenges.
I think AI has been life-changing. One of my friends referred to it as a “calculator” the other day. We still need to be mindful of the energy it requires, and honestly, how delightful would it be to power it with the sun?!
References
World Economic Forum. “AI and energy: Will AI reduce emissions or increase demand?” (https://www.weforum.org/stories/2024/07/generative-ai-energy-emissions/)
FEPBL. “Reviewing the Role of Artificial Intelligence in…” (https://fepbl.com/index.php/estj/article/view/1015)
PMC. “Transformers for Energy Forecast” (https://pmc.ncbi.nlm.nih.gov/articles/PMC10422371/)
Polytechnique Insights. “Generative AI: energy consumption soars” (https://www.polytechnique-insights.com/en/columns/energy/generative-ai-energy-consumption-soars/)
Praxie. “From Waste to Optimization: AI’s Impact on Energy Efficiency in Manufacturing” (https://praxie.com/ai-for-energy-efficiency-in-manufacturing/)
OpenReview. “Scaling of Optical Transformers” (https://openreview.net/forum?id=fm8p9MDkd9)