WSEAS Transactions on Systems
Print ISSN: 1109-2777, E-ISSN: 2224-2678
Volume 23, 2024
The Escalating AI’s Energy Demands and the Imperative Need for Sustainable Solutions
Author:
Abstract: Large Language Models (LLMs), such as GPT-4, represent a significant advancement in contemporary Artificial Intelligence (AI), demonstrating remarkable capabilities in natural language processing, customer service automation, and knowledge representation. However, these advances come with substantial energy costs. Training and deploying LLMs require extensive computational resources, leading to escalating energy consumption and environmental impact. This paper examines the driving factors behind the high energy demands of LLMs through the lens of the Technology Environment Organization (TEO) framework, assesses their ecological implications, and proposes sustainable strategies for mitigating these challenges. Specifically, we explore algorithmic improvements, hardware innovations, renewable energy adoption, and decentralized approaches to AI training and deployment. Our findings contribute to the literature on sustainable AI and provide actionable insights for industry stakeholders and policymakers.
Keywords: Artificial Intelligence (AI), energy consumption, environmental impact, Large Language Models (LLMs), policy regulation, renewable energy, sustainable AI, Technology Environment Organization (TEO)
Pages: 444-457
DOI: 10.37394/23202.2024.23.46