While AI's health care applications look promising, its high power consumption could have severe side effects for the environment
Artificial intelligence has shown a lot of promise in health care, with tests showing that it can do some tasks better than doctors and assist them in doing others. However, a report in the journal Joule shows that if AI is widely adopted, it could have an energy footprint so large that it would exceed the power demands of some countries.
The report notes that as demand for AI services continues to grow, so will the energy they consume. AI has already seen rapid growth since 2022. Training an AI model requires feeding the software large amounts of data, an energy-intensive process. For example, according to the report, Hugging Face, an AI company based in New York, reported that its multilingual text-generating AI tool consumed about 433 megawatt-hours (MWh) during training, enough to power 40 average American homes for a year.
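As a rough sanity check on that comparison, the training figure can be divided by typical US household usage. The household figure below is an assumption (roughly 10,800 kWh per home per year, an approximate US average not stated in the article):

```python
# Rough check of the reported training-energy comparison.
# Assumption: an average US home uses about 10,800 kWh per year
# (approximate figure; not stated in the article).

training_energy_kwh = 433 * 1000       # 433 MWh converted to kWh
home_annual_kwh = 10_800               # assumed average US household use

homes_powered_for_a_year = training_energy_kwh / home_annual_kwh
print(round(homes_powered_for_a_year))  # ≈ 40 homes
```

The result lines up with the report's "40 average American homes for a year" comparison.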
AI's energy consumption is not limited to training. The report states that when an AI system generates output in response to prompts, it also uses a significant amount of computing power and energy. Running the widely used ChatGPT could cost 564 MWh of electricity a day, according to researchers.
The report states that even though companies are working to improve the efficiency of AI hardware and software and thereby reduce its energy requirements, these efficiency gains can end up increasing overall demand: the more the technology advances, the more resources may be used.
The report points to Google as an example. The company has been incorporating generative AI into its email service and is testing an AI-powered version of its search engine. Google processes up to nine billion searches a day, and based on these figures, researchers estimate that if every Google search used AI, it would need about 29.2 TWh of power a year, equivalent to the annual electricity consumption of Ireland.
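Working backwards from the numbers cited, one can estimate the energy each AI-assisted search would imply. The per-search figure below is derived from the article's two stated numbers, not itself stated in the article:

```python
# Back-of-the-envelope check of the Google estimate.
searches_per_day = 9e9       # up to nine billion searches a day
annual_energy_twh = 29.2     # projected annual consumption if every search used AI

annual_searches = searches_per_day * 365
energy_per_search_wh = annual_energy_twh * 1e12 / annual_searches
print(f"{energy_per_search_wh:.1f} Wh per AI-assisted search")  # ≈ 8.9 Wh
```

That works out to roughly 9 Wh per search, an order of magnitude more than a conventional search is commonly estimated to use, which is what drives the country-scale total.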
For now, this scenario is unlikely because of the high cost of additional AI servers and their limited availability. However, researchers note that production of AI servers is projected to grow rapidly in the near future. Based on those production projections, the report estimates that by 2027 worldwide AI-related electricity consumption could increase by 85 to 134 TWh annually, an amount comparable to the annual electricity consumption of countries such as the Netherlands, Argentina, and Sweden. In addition, improvements in AI efficiency could enable developers to repurpose some computer processing chips for AI use, which could further increase AI-related electricity consumption.
Researchers conclude that a careful approach to how AI is used is required: to keep energy consumption in check, AI should not be applied where it is not actually needed.