Environmental Cost of Artificial Intelligence

Updated on: 10-Mar-2023 08:10 AM


Artificial intelligence has gained significant attention in the tech industry as it is expected to revolutionize several trillion-dollar industries such as retail and medicine. However, the development of each new chatbot or image generator consumes a substantial amount of electricity, adding to a considerable and growing volume of carbon emissions that drive global warming.

Global Warming

Major companies like Microsoft Corp., Alphabet Inc.'s Google, and OpenAI, the creator of ChatGPT, use cloud computing that relies on thousands of chips in servers within massive data centers around the world to train AI algorithms. These algorithms, known as models, analyze data in order to learn how to perform tasks. The success of ChatGPT has led other companies to race to release rival AI systems and chatbots, or to build products that use large AI models to deliver features to a wide range of users, from Instacart shoppers to Snap users to CFOs.

Compared to other forms of computing, AI consumes more energy, and training a single model can use more electricity than 100 US homes consume in a year. Even so, because the industry is growing rapidly and discloses little, the total electricity consumption and carbon emissions attributable to AI remain unknown. The emissions also vary greatly depending on the type of power plant supplying the electricity: a data center drawing power from coal- or natural gas-fired plants will produce significantly higher emissions than one running on solar or wind farms.
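To make the relationship between energy use and emissions concrete, a rough back-of-the-envelope calculation multiplies the electricity consumed by training by the carbon intensity of the grid that supplies it. The sketch below does exactly that; all of the numbers in it are illustrative assumptions, not measurements of any real model or grid.

```python
# Rough illustration: emissions = energy consumed (kWh) x grid carbon intensity (kg CO2 per kWh).
# Every figure below is an illustrative assumption, not a measured value for any real model.

TRAINING_ENERGY_KWH = 1_300_000  # assumed energy to train one large model (~100 US homes for a year)

# Assumed average carbon intensity of different power sources, in kg CO2 per kWh
GRID_CARBON_INTENSITY = {
    "coal": 0.9,
    "natural_gas": 0.4,
    "wind_or_solar": 0.02,
}

def training_emissions_tonnes(energy_kwh: float, intensity_kg_per_kwh: float) -> float:
    """Estimate training emissions in tonnes of CO2."""
    return energy_kwh * intensity_kg_per_kwh / 1000.0

for source, intensity in GRID_CARBON_INTENSITY.items():
    tonnes = training_emissions_tonnes(TRAINING_ENERGY_KWH, intensity)
    print(f"{source:>13}: ~{tonnes:,.0f} tonnes CO2")
```

Under these assumed figures, the same training run emits dozens of times more CO2 on a coal-heavy grid than on one dominated by wind and solar, which is why where and when a model is trained matters as much as how big it is.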

Although researchers have calculated the energy usage and carbon emissions of creating individual models, and some companies have disclosed their energy consumption, there is no comprehensive estimate of the overall amount of power that AI technology uses. Sasha Luccioni, a researcher at the AI company Hugging Face Inc., published a paper measuring the carbon footprint of its BLOOM model, a competitor to OpenAI's GPT-3, and has also attempted to estimate the same figure for OpenAI's popular ChatGPT using a limited set of publicly available data.

More Transparency

Transparency regarding the power consumption and emissions of AI models is crucial, according to researchers such as Luccioni. With this information, governments and companies can weigh the benefits of using large models such as GPT-3 for medical research or language preservation against the environmental cost; by contrast, using AI for frivolous tasks like writing rejected Seinfeld scripts or finding Waldo may be harder to justify. Increased transparency could also invite greater scrutiny, as in the crypto industry, where Bitcoin's significant energy consumption has attracted criticism and some nations have banned or restricted fossil-fuel-powered crypto mining.

Pledges for Net-Zero Emissions

Although AI models are growing in size, AI companies such as Microsoft, Google, and Amazon are continually working to make them run more efficiently. The three largest US cloud companies have all pledged to become carbon negative or carbon neutral. Google plans to achieve net-zero emissions across its operations by 2030 and aims to run its data centers and offices entirely on carbon-free energy. The company also uses AI to improve energy efficiency in its data centers, where AI technology directly controls cooling.

OpenAI has made its application programming interface for ChatGPT more efficient, reducing electricity usage and costs for its customers. The company runs on Azure and works with Microsoft to improve efficiency and reduce its footprint when running large language models. Similarly, Microsoft is investing in renewable energy and taking other measures to reach its goal of being carbon negative by 2030. It is also conducting research to measure the energy usage and carbon impact of AI and devising ways to make large systems more efficient, both during training and in use.

However, companies are generally reluctant to disclose which models they use and how much carbon they emit. To make AI run more efficiently, developers or data centers could schedule training at times when power is cheaper or in surplus. AI firms that train their models when power is in surplus could also advertise their green credentials as a selling point. "It can be a carrot for them to show that they're acting responsibly and acting green," said Ben Hertz-Shargel, an energy consultant at Wood Mackenzie.
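As a minimal sketch of what such carbon-aware scheduling might look like, the snippet below delays a training job until the grid's carbon intensity drops below a threshold. The `get_grid_carbon_intensity` function is a hypothetical placeholder for whatever data source a data center would actually query, and the threshold and interval values are assumptions chosen only for illustration.

```python
import time

CARBON_THRESHOLD = 0.2   # assumed limit in kg CO2 per kWh below which training may start
CHECK_INTERVAL_S = 1800  # re-check the grid every 30 minutes

def get_grid_carbon_intensity() -> float:
    """Hypothetical placeholder: return the current grid carbon intensity in kg CO2 per kWh.

    A real scheduler would query the local grid operator or a carbon-data provider here;
    the constant below simply stands in for that call.
    """
    return 0.15  # assumed reading, e.g. a windy afternoon with surplus renewable generation

def run_training_job() -> None:
    """Placeholder for launching the actual training workload."""
    print("Starting training while grid power is relatively clean...")

def wait_for_clean_power() -> None:
    """Delay training until carbon intensity falls below the assumed threshold."""
    while get_grid_carbon_intensity() > CARBON_THRESHOLD:
        time.sleep(CHECK_INTERVAL_S)
    run_training_job()

if __name__ == "__main__":
    wait_for_clean_power()
```

The same idea could also shift jobs between regions rather than across time, routing training to whichever data center currently has the cleanest or cheapest power.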
