As energy companies look to modernize their operations to reduce costs and improve efficiency, many are turning to technologies such as machine learning, robotics, and artificial intelligence (AI). But as with any new technology, it’s important to consider the whole picture. While AI and other technologies could help improve operations, both in fossil fuels and in green alternatives, how much energy does it take to power AI itself?
There has long been a discussion about the impact of AI on the planet, but amid the hype around impressive new technologies, it has mostly faded into the background. In 2019, researchers at OpenAI in San Francisco introduced the world to a learning algorithm that could move a robotic hand to manipulate a Rubik’s cube. This marked a huge leap forward in AI technology. At the time, it took over 1,000 computers, plus a dozen machines running specialized graphics chips, performing complex calculations for months to achieve this feat. The process consumed an estimated 2.8 gigawatt-hours of electricity, roughly the output of three nuclear power plants for one hour, although this figure has not been confirmed by the company.
AI technology is advancing at an impressive pace after decades of massive investment. But with this progress come concerns about the impact it could have on the environment. While the output may seem simple, as machines learn to answer questions, recognize images, and play games, the power required to accomplish these tasks is immense. Running artificial intelligence requires huge amounts of computing power and electricity to train models and run algorithms. Sasha Luccioni, postdoctoral researcher at Mila, an artificial intelligence research institute in Canada, explained: “The concern is that machine learning algorithms, in general, are consuming more and more energy, using more data, training longer and longer.”
In recent years, artificial intelligence has gradually become more integrated into our daily activities, such as answering questions via Alexa or Siri, getting directions from Google Maps, or identifying people in photos, all available on our phones and home computers. But few people consider how much power it takes to complete these seemingly simple tasks. We often compare machines to humans at this type of task, assuming that computers can answer questions just as our brains do, with relatively little effort. However, AI does not learn information in a structured way. It does not understand human concepts such as cause and effect, context, or analogies, which means it must rely on a “brute force” statistical technique known as deep learning.
A deep learning model is trained very differently from a human brain. For example, to learn to identify a picture of a cat, it is shown thousands of photos of cats that have been labeled by humans. The model does not understand that a cat is more likely than a dog to climb a tree or engage in other feline activities; it will associate such objects only if they appear together in the images. For the model to understand an image, it needs to be shown a vast range of examples until the statistical patterns emerge.
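To make the “brute force” idea concrete, here is a minimal, hypothetical sketch of the same training loop in miniature: a simple statistical model repeatedly exposed to labeled examples until its predictions match the labels. The toy data and the logistic-regression learner are illustrative assumptions, not the models discussed in the article; a real image classifier runs this basic loop over millions of photos and billions of parameters, which is where the energy cost comes from.

```python
# Hypothetical sketch: "brute force" statistical learning from labeled data.
# The features and labels below are invented toy data standing in for
# labeled cat/dog photos; the model only ever sees numbers and labels,
# never concepts like "cats climb trees".
import numpy as np

rng = np.random.default_rng(0)

# 200 toy "images", each reduced to 2 numeric features.
X = rng.normal(size=(200, 2))
# Label 1 = cat, 0 = dog, produced by a hidden rule the model must infer
# purely from correlations in the data.
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

w = np.zeros(2)  # model parameters, adjusted by repeated exposure
b = 0.0
lr = 0.1         # learning rate

for _ in range(500):  # many passes over the same labeled examples
    z = X @ w + b
    p = 1.0 / (1.0 + np.exp(-z))      # predicted probability of "cat"
    grad_w = X.T @ (p - y) / len(y)   # gradient of the cross-entropy loss
    grad_b = np.mean(p - y)
    w -= lr * grad_w                  # nudge parameters toward the labels
    b -= lr * grad_b

acc = np.mean((p > 0.5) == (y == 1))
print(f"training accuracy: {acc:.2f}")
```

Even this tiny example needs hundreds of repeated passes to fit two parameters; scaling the same procedure to modern models multiplies the computation, and therefore the electricity, by many orders of magnitude.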
Up until now, the response to using AI in energy operations to improve efficiency and reduce costs has been overwhelmingly positive. But experts now fear that the high energy needs of these technologies have been largely overlooked. If AI is used in energy projects to support modernization and decarbonization, it could cause more damage to the environment than we think. While it could revolutionize trillion-dollar industries, from energy to retail, building AI technologies such as chatbots and image generators will require massive amounts of electricity, which could increase global carbon emissions.
Right now, there’s a lot of ambiguity about how much energy is needed to power AI programs. For example, if you ask ChatGPT this question, it replies along the lines of “As an AI language model, I do not have a physical presence or directly consume energy.” The complexity of AI programs means they consume much more energy than other forms of computing, but the companies building these programs, such as Google and Microsoft, are not disclosing how much. We still know very little about how much electricity and water is needed to train and run their AI models, or what energy sources power their data centers.
We are seeing more and more companies incorporate AI into their operations as it becomes more widely available. Luccioni, who is also climate lead at the AI company Hugging Face, explained: “This exponential use of AI brings with it the need for more and more energy.” She added: “Yet we are seeing this shift of people using generative AI models just because they feel they should, without sustainability being considered.”
The rapid advancement of AI technology has led companies across all industries to incorporate AI programs into their operations as they seek to evolve. This can be seen as a positive, as businesses are embracing technology and modernization, which could make operations more efficient. However, the energy consumption of AI and similar technologies is often overlooked, meaning companies are rapidly adopting AI programs without assessing their environmental impact. Moving forward, it is important that information on AI’s energy use becomes more transparent and that governments regulate the sector in line with their climate policies.
By Felicity Bradstock for Oilprice.com