What Impact does AI have on Data Centres and Sustainability?

The artificial intelligence boom is increasing the need for computation and data transmission. What are the consequences?

The emergence of generative artificial intelligence (AI) has been the talk of the town in recent months.

The tools built on this technology are on everyone’s lips, and the headlines reflect it: the banning of ChatGPT in Italy, the investigation of the platform by the Spanish Data Protection Agency, and the request by Elon Musk and other technology entrepreneurs and experts to pause AI development for six months in order to analyse its risks and rethink where we want to go.

Another by-product of AI development is the increased need for computing capacity and data transmission. “With the development of AI, the volume of data being used has multiplied exponentially. The ability to obtain results in real time is increasingly important, so it is vital to have an infrastructure network with sufficient capillarity to send, receive and process this data close to where it is generated and with minimal latency,” says Ramón Cano, director of Managed Services at Equinix in Spain.

Meanwhile, Matías Sosa, product marketing manager at OVHcloud, considers that “we are facing a situation that presents us with three main challenges”. “Firstly, we must bear in mind that the speed and latency of the internet connection are crucial factors in the transmission of large amounts of data. However, the network infrastructure is not yet sufficiently developed in many parts of the world to meet all the data transmission demands involved in artificial intelligence,” he explains.

Secondly, he points out that “the hardware needed for AI, such as high-performance graphics processors, is relatively expensive today and demand for it continues to rise”. In addition, he notes, “a potential supply shortage could become an added problem”.

Finally, he notes that “the power consumption of AI poses a significant challenge to our societies”. “As the demand for computing capacity continues to increase, the need for energy will also increase, which will have implications not only in terms of cost, but also in terms of environmental impact.”

Similarly, Federico Vadillo, security expert at Akamai, believes that “the increasing demand for computing resources and the complexity of AI applications may lead to increased infrastructure and energy costs”. In addition, he acknowledges that “the increase in the amount of data to be transmitted can lead to network congestion and security issues”.

For his part, José Antonio Pinilla, CEO of Asseco Spain, says that “as AI models continue to advance and their use increases, the challenge will be to maintain sufficient computing capacity to support them”. In fact, he notes that “there is already talk that lack of computing power is an obstacle to AI development”. “The key will be to turn to supercomputers, new hardware architectures or, directly, to invest in cloud computing infrastructure and data centres,” he adds.

Industry response

The rise of AI poses many challenges, but industry is already responding. “The AI ‘boom’ has certainly led to a significant increase in computing capacity and data transmission needs. To address these needs, the technology industry has been working on improving and developing new technologies, infrastructures, and services,” says Vadillo.

“In terms of computing power, new graphics processing units (GPUs) and tensor processing units (TPUs) have been developed and commercialised that are specifically designed for the acceleration of deep learning and other AI applications. In addition, cloud computing services have been developed and enhanced, allowing businesses and users to access scalable and flexible computing resources without the need to invest in their own infrastructure,” he specifies.


In terms of data transmission, he notes that “telcos have been upgrading and expanding their broadband networks and have been investing in new technologies, such as 5G, to offer faster transmission speeds and responsiveness”.

In addition, he recalls that “new technologies are being developed, such as fog computing, which allows data to be processed at the edge of the network, reducing the amount of data that needs to be transmitted through the core network”.
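The core idea behind fog computing can be illustrated with a minimal sketch: process raw data locally and ship only a compact summary through the core network. The sensor scenario and all names below are illustrative, not drawn from any specific product.

```python
# Minimal sketch of the fog/edge idea: aggregate raw readings locally
# and send only a compact summary upstream. All figures are illustrative.

def summarise_at_edge(readings):
    """Reduce a batch of raw sensor readings to a small summary dict."""
    n = len(readings)
    return {
        "count": n,
        "mean": sum(readings) / n,
        "min": min(readings),
        "max": max(readings),
    }

# 1,000 raw samples collected at the edge...
raw = [20.0 + (i % 7) * 0.1 for i in range(1000)]

# ...but only a four-field summary crosses the core network.
payload = summarise_at_edge(raw)
print(payload["count"], round(payload["mean"], 2))
```

Instead of 1,000 values, only four numbers travel through the core network, which is precisely the traffic reduction the quote describes.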

Exponential growth in requirements

The demands placed on the development of AI are very high. “In terms of computing needs, OpenAI estimated as early as 2018 that the resources for training large models were growing exponentially, doubling every 3 to 4 months rather than every two years, as ‘Moore’s Law’ had accustomed us to. Consequently, we find ourselves in a stressed market, which needs to continue training models with a high consumption of resources and which demands more power in less time,” specifies OVHcloud’s product marketing manager.
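The gap between those two doubling rates compounds quickly. A small calculation makes the point, taking a doubling time of 3.5 months (within the 3-to-4-month range quoted above) against the classic two-year Moore's-law cadence:

```python
# Compound doubling: how much demand grows over a period, given a
# doubling time. 3.5 months follows the OpenAI estimate quoted in the
# text; 24 months is the classic Moore's-law cadence.

def growth_factor(months, doubling_time_months):
    return 2 ** (months / doubling_time_months)

two_years = 24
ai_demand = growth_factor(two_years, 3.5)   # ~116x
moore = growth_factor(two_years, 24)        # exactly 2x

print(f"AI training compute over 2 years: ~{ai_demand:.0f}x")
print(f"Moore's-law transistor budget:     {moore:.0f}x")
```

Over the same two years in which transistor density would traditionally double, training compute demand at that pace grows by roughly two orders of magnitude.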

As for data traffic, Sosa cites a Cisco study, which points out that “it could quadruple in the next five years”. “While it is true that this is partly due to the development of the IoT and the increase in devices connected to the internet, the growth of data generated by and for AI models could also have a considerable impact on traffic, possibly leading to an increase in technology investments and network deployments worldwide,” he says.

He also notes that “some studies suggest that demand for compute and data transmission capacity for AI applications could increase significantly in the coming years”. “According to a McKinsey Global Institute report, data traffic is expected to increase by 45% annually through 2025, driven in large part by the growing adoption of AI. In addition, demand for computing capacity for AI applications is expected to increase at a compound annual rate of 25-30% over the next five years.”

Furthermore, Pinilla stresses that “hardware represents a bottleneck for AI development and for meeting that need for computation and data traffic”. “Traditional computer chips, or central processing units (CPUs), are not well optimised for AI workloads. This results in reduced performance and increased power consumption. For example, the GPT-3 model, one of the largest ever created, has 175 billion parameters and, according to research by Nvidia and Microsoft Research, even if the model could be fitted on a single GPU, the sheer number of computational operations required would result in excessively long training times. As a result, training GPT-3 is estimated to take 288 years on a single Nvidia V100 GPU.”

However, Vadillo cautions that “these estimates may vary depending on factors such as the rate of AI adoption, the energy efficiency of the computing technology, and the availability of scalable computing resources”.

Environmental impact

The increased requirements needed for AI development may also have an environmental impact.

“If not designed efficiently by companies, AI models can be very energy-intensive, having to process massive volumes of data or run numerous iterations on the data to ensure accuracy and statistical significance in the model results,” says the Equinix manager.

According to Bloomberg, training an AI model can consume more electricity than 100 homes use in a year. For example, training GPT-3 required 1,287 megawatt-hours, according to research published in 2021. That is almost as much electricity as 120 homes consume in a year, since a household uses approximately 10.7 megawatt-hours annually.
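The household comparison is simple division, using the figures most often cited in the literature: roughly 1,287 MWh for GPT-3's training run and about 10.7 MWh of electricity per household per year.

```python
# Back-of-the-envelope check of the comparison in the text, using the
# commonly cited figures: ~1,287 MWh to train GPT-3 and roughly
# 10.7 MWh of electricity per household per year.

gpt3_training_mwh = 1287
household_mwh_per_year = 10.7

homes = gpt3_training_mwh / household_mwh_per_year
print(f"Equivalent to ~{homes:.0f} households for a year")  # ~120
```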

“According to the International Energy Agency, it is estimated that by 2030, data centres alone will consume 1,200 terawatt-hours of electricity per year, which is equivalent to the total electricity consumption of Japan and Germany combined,” says the CEO of Asseco Spain.

Similarly, Vadillo notes that “it is estimated that energy consumption by data centres worldwide could increase by 30% in the next decade”, with the greenhouse gas emissions that this entails. “Power generation is a major source of greenhouse gas emissions, so increased energy consumption by AI systems could increase CO2 emissions. According to some studies, CO2 emissions associated with AI use could reach 4 gigatonnes by 2030,” he notes.

Pinilla also notes that “a University of Massachusetts study indicates that the carbon footprint of training an AI model is equivalent to the lifetime emissions of five cars”.


The Akamai expert also points to other unintended consequences, such as increased water consumption. “Data centres require large amounts of water for cooling, which can have a significant impact on local water resources. In addition, water scarcity may limit the location of data centres in some geographic areas”.

In addition, the extraction of raw materials can pose another problem. “The production of electronic equipment and other components for AI systems requires the extraction of raw materials such as metals and minerals, which can have a significant impact on the environment,” says Vadillo.

Finally, he believes that the rise of AI could have an impact on the growth of e-waste. “The rapid obsolescence of electronic equipment and the need to constantly upgrade AI systems can generate large amounts of e-waste, which can be difficult to manage and can have a significant impact on the environment.”

Taking a responsible approach

The technology industry is aware of these challenges and knows it needs to act accordingly. “It is very important to think about sustainability strategy and adopt a responsible approach to AI,” says the Equinix manager.

Likewise, Clarisa Martínez, director of the Data, Analytics and AI Centre of Excellence at Capgemini Spain, stresses that “it is very important to measure and track, with faster and more accurate data, the carbon footprint and sustainability impacts; calculate the carbon footprint of AI models; develop efficient machine learning architectures and optimised systems for training; and increase transparency, including emission measurements, along with performance and accuracy metrics”.
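The kind of footprint calculation Martínez describes usually combines a handful of measurable quantities: IT energy drawn by the accelerators, the facility's PUE overhead, and the carbon intensity of the local grid. The sketch below shows that structure; every numeric figure in it is illustrative, not a measurement of any real model.

```python
# Hedged sketch of a training-run carbon estimate, in the spirit of the
# "measure and track" advice above. All figures are illustrative.

def training_emissions_kg(num_gpus, gpu_power_kw, hours, pue, grid_kg_per_kwh):
    """Estimate CO2 (kg): IT energy, scaled by facility PUE, times grid intensity."""
    it_energy_kwh = num_gpus * gpu_power_kw * hours
    facility_energy_kwh = it_energy_kwh * pue
    return facility_energy_kwh * grid_kg_per_kwh

# Example: 64 GPUs at 0.3 kW each for one week, PUE 1.2,
# on a grid emitting 0.4 kg CO2 per kWh.
co2 = training_emissions_kg(64, 0.3, 24 * 7, 1.2, 0.4)
print(f"~{co2:,.0f} kg CO2")
```

Publishing such estimates alongside accuracy metrics is exactly the transparency Martínez calls for, and the same formula shows the levers available: fewer GPU-hours, a lower PUE, or a cleaner grid.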

In addition, the Akamai head says we should focus on “encouraging energy efficiency, the use of cleaner technologies, recycling and proper waste management, research and development, and the implementation of more sustainable policies and regulations”.

As such, he notes that “measures can be implemented to improve the energy efficiency of data centres and AI systems themselves, such as the use of renewable energy and improved equipment design and cooling”. Similarly, Cano notes that data centres “need to maximise their efficiency in terms of PUE (Power Usage Effectiveness), water consumption and other elements” to cope with the AI boom.
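PUE, the metric Cano mentions, is simply the ratio of total facility energy to the energy that reaches the IT equipment; a value of 1.0 would mean every kilowatt-hour goes to computing. A minimal illustration, with made-up facility figures:

```python
# PUE (Power Usage Effectiveness) = total facility energy / IT energy.
# Everything above 1.0 is overhead: cooling, power distribution,
# lighting, etc. The figures below are illustrative, not real facilities.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

print(pue(1500, 1000))  # 1.5: 0.5 kWh of overhead per kWh of compute
print(pue(1100, 1000))  # 1.1: a far more efficient facility
```

Driving PUE down, typically through better cooling design, is one of the most direct ways a data centre can absorb AI workloads without a proportional rise in energy use.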

Similarly, Pinilla highlights “the use of renewable energies such as wind or solar, the use of reused water and the improvement of cooling processes and hardware from an energy point of view” as measures to increase energy efficiency and reduce greenhouse gas emissions.

Vadillo also points out that “cleaner technologies can be used in the production of electronic equipment, reducing the extraction of raw materials and using more sustainable materials”.

In addition, he insists that “recycling and proper management of e-waste should be encouraged, through the establishment of recycling programmes and the promotion of equipment reuse and repair practices”.

On the other hand, he argues that “research and development of new AI technologies that are more energy efficient and use more sustainable materials and resources can be encouraged”.

Asseco’s CEO also speaks of the need to “optimise algorithms so that data processing requires less energy”, as well as “research and develop more advanced AI technologies that consider sustainability from the outset and try to reduce their energy and water consumption and carbon footprint generation”.

In the same vein, Cano believes that “corporate data science teams should strive to design AI models that are as clean and efficient as possible”. For example, he points out that “a good way to limit the carbon footprint of AI models is to identify ways to use only the necessary data that is most relevant, without compromising the quality of the model”.

Finally, the Akamai expert emphasises that “policy makers can establish policies and regulations that encourage the adoption of more sustainable practices in the AI industry, such as incentives for the adoption of renewable energy, the implementation of regulations for the proper management of e-waste, and the promotion of more sustainable practices in the production and design of electronic equipment”.