If you’re a CIO focusing on the long-term sustainability of your investments and you are allocating to managers that use artificial intelligence, then you need to consider a crucial question: Do the benefits flowing to you (in the form of better risk-adjusted, after-fee returns) outweigh the systemic social and environmental impacts of the AI technology? To address this question, you’ll need to ask your managers to quantify their technology’s environmental and social impact, determine how much AI contributes to performance gains, and demonstrate how related costs and benefits flow back to investors.
We’ll admit upfront that measuring and assessing AI’s environmental and social impact can present challenges due to the technology’s diverse applications and effects. However, this complexity mirrors the sustainability data hurdles allocators started navigating over a decade ago. Though daunting, developing meaningful metrics for tracking impact and framing nuanced investment decisions remains achievable.
As investment managers increasingly integrate generative AI and large language models (LLMs) like ChatGPT and Claude into their operations and investment strategies, we’ll focus on these technologies — while noting that our analysis extends to AI more broadly. This overview of environmental and social impacts, which includes energy use, greenhouse gas emissions, grid stress, water usage, and labor and human rights issues, provides a foundation for allocators’ cost-benefit analyses — and one that allocators can further apply to their internal use of AI.
Energy Use
From an environmental perspective, the widespread adoption of LLMs has created significant computational demands: both training and operating these AI systems require substantial electricity.
As a point of reference, a Google search consumes 0.3 watt-hours of electricity, while a ChatGPT query uses 2.9 watt-hours — nearly ten times as much electricity, according to the International Energy Agency.
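To make the scale of that difference concrete, here is a back-of-the-envelope sketch using the IEA per-query figures quoted above; the one-billion-queries-per-day volume is a hypothetical assumption for illustration, not a reported statistic.

```python
# Rough per-query energy comparison, using the IEA estimates cited above.
GOOGLE_SEARCH_WH = 0.3   # watt-hours per conventional web search (IEA estimate)
CHATGPT_QUERY_WH = 2.9   # watt-hours per ChatGPT query (IEA estimate)

ratio = CHATGPT_QUERY_WH / GOOGLE_SEARCH_WH
print(f"An LLM query uses ~{ratio:.1f}x the energy of a search")  # ~9.7x

# Scaled to a hypothetical 1 billion queries per day, the incremental
# energy cost of answering them with an LLM instead of a search engine:
daily_queries = 1_000_000_000
extra_mwh_per_day = daily_queries * (CHATGPT_QUERY_WH - GOOGLE_SEARCH_WH) / 1e6
print(f"Extra energy at that volume: ~{extra_mwh_per_day:,.0f} MWh/day")  # ~2,600 MWh/day
```

Even small per-query differences compound into grid-scale loads at internet-scale query volumes, which is the dynamic the rest of this section explores.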
The widespread adoption of LLMs has dramatically increased demand for energy-intensive data centers, which house the computing infrastructure required to run these computationally demanding models.
To meet this surge in demand, companies are building more data centers. The number of colocation data centers and, more importantly, hyperscale data centers (which support massive data and cloud computing operations LLMs require for training and inference) is growing exponentially and is expected to increase at a 23 percent compound annual growth rate through 2030.
While data centers serve multiple industries and store everyday digital content like websites and photos, AI is the main driver of data center capacity growth, with hyperscalers planning massive expansions to support customer demand. Microsoft and OpenAI’s plan to build a $100 billion data center is one example. McKinsey estimates that by 2030 AI workloads will constitute about 70 percent of total data center demand.
Stress on the Energy Grid
This expansion will strain the capacity of the current energy grid and require massive new energy generation. The planned Microsoft-OpenAI data center, for example, may require five gigawatts of power, or roughly the equivalent of five nuclear reactors.
According to a December 2024 Lawrence Berkeley National Laboratory report, U.S. data center electricity consumption could surge from its current 2.5 percent share to 7.5 percent of total usage by 2030. This projected growth poses significant challenges for power grid capacity. Real estate and construction industry forecasts indicate that peak summer demand in 2024 already reached 94 percent of permitted capacity. Looking ahead, when factoring in data center expansion plans alongside scheduled power plant retirements and planned capacity additions, power demand could exceed available supply by 2033.
So, where will all the new energy come from? And will the generation be able to keep pace with demand? According to the U.S. Department of Energy, hyperscale facilities are already “stretching the capacity of local grids to deliver and supply power at that pace.” Bloomberg reports that in the densest data center markets the imbalance of supply and demand for power is “leading to years-long waits for businesses to access the grid as well as growing concerns of outages and price increases.”
Goldman Sachs estimates that about $720 billion of grid spending through 2030 may be needed. A separate Goldman report on data center power notes that “these transmission projects can take several years to permit, and then several more to build, creating another potential bottleneck for data center growth if the regions are not proactive about this given the lead time.”
Surging Emissions
The surge in data center energy consumption carries significant environmental implications, with greenhouse gas (GHG) emissions being the most direct impact. Despite advances in renewable energy, the technology’s current limitations — including intermittency issues, storage constraints, and geographic restrictions — mean that fossil fuels continue to dominate data center power supply. This reliance stems from data centers’ fundamental need for consistent and immediately available power. Recent academic research reveals that data centers derive over 56 percent of their electricity from fossil fuel sources, resulting in over 105 million tons of CO2 equivalent emissions — approximately 2.18 percent of total U.S. emissions in 2023. To put this scale in perspective, U.S. commercial aviation produces about 131 million metric tons of emissions per year, making data centers’ carbon footprint nearly comparable to that of the entire airline industry.
The trend of rising GHGs is already well underway, with new research showing that carbon emissions from data centers in the U.S. have tripled since 2018.
In response to this and other trends, hyperscalers are seeking low- and no-carbon energy alternatives. For example, last year Microsoft struck a $10 billion deal with Brookfield Asset Management to develop 10.5 GW of new clean energy capacity in the U.S. and Europe between 2026 and 2030, while Amazon signed multiple partnership deals to develop and deploy 5 gigawatts of nuclear energy over the next 15 years to power its operations. Of note, a recent Morgan Stanley survey of U.S. and European data centers found 96 percent of respondents likely to consider nuclear power contracts as part of their clean power procurement policy.
However, this shift to low- or zero-carbon energy sources will take years, and leading experts estimate that 60 percent of the new demand for data center power generation capacity will be met by fossil fuels, specifically natural gas. In fact, AI’s growth has even been called the “savior” of the natural gas generation business. Oil companies like Exxon and Chevron are gearing up to meet this demand and are signing power purchase agreements with data centers. One commentator estimates that the result could “lead to additional demand of between 3 to 6 billion cubic feet of gas per day by 2030 — equivalent to the gas consumption of the entire state of Florida.” We’ll note that although natural gas produces fewer emissions than coal, it remains a significant source of GHG emissions worldwide. The UN indicates that to align with the Paris Agreement’s goals, global gas plant emissions must be reduced by 28 percent to meet the 2°C target and 42 percent to achieve the 1.5°C threshold by 2030.
Hyperscalers like Microsoft and Google have committed to achieving net-zero emissions by 2030. However, their emissions have risen substantially since 2020 — Microsoft’s by more than 30 percent and Google’s by over 50 percent. These significant increases cast serious doubt on the companies’ ability to reach their ambitious climate goals, which target net-zero emissions across operations and value chains.
AI Gulps Water
Generative AI and LLMs’ environmental impact extends beyond GHG emissions. Data centers require substantial water for construction, cooling, and humidification — a growing concern given that 25 percent of the global population lacks access to clean water and sanitation. As data centers multiply, their water consumption will only continue to rise.
For example, a 2025 research paper found that “training the GPT-3 language model in Microsoft’s state-of-the-art U.S. data centers can directly evaporate 700,000 liters of clean freshwater” and projected that the global AI demand will “account for 4.2 to 6.6 billion cubic meters of water withdrawal in 2027, which is more than the total annual water withdrawal of 4 to 6 Denmark’s or half of the United Kingdom.”
Again, hyperscalers like Google and Meta have pledged to efficiently manage their water use, but the expansion of their data centers potentially exacerbates the stress on already limited freshwater resources.
The Growing Pile of E-Waste
Another environmental impact to consider: Data centers require an enormous amount of electronic hardware (e.g., GPUs, CPUs, memory modules, and storage devices), which is frequently upgraded and discarded, producing a significant amount of electronic waste.
E-waste is widely acknowledged as one of the world’s fastest-growing waste streams. The United Nations Institute for Training and Research reports, “a record 62 million tonnes of e-waste was produced in 2022, up 82 percent from 2010, on track to rise another 32 percent, to 82 million tons, in 2030.” These figures barely account for the impact data center growth can have on the waste stream. One 2024 research report estimates that the rapid growth of data centers and the development of more powerful hardware could create an additional 16 million tons of e-waste by 2030.
E-waste from the hardware supporting generative AI is especially dangerous because it contains metals like copper, gold, silver, aluminum, and rare earth elements, as well as hazardous materials such as lead, mercury, and chromium. According to the World Health Organization, these materials produce toxic chemicals when recycled inappropriately. Because only about 22 percent of e-waste is properly collected and recycled, these elements contaminate the environment and increase health risks in communities worldwide.
Labor and Human Rights
While AI technology may appear to operate autonomously, it fundamentally depends on extensive human labor throughout its lifecycle. This is particularly true for LLMs, whose performance relies heavily on high-quality training data.
The process of collecting, cleaning, and labeling this data demands intensive human effort, often carried out in developing nations. In these locations, workers typically earn less than $2 per hour to read and label traumatic content under demanding productivity requirements. The psychological toll of this work can be severe, with some workers reporting lasting mental distress — one described the experience as “mental torture.” These working conditions have led AI ethics researchers to characterize these facilities as “digital sweatshops,” highlighting the human cost behind AI development.
On another level, AI technology is fundamentally dependent on rare earth minerals like neodymium, dysprosium, and terbium, which have created an expanding web of human and environmental costs. Mining these essential materials requires energy-intensive operations that not only contribute to AI’s carbon footprint but also generate dangerous toxic waste. The human toll of this extraction is severe, with workers routinely exposed to hazardous chemicals and radiation without proper safety equipment or healthcare access.
In the Democratic Republic of Congo, the situation is particularly alarming: Of the approximately 255,000 Congolese mining for cobalt, a critical component for AI hardware, about 40,000 are children, some as young as six years old. The scale of exploitation is staggering: The U.S. Department of Labor estimates that between 67,000 and 80,000 Congolese cobalt miners face various forms of coercion, including excessive overtime, deceptive recruitment, and restricted movement. These workers typically earn around $8 per day, while those subjected to forced labor earn even less — a shocking 41 percent below their peers. This system of exploitation perpetuates cycles of poverty and exacerbates the systemic risks associated with income inequality, even as it fuels the global AI industry’s growth. As Professor Kate Crawford points out in Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence, “to understand the business of AI, we must reckon with the war, famine, and death that mining brings.”
Can AI Help With Environmental Costs?
When it comes to evaluating the costs and benefits of investment managers’ use of AI, some tech leaders assert that AI’s capacity to accelerate climate solutions could outweigh its current environmental impact.
For example, at the Special Competitive Studies Project’s 2024 AI + Energy Summit, Eric Schmidt, former CEO of Google, was asked whether it’s possible to meet AI’s energy needs within the parameters of existing climate goals. Schmidt replied, “My own opinion is that we’re not going to hit the climate goals anyway. […] I’d rather bet on AI solving the problem than constraining it and having the problem.”
Schmidt’s techno-optimism is widely shared in the tech community. There is growing evidence that AI can tackle some of the biggest climate challenges, such as by monitoring and mapping GHG emissions, ocean waste, and deforestation or optimizing energy use in buildings, industrial processes, and grid operations.
However, as Dr. Sasha Luccioni, AI researcher and climate lead at Hugging Face, points out, the AI models that are helping with climate challenges “are not the ones contributing to most of the energy and carbon issues that we’re seeing.” LLMs, on the other hand, are the biggest contributors and “have yet to prove their utility in fighting climate change.”
Tech companies are actively working to enhance data center efficiency through multiple technological improvements, including optimized infrastructure, more sophisticated algorithms, and energy-efficient hardware designs. These advancements promise to reduce both the computational costs and the energy required for each individual computing task. However, this situation presents a striking example of the Jevons Paradox, an economic principle first observed in coal consumption during the Industrial Revolution. The paradox states that when technological progress increases the efficiency of resource use, the total consumption of that resource tends to increase rather than decrease. In the context of AI and computing, this suggests that as computing becomes more energy-efficient and cost-effective, organizations will likely expand their AI operations and launch new applications, potentially leading to greater overall energy consumption despite the improved efficiency of individual operations.
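The Jevons Paradox dynamic described above can be sketched with a toy calculation. All numbers here are hypothetical assumptions chosen only to illustrate the mechanism: per-task energy falls by half, but total usage grows faster, so aggregate consumption still rises.

```python
# Illustrative Jevons Paradox arithmetic (hypothetical numbers).
energy_per_task_wh = 2.9       # baseline energy per AI task, in watt-hours
tasks_per_day = 1_000_000_000  # baseline daily task volume (assumed)

efficiency_gain = 0.50  # each task now uses 50% less energy...
demand_growth = 3.0     # ...but falling costs triple the task volume

baseline_mwh = tasks_per_day * energy_per_task_wh / 1e6
new_mwh = (tasks_per_day * demand_growth) * (energy_per_task_wh * (1 - efficiency_gain)) / 1e6

print(f"Baseline consumption: {baseline_mwh:,.0f} MWh/day")                 # 2,900 MWh/day
print(f"After efficiency gain and demand growth: {new_mwh:,.0f} MWh/day")   # 4,350 MWh/day
```

In this sketch, a 50 percent efficiency improvement is swamped by a tripling of demand, leaving total consumption 50 percent higher — the same logic that tempers optimism about efficiency gains alone solving the data center energy problem.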
In addition, researcher Gianluca Guidi points out that multimodal AI models, with their exponentially larger data requirements, will likely increase overall energy consumption and emissions despite any efficiency gains. Moreover, focusing solely on energy efficiency and renewable power as potential solutions for data center negative impacts overlooks broader environmental and social concerns, including e-waste generation and labor issues in the technology supply chain.
The Challenges of Measuring and Assessing the Costs and Benefits of Using AI
Quantifying AI’s environmental and social impact presents complex challenges. GHG emissions data is dispersed across agencies and regions and is divided into three reporting categories (Scopes 1, 2, and 3). There’s little transparency around data centers’ GHG emissions, water usage, e-waste generation, and labor practices, and tech companies struggle to develop basic insights into their labyrinthine supply chains.
What’s more, investment managers face the complexity of forecasting long-term energy and water availability and how the developing supply and demand dynamics of these required inputs may impact their bottom line.
However, there are bright spots of progress. As a sign of growing awareness and interest in addressing these issues, academic researchers are increasingly developing methods for measuring GHG emissions and water use in data centers. One example is this interactive portal reporting on data center emissions across the U.S. In addition, international organizations such as UNESCO are developing policy recommendations for monitoring and addressing human rights issues associated with the use of AI.
Further, institutional investor awareness of the challenges is resulting in coalitions of allocators and managers, such as one that includes ABP, Norges Bank, Fidelity International, and Macquarie. This coalition is collectively leveraging influence to compel technology companies to improve transparency around their AI systems’ impacts and develop concrete plans for mitigating negative consequences. However, while investor coalitions provide valuable networks for sharing resources and developing best practices, they cannot replace allocators’ and managers’ implementation of robust internal policies and systems.
Evaluating AI’s environmental and social impacts — even with the current imperfect data and metrics — is vital for understanding its investment benefits. Implementing preliminary assessment frameworks now can serve two vital purposes: Helping organizations begin mitigating negative consequences immediately and creating foundational data and processes that can evolve into more sophisticated evaluation methods. The perfect should not be the enemy of the good.
Angelo Calvello, PhD, writes extensively on AI and climate issues. He is the author of Environmental Alpha: Institutional Investors and Climate Change (Wiley 2011) and founder of C/79 Consulting LLC.
Katherine McGinn, CFA, is a freelance writer and consultant with over 15 years of ESG & Impact Investing experience across institutional investment consulting and private wealth management. She sits on the advisory board of the Tech Forward Investors Initiative, which aims to explore the link between tech industry governance and long-term shareholder value creation.