Aligning Innovation and Sustainability: What Every Corporate AI Strategy Should Consider

Schellman’s Avani Desai offers commentary on aligning innovation and sustainability and what every corporate AI strategy should consider. This article originally appeared in Insight Jam, an enterprise IT community that enables human conversation on AI.

When businesses begin to consider an artificial intelligence (AI) strategy, the conversation tends to center on the possibilities for innovation, efficiency, and gaining a competitive edge. But behind these shiny deliverables is a quiet yet growing concern: AI’s environmental impact.

Organizations are racing to integrate new AI tools into everything from customer support to inventory and supply chain management, yet few are calculating the energy, water, and emissions costs that come with these implementations. The reality is that AI, specifically large language models (LLMs) and image generators, consumes astonishing amounts of natural resources and power. So much so that we’ve arrived at a crossroads, whether organizations actively realize it or not.

The intersection of an organization’s AI implementation and environmental goals can no longer be ignored. It’s time to recognize and start treating AI as both a powerful tool and a potential drain on the environment. If businesses are serious about meeting their sustainability goals, their AI implementation strategy must be part of those conversations.

The Hidden Resource Demands

When using an AI model, the process feels abstract. Much like when sending an email or text message into “the ether,” it’s easy to forget that AI interfaces are based on tangible infrastructure. Behind the scenes, massive data centers, sophisticated cooling systems, and high-performance processing chips absorb resources to keep these systems in operation.

Here are just some of the basic environmental effects of AI:

  • Electricity. A single ChatGPT query consumes about five times more electricity than a standard web search. Training an LLM like GPT-3 takes significantly more.
  • Water. Running GPT queries also consumes a surprising amount of water. Data centers must stay cool to operate efficiently, and water is the go-to cooling solution. Depending on the query requirements, it’s estimated to take roughly two liters of water for every 10 to 50 responses. Scale that across the billions of queries processed daily, with an estimated 80% of it drawn from potable sources, and it’s easy to see why AI is considered freshwater-intensive (a rough scale check follows this list).
  • Emissions. Building data centers, manufacturing processing chips, and operating the complex infrastructure all add to an organization’s emissions. The World Bank estimates that the broader information and communication technology (ICT) sector, AI included, accounts for at least 1.7% of total global emissions, and that number is set to grow.
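
For a rough sense of scale, here is a quick back-of-envelope calculation in Python. It uses only the two-liters-per-10-to-50-responses range cited above; the one-billion-queries-per-day volume is an illustrative assumption, not a reported figure.

```python
# Back-of-envelope scale check using the figures cited above.
# The 1-billion-queries-per-day volume is an illustrative assumption,
# not a reported statistic.

QUERIES_PER_DAY = 1_000_000_000          # assumed daily query volume (illustrative)
LITERS_PER_RESPONSE_LOW = 2 / 50         # ~2 L per 50 responses (best case cited above)
LITERS_PER_RESPONSE_HIGH = 2 / 10        # ~2 L per 10 responses (worst case cited above)

daily_low = QUERIES_PER_DAY * LITERS_PER_RESPONSE_LOW    # liters of water per day
daily_high = QUERIES_PER_DAY * LITERS_PER_RESPONSE_HIGH  # liters of water per day

print(f"Estimated cooling water: {daily_low / 1e6:.0f}M to {daily_high / 1e6:.0f}M liters per day")
```

Even at the conservative end of that range, the assumed volume works out to tens of millions of liters of water every day.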

The Paradox of AI: Both Problem and Solution

While those statistics seem to present a clear-cut argument against AI for environmental reasons, it’s important to look at the issue from all angles. Although AI may be contributing to climate change, it’s also helping to fight it.

When used thoughtfully, AI can accelerate sustainability efforts by:

  • Modeling climate scenarios and predicting extreme weather events based on pattern and anomaly detection.
  • Optimizing energy grids and forecasting demand, ensuring strategic distribution and fewer surges or depletions.
  • Improving materials and workflows for cleaner, more efficient manufacturing.
  • Tracking emissions and analyzing truck or shipload demand and distribution for optimized supply chain processes.

These are just a few of many developing examples of AI’s positive environmental impact, so the question becomes one of balance: it’s not only about reducing AI’s emissions, but also about finding the breakeven point where the emissions AI helps avoid outweigh the emissions it creates.

In other words, we need to ensure the way we build and use AI doesn’t cancel out the gains it enables elsewhere.

How Are Industry Leaders Taking Action to Reduce Their Overall Footprint?

Fortunately, industry leaders and some of the world’s leading technology companies are already tackling this challenge head-on.

Amazon, for example, now matches 100% of the electricity consumed across its global operations with renewable energy. Microsoft is shifting to 100% carbon-free electricity by 2030 (and will require all suppliers to do the same) with a goal of eventually becoming carbon-negative. Meanwhile, Salesforce has launched a policy initiative advocating for required reporting of AI emissions and efficiency metrics.

These actions are about more than just brand optics. By making these statements, these organizations are paving the way to meet the very real challenge of scaling AI without abandoning prior climate commitments. With the right forethought, your company can do the same.

Take Action Today: How to Align Your Business’s AI Adoption and Emissions Goals

Whether you’re a global enterprise well into your AI journey or a midsized business on the cusp of change, you can begin the work toward environmentally conscious AI right now by following these four steps.

Step 1: Measure Environmental Impact

Blind action with the goal of reduction will always be less effective than a strategic approach. That’s why it’s important to first measure your organization’s impact to the best of your ability so you can make adjustments where they matter most.

To understand your AI-related resource consumption and its impact, start by taking inventory of your AI use: where and how it’s being used, and which vendors are involved. Once that’s done, work with IT and cloud vendors to estimate the energy and water use associated with your workloads, or produce your own estimates using free carbon accounting tools.
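
If you’d rather not wait on vendor data, a first-pass estimate can be as simple as multiplying query volume by per-query energy use and a grid emissions factor. The sketch below is a minimal illustration of that arithmetic; every numeric input is a placeholder to be replaced with figures from your own vendors or a carbon accounting tool.

```python
# Minimal sketch of an AI workload emissions estimate.
# All numeric inputs below are placeholders -- replace them with figures
# from your cloud vendors or a carbon accounting tool.

def estimate_emissions(queries_per_month: int,
                       kwh_per_query: float,
                       grid_kg_co2e_per_kwh: float) -> dict:
    """Return rough monthly energy use and emissions for one AI workload."""
    energy_kwh = queries_per_month * kwh_per_query
    emissions_kg = energy_kwh * grid_kg_co2e_per_kwh
    return {"energy_kwh": energy_kwh, "emissions_kg_co2e": emissions_kg}

# Example: a hypothetical customer-support chatbot workload.
workload = estimate_emissions(
    queries_per_month=500_000,       # assumed query volume (placeholder)
    kwh_per_query=0.003,             # assumed energy per query (placeholder)
    grid_kg_co2e_per_kwh=0.4,        # assumed regional grid intensity (placeholder)
)
print(f"~{workload['energy_kwh']:.0f} kWh, "
      f"~{workload['emissions_kg_co2e']:.0f} kg CO2e per month")
```

A rough model like this won’t replace a proper audit, but it is usually enough to show which workloads deserve attention first.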

The adage holds: you can’t reduce what you don’t measure. Visibility is the first step toward accountability.

Step 2: Choose Efficient AI Workflows

Next, you have several options to customize your AI workflows to minimize energy and water use without compromising capability. These include:

  • Using pre-trained models. Instead of training all models from scratch, take advantage of those that already exist where possible.
  • Choosing the correct size model. Smaller, more targeted models use less power than larger ones. Use more distilled models for specific tasks to reduce the resource consumption per query.
  • Batching repetitive tasks. Group and time recurring tasks like reporting and scanning to avoid unnecessary power spikes from overlapping queries.
  • Checking your retraining schedule. For internal AI models, evaluate the frequency with which you retrain to avoid unnecessary resource use.

Efficiency in everything from which models you use to how often you use them has a real impact on emissions and natural resource consumption.
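
To make the right-sizing and batching ideas concrete, here is a minimal sketch of a routing layer that sends routine, recurring tasks to a smaller model in a single scheduled batch and reserves a larger model for genuinely complex requests. The run_small_model and run_large_model functions are hypothetical stand-ins for whatever endpoints your organization actually uses.

```python
# Illustrative sketch of right-sizing and batching AI workloads.
# run_small_model / run_large_model are hypothetical stand-ins for your
# actual model endpoints; the routing rule is deliberately simplistic.

from typing import List

def run_small_model(prompts: List[str]) -> List[str]:
    """Placeholder for a smaller, cheaper, lower-energy model endpoint."""
    return [f"[small-model answer to: {p}]" for p in prompts]

def run_large_model(prompts: List[str]) -> List[str]:
    """Placeholder for a larger, more capable, more resource-hungry model."""
    return [f"[large-model answer to: {p}]" for p in prompts]

def route_and_batch(tasks: List[dict]) -> List[str]:
    """Send routine tasks to the small model in one batch; reserve the
    large model for tasks explicitly flagged as complex."""
    simple = [t["prompt"] for t in tasks if not t.get("complex")]
    hard = [t["prompt"] for t in tasks if t.get("complex")]
    results = []
    if simple:   # one batched call instead of many scattered single calls
        results += run_small_model(simple)
    if hard:
        results += run_large_model(hard)
    return results

# Example: a nightly batch of recurring reporting tasks plus one complex request.
nightly_tasks = [
    {"prompt": "Summarize yesterday's support tickets"},
    {"prompt": "Flag anomalies in inventory counts"},
    {"prompt": "Draft a market analysis of Q3 trends", "complex": True},
]
print(route_and_batch(nightly_tasks))
```

The point isn’t the routing rule itself, which is deliberately simplistic, but the habit of asking whether every task truly needs the largest model and an immediate, one-off call.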

Step 3: Align with International Organization for Standardization (ISO) Standards

Once your organization gets on the right track, you can consider adopting a broader standard. Among your options are those from the International Organization for Standardization (ISO), an independent, non-governmental international body that brings together experts from around the world to develop voluntary, consensus-based standards.

As part of that mission, the ISO 14001 framework helps organizations implement an environmental management system (EMS) to guide emissions tracking and reduction. Because ISO 14001 certification signals alignment with environmental best practices, it can help businesses operate in line with their sustainability goals.

Meanwhile, ISO 42001 provides requirements for establishing and maintaining an Artificial Intelligence Management System (AIMS) that supports sound governance while managing risk. Designed specifically for the responsible integration of AI systems, ISO 42001 certification helps demonstrate that your AI systems are being deployed responsibly and sustainably.

Step 4: Build a Culture of Sustainability Around AI

Throughout all of this, it’s important to recognize that lasting environmental conscientiousness is a shared responsibility. So, when building an organizational culture of sustainability around AI, examine all internal workflows as well as those across your broader ecosystem, including vendors and partners.

Internally, your culture drives action. Basic training on AI’s environmental costs, strategies to reduce resource usage, and the company’s sustainability goals can help your team make smarter and more resource-friendly choices in their day-to-day workflows.

Externally, your footprint includes your supply chain, and sustainable procurement is one of the best ways to lower your overall impact. Certifications like those mentioned above (ISO 14001 and ISO 42001) can help you quickly gauge vendor and partner alignment. Upon request, vendors can also provide shared responsibility reporting that clarifies your portion of shared emissions.

Looking Ahead: AI and Sustainability Are Not at Odds

At the end of the day, AI is not definitively “good” or “bad” for the environment—it all boils down to how it’s used. We, as users, control the extent to which this technology helps or hurts the environment.

Given this, the best approach for IT leaders, sustainability champions, software buyers, and stakeholders alike is to act intentionally: aligning emissions goals with AI innovation comes down to continuous measurement, monitoring, and optimization for long-term impact up and down the supply chain.
