Written by Adam Covati, VP of Product, Greenplaces
The rapid adoption of large language models (LLMs) like ChatGPT has spotlighted concerns around their significant energy consumption and environmental impacts. Thoughtful implementation, however, can help businesses leverage AI responsibly and sustainably.
This blog offers practical steps for developers, product teams, and operational leaders to minimize AI’s carbon footprint—especially during inference, rather than the initial training phase.
AI’s growing energy footprint: Understanding the scale
Mid-sized data centers consume roughly 300,000 gallons of water daily—equivalent to the usage of about 1,000 U.S. households.
The global energy demands of data centers are expected to more than double between 2022 and 2026, driven heavily by increased AI adoption. And by 2030, data centers may represent up to 21% of global electricity demand, dramatically escalating from today’s 1–2%.
1. Track and measure AI usage with granular precision
Precise tracking is essential for reducing emissions. Measuring AI operations at a granular level reveals energy-intensive tasks, uncovering optimization opportunities and driving smarter sustainability decisions.
Key metrics to track (a minimal logging sketch follows this list):
- Token usage per API call/session
- Model type/version (e.g., GPT-4, Llama 3.1)
- Cloud region or data center location (due to varying emissions intensities)
- Frequency and volume of queries
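Most provider SDKs already return token counts with each response, so this logging can be a thin wrapper. Below is a minimal sketch assuming the official OpenAI Python client (the `log_usage` helper, its field names, and the log path are illustrative, not a standard API); any provider that reports usage works the same way.

```python
import json
import time

def log_usage(response, model: str, region: str, log_path: str = "ai_usage.jsonl"):
    """Append one usage record per API call for later emissions analysis."""
    record = {
        "timestamp": time.time(),
        "model": model,    # e.g. "gpt-4", "llama-3.1-70b"
        "region": region,  # cloud region drives grid carbon intensity
        "prompt_tokens": response.usage.prompt_tokens,
        "completion_tokens": response.usage.completion_tokens,
        "total_tokens": response.usage.total_tokens,
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")

# Usage (assuming the official openai client):
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(model="gpt-4", messages=[...])
# log_usage(resp, model="gpt-4", region="us-west-2")
```

A JSON-lines log like this is easy to join against per-region grid-intensity data later, which is what turns raw token counts into emissions estimates.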
Why this matters: Processing a million tokens (around $1 of compute) can emit as much carbon as driving a gasoline-powered car up to 20 miles. And generating a single image with AI can take roughly as much energy as fully charging a smartphone.
2. Design AI workflows for efficiency
Optimizing prompts and workflows reduces unnecessary computing demands. Efficient designs don’t just save money—they significantly reduce your emissions footprint.
Practical recommendations:
- Use smaller AI models for simpler tasks (e.g., GPT-3.5 rather than GPT-4)
- Run non-urgent tasks during low-emissions grid periods
- Avoid redundant queries and excessive verbosity
- Implement caching for repeated requests (see the routing-and-caching sketch after this list)
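As a concrete illustration, here is a minimal routing-and-caching sketch. The `call_model` wrapper is a hypothetical stand-in for your provider's API, and the routing heuristic is deliberately simple; real routing would likely classify tasks more carefully.

```python
from functools import lru_cache

SMALL_MODEL = "gpt-3.5-turbo"  # cheaper, lower-energy default
LARGE_MODEL = "gpt-4"          # reserved for genuinely hard tasks

def call_model(model: str, prompt: str) -> str:
    """Hypothetical stand-in for your provider's API call."""
    raise NotImplementedError

def pick_model(prompt: str, needs_deep_reasoning: bool = False) -> str:
    """Route short, simple prompts to the smaller model."""
    if needs_deep_reasoning or len(prompt) > 2000:
        return LARGE_MODEL
    return SMALL_MODEL

@lru_cache(maxsize=4096)
def cached_completion(model: str, prompt: str) -> str:
    """Identical (model, prompt) pairs hit the cache instead of the API."""
    return call_model(model, prompt)

def answer(prompt: str, needs_deep_reasoning: bool = False) -> str:
    return cached_completion(pick_model(prompt, needs_deep_reasoning), prompt)
```

An in-process `lru_cache` only helps within one process; for repeated requests across services, a shared cache such as Redis keyed on a hash of (model, prompt) is the more common choice.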
Real-world impact: Optimizing AI workflows and carefully scheduling workloads can achieve emission reductions of 10–20% for data centers.
3. Strategically test AI models and infrastructure
Regular testing of AI workflows reveals inefficiencies and sustainability opportunities. A structured testing approach enables efficient model selection and lowers carbon emissions.
Testing benefits:
- Confirms that output accuracy holds up when you make sustainability tradeoffs
- Identifies models providing optimal balance of performance and energy use
- Facilitates easier shifts to greener technology as it emerges
Pro tip: Use open-source carbon tracking tools like CodeCarbon or the ML CO₂ Impact calculator. CodeCarbon, for example, integrates directly into your code to estimate emissions per AI task, enabling granular tracking for reporting and optimization.
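For instance, CodeCarbon's `EmissionsTracker` wraps a block of work and returns an emissions estimate (in kg CO₂eq) derived from measured energy draw and your grid's carbon intensity. A minimal sketch, with `run_inference_batch` as a placeholder for your actual workload:

```python
from codecarbon import EmissionsTracker

def run_inference_batch():
    # Placeholder for your real workload, e.g. a batch of model calls.
    pass

tracker = EmissionsTracker(project_name="nightly-batch-inference")
tracker.start()
try:
    run_inference_batch()
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2eq for the tracked block

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```

Run this around candidate models during testing and you get a per-task emissions number to weigh alongside accuracy and latency.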
4. Prioritize sustainable infrastructure choices
Choosing eco-friendly infrastructure has the most immediate impact on your AI’s emissions profile. Data centers powered by renewable energy or optimized cooling systems drastically reduce emissions.
Infrastructure selection strategies:
- Favor providers with transparent emissions dashboards
- Opt for data centers in regions powered predominantly by renewables (e.g., hydro-rich Oregon or geothermal-powered Iceland)
- Employ carbon-aware scheduling for compute-intensive tasks (a scheduling sketch follows this list)
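A sketch of what carbon-aware scheduling can look like, assuming access to a grid-intensity feed (services such as Electricity Maps and WattTime provide these; `get_grid_intensity` below is a hypothetical stand-in, and the threshold is illustrative):

```python
import time

INTENSITY_THRESHOLD = 200  # gCO2/kWh; illustrative, tune to your region and deadlines

def get_grid_intensity(region: str) -> float:
    """Hypothetical stand-in for a real grid-intensity API."""
    raise NotImplementedError

def run_when_grid_is_clean(task, region: str, poll_seconds: int = 900):
    """Delay a non-urgent task until grid carbon intensity drops below the threshold."""
    while get_grid_intensity(region) > INTENSITY_THRESHOLD:
        time.sleep(poll_seconds)  # re-check every 15 minutes
    return task()
```

In practice you would add a hard deadline so deferred work cannot wait indefinitely, but the core idea is just this: poll, compare, defer.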
Case in point: Advancements like Microsoft’s cold-plate cooling technology could cut emissions and energy consumption by approximately 15% and reduce water usage by 30–50% compared to traditional air cooling.
5. Contextualize AI vs. traditional computing appropriately
Comparisons between LLMs and traditional web searches must be contextualized properly. Though LLMs use more energy per interaction, one comprehensive AI query can often replace multiple traditional searches, potentially offering net efficiency gains.
Factors to consider (a back-of-envelope comparison follows this list):
- Multiple traditional searches vs. one complex AI task
- Energy for indexing and browsing conventional searches
- User-side energy consumption in conventional research methods
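A back-of-envelope comparison makes the point, using commonly cited (and widely varying) estimates of roughly 0.3 Wh per traditional web search and about 3 Wh per LLM query; both figures are assumptions that depend heavily on model and infrastructure:

```python
SEARCH_WH = 0.3  # rough, commonly cited estimate per web search (assumption)
LLM_WH = 3.0     # rough, commonly cited estimate per LLM query (assumption)

searches_replaced = 12  # a research task that might otherwise take a dozen searches

traditional_wh = searches_replaced * SEARCH_WH  # 3.6 Wh
llm_wh = 1 * LLM_WH                             # 3.0 Wh

print(f"{searches_replaced} searches: {traditional_wh} Wh vs. one LLM query: {llm_wh} Wh")
```

Under these assumptions the single LLM query comes out slightly ahead once it replaces about ten searches; the point is not the exact crossover but that per-interaction comparisons ignore substitution effects.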
6. Manage usage growth with governance
Efficiency alone isn’t enough. Because of the Jevons paradox, efficiency gains can inadvertently increase total usage. Organizations must actively manage and govern AI adoption to avoid offsetting sustainability gains.
Actionable steps:
- Clearly communicate expectations for AI usage (a budget-guardrail sketch follows this list)
- Employ governance frameworks to guide AI deployment
- Foster industry-wide collaborations to enhance sustainable practices
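One lightweight way to make governance concrete is a per-team usage budget checked before each call. A minimal sketch; the team names, budget figures, and in-memory store are all illustrative (a real deployment would persist usage and integrate with your API gateway):

```python
from collections import defaultdict

# Illustrative monthly token budgets per team; set these via your governance policy.
TEAM_BUDGETS = {"marketing": 5_000_000, "engineering": 20_000_000}

usage = defaultdict(int)  # tokens consumed so far this month (in-memory for the sketch)

def within_budget(team: str, tokens_requested: int) -> bool:
    """Return True if the request fits the team's remaining monthly budget."""
    return usage[team] + tokens_requested <= TEAM_BUDGETS.get(team, 0)

def record_usage(team: str, tokens_used: int) -> None:
    usage[team] += tokens_used
```

Even a soft version of this (alerting rather than blocking) gives teams visibility into growth before it erodes the efficiency gains from the earlier steps.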
Spotlight: How Photoroom cut 1,000+ metric tons of CO₂
By strategically scheduling workloads and optimizing AI inference, Photoroom cut more than 1,000 metric tons of CO₂ while boosting performance and reducing costs.
A sustainable path forward
The environmental impact of AI is considerable, but it doesn’t have to be prohibitive.
Businesses that proactively measure their emissions, implement efficient AI workflows, and strategically select sustainable infrastructure will significantly mitigate AI’s environmental footprint.
Sustainability in AI isn’t just attainable—it’s imperative and beneficial.