Achieving a Sustainable Future for AI

As the carbon footprint of AI expands, more sustainable AI projects and new best practices are essential.

We are witnessing a historic global paradigm shift driven by dramatic improvements in AI. As AI has evolved from predictive to generative, more and more businesses are taking notice, and business adoption of AI has more than doubled since 2017. According to McKinsey, 63% of respondents expect their organizations' investment in AI to increase over the next three years.

In parallel with this unprecedented adoption, the volume of computing AI consumes is growing at an astonishing rate. Since 2012, the amount of compute used in the largest AI training runs has increased more than 300,000-fold. And as processing demands grow, so do the environmental implications.

More computing means more electricity consumption and, with it, more carbon emissions. A 2019 study by researchers at the University of Massachusetts Amherst estimated that training a single transformer, a type of deep learning model, can emit more than 626,000 pounds (roughly 284 metric tons) of carbon dioxide, equal to more than 41 round-trip flights between New York City and Sydney, Australia. And that’s just training the model.

We are also facing an explosion in data storage. IDC predicts that 180 zettabytes of data, or 180 billion terabytes, will be created in 2025. The collective energy required to store data at this scale is enormous and will be difficult to manage sustainably. Depending on storage conditions (e.g., the hardware used and the facility's energy mix), a single terabyte of stored data can produce as much as 2 tons of CO2 emissions per year. Now multiply that by 180 billion.

The current trajectory of ramping up AI with an ever-growing environmental footprint is simply not sustainable. We need to rethink the status quo and change our strategies and behaviors.

Drive sustainable improvements with AI

While the increased prominence of artificial intelligence undoubtedly carries serious carbon emissions implications, it also presents huge opportunities. Real-time data collection combined with AI can help companies quickly identify areas for operational improvement and reduce carbon emissions at scale.

For example, AI models can identify immediate improvement opportunities in the factors that affect building efficiency, including heating, ventilation, and air conditioning (HVAC). As a complex, data-rich, multivariable system, HVAC is well suited to automated optimization, and improvements can yield energy savings within months. While this opportunity exists in nearly every building, it is especially valuable in data centers. Several years ago, Google shared how using AI to improve data center cooling reduced the energy used for cooling by up to 40%.
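To illustrate the approach (this is not Google's actual system), here is a minimal, hypothetical sketch: a regression model learns the relationship between outside temperature, IT load, cooling setpoint, and energy use from synthetic telemetry, then recommends the setpoint with the lowest predicted energy for current conditions. All feature names and numbers are invented.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Illustrative historical telemetry: [outside_temp_C, it_load_kW, setpoint_C]
X = rng.uniform([10, 100, 16], [35, 500, 27], size=(1000, 3))
# Synthetic energy label: hotter weather and higher load cost energy,
# while a higher cooling setpoint saves it
y = 0.8 * X[:, 0] + 0.05 * X[:, 1] - 1.5 * X[:, 2] + rng.normal(0, 1, 1000)

model = GradientBoostingRegressor().fit(X, y)

# For current conditions, evaluate candidate setpoints and pick the best
outside_temp, it_load = 28.0, 350.0
candidates = np.arange(18.0, 27.5, 0.5)
preds = model.predict([[outside_temp, it_load, s] for s in candidates])
print(f"Recommended cooling setpoint: {candidates[np.argmin(preds)]:.1f} C")
```

A production system would of course add safety constraints and validate its recommendations against real measurements before acting on them.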

Artificial intelligence is also proving effective for carbon-aware computing. Automatically shifting computing tasks in time or place based on the availability of renewable energy can reduce a business's carbon footprint.
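A hedged sketch of the idea, assuming an hourly forecast of grid carbon intensity is available (the forecast source and numbers below are hypothetical): a deferrable batch job is scheduled into the cleanest window that still meets its deadline.

```python
from datetime import datetime, timedelta

def cleanest_start(forecast, duration_hours, deadline):
    """Pick the start hour whose run window has the lowest average grid
    carbon intensity. forecast maps hour -> gCO2/kWh and is assumed to
    cover contiguous hours."""
    hours = sorted(forecast)
    best_start, best_avg = None, float("inf")
    for i, start in enumerate(hours):
        window = hours[i:i + duration_hours]
        # Skip windows that are truncated or would finish after the deadline
        if len(window) < duration_hours or window[-1] + timedelta(hours=1) > deadline:
            continue
        avg = sum(forecast[h] for h in window) / duration_hours
        if avg < best_avg:
            best_start, best_avg = start, avg
    return best_start

# Illustrative 12-hour forecast: intensity dips mid-day as solar ramps up
now = datetime(2023, 6, 1, 6)
forecast = {now + timedelta(hours=i): v for i, v in enumerate(
    [420, 400, 340, 260, 180, 150, 160, 210, 300, 380, 430, 450])}
print(cleanest_start(forecast, 3, deadline=now + timedelta(hours=12)))
# -> 2023-06-01 10:00:00, the start of the greenest 3-hour window
```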

Similarly, AI can help shrink the growing data storage problem mentioned earlier. Addressing the sustainability of large-scale data storage, Gerry McGovern noted in his book World Wide Waste that up to 90% of data is never used and is simply archived. AI can help determine which data is valuable, necessary, and of high enough quality to warrant keeping. Superfluous data can simply be deleted, saving both cost and energy.
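As a simplified illustration (the scoring rule, threshold, and records below are all invented), stored datasets can be triaged by access patterns; in a real system, a trained classifier drawing on richer signals such as lineage, duplication, and downstream usage could replace the heuristic.

```python
# Hypothetical catalog of stored datasets with access statistics
records = [
    {"name": "clickstream_2019", "last_access_days": 900, "monthly_reads": 0},
    {"name": "customer_features", "last_access_days": 2, "monthly_reads": 140},
    {"name": "raw_sensor_dump", "last_access_days": 400, "monthly_reads": 1},
]

def keep_score(r):
    # Recent, frequently read data scores high; stale data decays toward 0
    return r["monthly_reads"] / (1 + r["last_access_days"] / 30)

keep, delete = [], []
for r in records:
    (keep if keep_score(r) > 0.5 else delete).append(r["name"])
print("keep:", keep)      # keep: ['customer_features']
print("delete:", delete)  # delete: ['clickstream_2019', 'raw_sensor_dump']
```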

How to design AI projects more sustainably

To implement AI initiatives responsibly, we need to rethink standard practice and take a more proactive approach to designing AI projects.

Start with a critical examination of the business problem you are trying to solve. Ask: do I really need AI here, or could traditional probabilistic methods with lower power and computational requirements suffice? Deep learning isn’t the solution to every problem, so it pays to be selective.

After clarifying your business problem or use case, carefully consider the following as you build your solution and model:

  1. Emphasize data quality over data quantity. Smaller datasets require less energy to train on and have lighter ongoing processing and storage footprints, and therefore produce fewer carbon emissions. Studies show that up to 99% of the parameters in a trained neural network can be pruned away, yielding much smaller, sparser networks (a minimal pruning sketch follows this list).
  2. Consider the level of accuracy your use case actually requires. For example, if you tune your models to run lower-precision calculations rather than compute-intensive FP32 calculations, you can achieve significant energy savings (see the mixed-precision sketch after this list).
  3. Leverage domain-specific models and stop reinventing the wheel. Distilling a smaller model from existing trained models can give you better results more efficiently. For example, if you already have a large model trained to understand the semantics of language, you can create a smaller, domain-specific model customized to your needs that draws on the larger model's knowledge base, achieving similar results with far greater efficiency (see the distillation sketch after this list).
  4. Balance your hardware and software from edge to cloud. A more heterogeneous AI infrastructure, with a mix of AI computing chipsets that address specific application needs, will save energy across the board, from storage to networking to compute. While the size, weight, and power (SWaP) constraints of edge devices demand smaller, more efficient AI models, running AI computation closer to where the data is generated can be more carbon efficient, using lower-power devices and requiring less network traffic and data storage. Additionally, for dedicated AI hardware, using built-in acceleration technologies to boost performance per watt can yield significant power savings. Our tests show that integrated accelerators can improve average performance per watt efficiency by 3.9x on targeted workloads compared to the same workloads running on the same platform without accelerators. (Results may vary.)
  5. Consider open-source solutions with libraries of optimizations that help you get the best performance out of your hardware and out-of-the-box frameworks. Beyond open source, adopting open standards can help with repeatability and scalability. For example, to avoid energy-intensive upfront model training, consider using pre-trained models for greater efficiency and the potential for shared/federated learning and improvement over time (see the pre-trained model sketch after this list). Likewise, open APIs enable more efficient multi-architecture solutions, letting you build tools, frameworks, and models once and deploy them anywhere with more optimal performance.
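To make item 1 concrete, here is a minimal sketch of magnitude pruning using PyTorch's built-in pruning utilities; the layer size and the 90% ratio are illustrative, not recommendations.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

layer = nn.Linear(512, 512)
prune.l1_unstructured(layer, name="weight", amount=0.9)  # zero the 90% smallest |w|
prune.remove(layer, "weight")  # bake the pruning mask into the weight tensor

sparsity = (layer.weight == 0).float().mean().item()
print(f"weight sparsity: {sparsity:.0%}")  # ~90%
```

Note that realizing energy savings from sparsity also requires storage formats and hardware or runtimes that exploit the zeros.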
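For item 2, a hedged sketch of lower-precision computation using PyTorch's autocast; the model and data are dummies, and bfloat16 on CPU is just one possible configuration (on a GPU you would typically use device_type="cuda" with float16 or bfloat16).

```python
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(256, 256), nn.ReLU(), nn.Linear(256, 10))
opt = torch.optim.SGD(model.parameters(), lr=0.01)
x, y = torch.randn(32, 256), torch.randint(0, 10, (32,))

# Matmul-heavy ops inside the autocast region run in bfloat16 instead of FP32
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    loss = nn.functional.cross_entropy(model(x), y)
loss.backward()  # gradients are computed outside the autocast region
opt.step()
print(loss.item())
```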
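For item 3, a minimal sketch of knowledge distillation, one common way to derive a small domain-specific model from a large one: the "student" is trained to match the teacher's softened output distribution. Both networks, the batch, and the temperature are placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(128, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 2.0  # temperature: softens the teacher's distribution

x = torch.randn(32, 128)  # a dummy batch of inputs
with torch.no_grad():
    teacher_probs = F.softmax(teacher(x) / T, dim=-1)
student_log_probs = F.log_softmax(student(x) / T, dim=-1)
# KL divergence between softened distributions, scaled by T^2 as is standard
loss = F.kl_div(student_log_probs, teacher_probs, reduction="batchmean") * T * T
loss.backward()
opt.step()
```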
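And for item 5, a short sketch of reusing an openly available pre-trained model instead of training one from scratch, here via the open-source Hugging Face transformers library (the model name is one public example among many; the call downloads weights on first use).

```python
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Carbon-aware scheduling cut our cluster's emissions."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
```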

Like many sustainability-based decisions, designing your AI projects to reduce their environmental impact isn’t easy. Reducing your energy and carbon footprint takes work, intention, and compromise to make the most responsible choices. But as we see in other sustainability-driven business decisions, even seemingly small adjustments can add up to large collective improvements that reduce carbon emissions and help slow the effects of climate change.

To learn more about how Intel can help you achieve your sustainable computing goals, visit intel.com/sustainability.

This content was produced by Intel. It was not written by MIT Technology Review’s editorial staff.
