Policy Primer - Efficient AI

AUTHORS

Aidan Peppin, Ahmet Üstün, Sudip Roy, Ye Shen, Sara Hooker

ABSTRACT

The growing global demand for digital products and services is increasing energy needs and their associated climate impacts. AI technologies contribute to this challenge because training and deploying AI models requires energy-intensive computing hardware, usually housed in data centers.

One approach to addressing this is improving AI model efficiency: reducing the amount of computing power that AI models require for training and real-world use, while maintaining or increasing their performance.

A range of techniques is emerging to achieve this, spanning model design, pre-training, data efficiency, fine-tuning, model compression, and hardware optimization. While there are trade-offs to navigate, such as ensuring that efforts to reduce model size do not reduce model accuracy, research on model efficiency demonstrates that bigger is not always better, or even necessary, when it comes to AI models.