Feb 06, 2025

Policy Primer - Efficient AI

This policy primer outlines some of the challenges around systematically measuring AI model efficiency and the techniques being developed to improve it. It focuses on work that can be done at the model developer layer, as opposed to the hardware or energy supply layers.

Authors


Aidan Peppin, Ahmet Üstün, Sudip Roy, Ye Shen, Sara Hooker

Abstract


The growing global demand for digital products and services is increasing energy needs and their associated climate impacts. AI technologies are contributing to this challenge, due to the need for energy-intensive computing hardware, usually housed in data centers, to train and deploy AI models. One approach to addressing this is improving AI model efficiency: reducing the amount of computing power that AI models require for training and real-world use, while maintaining or increasing their performance. A range of techniques are emerging to achieve this, across model design, pre-training, data efficiency, fine-tuning, model compression, and hardware optimization. While there are trade-offs to be navigated, such as ensuring efforts to reduce model size do not reduce model accuracy, research focused on increasing model efficiency demonstrates that bigger is not always better – or even necessary – when it comes to AI models.

Related work