The task force recommended creating new standards, metrics, and definitions for energy use and efficiency, and said organizations should track and project data center power usage. It also said that the organizations benefiting most from new infrastructure should bear the brunt of its growing costs, that the electric grid must be modernized and secured, and that AI itself can be used to enhance energy infrastructure, production, and efficiency.
O’Connor detailed several more steps organizations can take that are “aligned with or supported by government recommendations.” In addition to simply using more energy-efficient hardware, some of these include:
- Optimizing models to reduce energy consumption through model pruning, which removes redundant neurons from neural networks to shrink model size and computational load (see the pruning sketch after this list).
- Reducing the numerical precision of AI calculations through a technique known as quantization, which can result in up to 50% computational cost savings (see the quantization sketch after this list).
- Training a smaller model to replicate the behavior of a larger one, a technique known as knowledge distillation, which reduces the need for extensive computational resources (see the distillation sketch after this list).
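To make the pruning item concrete, here is a minimal sketch using PyTorch's built-in pruning utilities. The toy model and the 30% sparsity target are illustrative assumptions, not figures from the report or from O’Connor.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy model standing in for a real network (an assumption for illustration).
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Zero out the 30% of weights with the smallest L1 magnitude in each linear
# layer, reducing the model's effective size and computational load.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # bake the pruning mask into the weights

pruned = sum((m.weight == 0).sum().item()
             for m in model.modules() if isinstance(m, nn.Linear))
total = sum(m.weight.numel()
            for m in model.modules() if isinstance(m, nn.Linear))
print(f"Fraction of weights pruned: {pruned / total:.2%}")
```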
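The quantization item can be sketched just as briefly. This example uses PyTorch's post-training dynamic quantization, which stores weights as 8-bit integers instead of 32-bit floats; the toy model is again an assumption, and actual savings depend on hardware and workload rather than any fixed percentage.

```python
import torch
import torch.nn as nn

# Toy model standing in for a real network (an assumption for illustration).
model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Convert Linear layers to use int8 weights and dynamically quantized
# activations, reducing memory footprint and arithmetic cost at inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

x = torch.randn(1, 784)
print(quantized(x).shape)  # same interface, lower-precision math inside
```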
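Finally, the third item describes knowledge distillation. The sketch below trains a small "student" network to match the softened output distribution of a larger "teacher"; the layer sizes, temperature, and random stand-in data are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Large teacher and small student networks (assumed sizes for illustration).
teacher = nn.Sequential(nn.Linear(784, 1024), nn.ReLU(), nn.Linear(1024, 10))
student = nn.Sequential(nn.Linear(784, 64), nn.ReLU(), nn.Linear(64, 10))

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
temperature = 4.0  # softens the teacher's output distribution

for _ in range(100):              # stand-in for a real training loop
    x = torch.randn(32, 784)      # stand-in for real training data
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)

    # KL divergence between softened teacher and student distributions,
    # scaled by temperature^2 as is conventional for distillation.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * temperature ** 2

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The trained student runs with far fewer parameters than the teacher.
```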
Clear and practical, or 273 pages too long?
The extensive report also touches on other areas, including financial services, healthcare, data privacy, and R&D, and calls for federal preemption when it comes to AI — that is, federal law taking precedence over state law.