The Hidden Cost of Intelligence: The Energy Footprint of AI from Code to GPU Kernels
- Track: AI Plumbers
- Room: UD2.120 (Chavanne)
- Day: Saturday
- Start: 15:30
- End: 15:35
The growing energy demands of modern AI models pose a significant barrier to sustainable computing. As model complexity and deployment scale continue to rise, training and inference increasingly contribute to carbon emissions and operational costs. This talk begins by examining the technical challenges of accurately measuring energy consumption at multiple levels of abstraction, from system-wide and process-level metrics down to individual source-code methods and API calls. Practical strategies for overcoming these measurement hurdles are discussed.

The second part of the talk explores power consumption patterns in GPU kernels, highlighting how thread configuration, block geometry, and power limit settings shape kernel-level energy efficiency. We demonstrate how these characteristics influence power draw and discuss techniques for predicting consumption based on kernel properties. The session concludes with insights and best practices for managing performance–energy trade-offs in GPU-accelerated AI applications, offering a path toward more sustainable AI development.
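As a concrete starting point for the measurement discussion, the sketch below samples instantaneous GPU power through NVML while a workload runs and integrates the samples into a per-workload energy estimate. It assumes an NVIDIA GPU with the `pynvml` bindings installed; the helper names, sampling interval, and the PyTorch matrix-multiply workload are illustrative and not taken from the talk.

```python
import time
import threading

import pynvml


def _sample_power(handle, samples, stop, interval_s=0.05):
    """Append (timestamp, watts) tuples until `stop` is set."""
    while not stop.is_set():
        # nvmlDeviceGetPowerUsage reports instantaneous draw in milliwatts.
        samples.append((time.time(), pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0))
        time.sleep(interval_s)


def measure_gpu_energy(workload, device_index=0):
    """Run `workload()` while sampling GPU power; return an energy estimate in joules."""
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(device_index)
    samples, stop = [], threading.Event()
    sampler = threading.Thread(target=_sample_power, args=(handle, samples, stop))
    sampler.start()
    try:
        workload()
    finally:
        stop.set()
        sampler.join()
        pynvml.nvmlShutdown()
    # Trapezoidal integration of the power samples over time.
    return sum((t1 - t0) * (p0 + p1) / 2.0
               for (t0, p0), (t1, p1) in zip(samples, samples[1:]))


if __name__ == "__main__":
    import torch  # illustrative workload only

    x = torch.randn(4096, 4096, device="cuda")

    def workload():
        for _ in range(200):
            torch.matmul(x, x)
        torch.cuda.synchronize()  # ensure the kernels finish before sampling stops

    print(f"Estimated GPU energy: {measure_gpu_energy(workload):.1f} J")
```

Power-limit experiments of the kind described in the second half of the talk can be scripted around the same sampler, for example by setting a cap with `nvidia-smi -pl <watts>` (root required) before each run and comparing the resulting energy and runtime.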
Speakers
- Tushar Sharma