Anderson, Thomas, Adam Belay, Mosharaf Chowdhury, Asaf Cidon, and Irene Zhang. “Treehouse: A Case For Carbon-Aware Datacenter Software.”
ACM SIGENERGY Energy Informatics Review 3, no. 3 (2023): 64–70.
https://doi.org/10.1145/3630614.3630626.
Chung, Jae-Won, Yile Gu, Insu Jang, Luoxi Meng, Nikhil Bansal, and Mosharaf Chowdhury. “Reducing Energy Bloat in Large Model Training.”
Proceedings of the ACM SIGOPS 30th Symposium on Operating Systems Principles, November 4, 2024, 144–59.
https://doi.org/10.1145/3694715.3695970.
Chung, Jae-Won, Jiachen Liu, Jeff J. Ma, et al. “The ML.ENERGY Benchmark: Toward Automated Inference Energy Measurement and Optimization.” arXiv:2505.06371. Preprint, arXiv, May 9, 2025.
https://doi.org/10.48550/arXiv.2505.06371.
Strubell, Emma, Ananya Ganesh, and Andrew McCallum. “Energy and Policy Considerations for Deep Learning in NLP.” arXiv:1906.02243. Preprint, arXiv, June 5, 2019.
https://doi.org/10.48550/arXiv.1906.02243.
U-M Information and Technology Services. “AI and Sustainability.” April 29, 2025.
https://its.umich.edu/computing/ai/ai-and-sustainability.
Wan, Zhongwei, Xin Wang, Che Liu, et al. “Efficient Large Language Models: A Survey.”
Transactions on Machine Learning Research, January 15, 2024.
https://openreview.net/forum?id=bsCCJHbO8A.
You, Jie, Jae-Won Chung, and Mosharaf Chowdhury. “Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training.” 20th USENIX Symposium on Networked Systems Design and Implementation (NSDI 23), 2023, 119–39.
https://www.usenix.org/conference/nsdi23/presentation/you.