Invited talks at The Coordinated Science Laboratory Student Conference (Feb. 2023), Information Theory and Applications Workshop (Feb. 2023), Information Theory and Data Science Workshop (Jan. 2023), UW Madison SILO (Jan. 2023)
[CLIP] WALIP (connecting two CLIP models for word translation): Findings of EMNLP’22
Invited talks at US-Mexico Workshop on Optimization and its Applications (2023), KAIST AI International Symposium (2022), IOS’22, UCSB CCDC (2022), USC EE (2022), UC Berkeley BLISS (2021)
[Coded computation] Coded-InvNet (coded computation for deep invertible neural networks): ICML’21
Invited talks at Seoul National University (2021), POSTECH AI (2021)
Improving Fair Training under Correlation Shifts. Yuji Roh, Kangwook Lee, Steven Euijong Whang, Changho Suh. arXiv, 2023
Optimizing DDPM Sampling with Shortcut Fine-Tuning. Ying Fan, Kangwook Lee. arXiv, 2023
Looped Transformers as Programmable Computers. Angeliki Giannou, Shashank Rajput, Jy-yong Sohn, Kangwook Lee, Jason D. Lee, Dimitris Papailiopoulos. arXiv, 2023
A Better Way to Decay: Proximal Gradient Training Algorithms for Neural Nets. Liu Yang, Jifan Zhang, Joseph Shenouda, Dimitris Papailiopoulos, Kangwook Lee, Robert D. Nowak. arXiv; NeurIPS’22 OPT Workshop, 2022
Outlier-Robust Group Inference via Gradient Space Clustering. Yuchen Zeng, Kristjan Greenewald, Kangwook Lee, Justin Solomon, Mikhail Yurochkin. arXiv, 2022