Lee Lab @ UW-Madison
Our research focuses on developing more effective and efficient algorithms and systems for deep learning with foundation models. We pursue this goal through theoretical analysis and by devising principled approaches that improve the performance of existing algorithms and systems.
Selected Work on Deep Learning with Foundation Models
Venue | Topic | Title | Summary | Github |
---|---|---|---|---|
NeurIPSW’23 | Theory | The Expressive Power of Low-Rank Adaptation (LoRA) | Summary | Github |
NeurIPSW’23 | LLM | Image Clustering Conditioned on Text Criteria | Summary | Github |
NeurIPSW’23 | LLM | Coded Prompts for Large Language Models | | |
NeurIPSW’23 | CLIP | Zero-shot Improvement of Object Counting with CLIP | | |
NeurIPSW’23 | Diffusion | Super-Resolution Emulation of Large Cosmological Fields with a 3D Conditional Diffusion Model | | |
NeurIPS’23 | Diffusion | Reinforcement learning for improved text-to-image alignment | Summary | Github |
ICML’23 | LLM, Theory | Looped Transformers as Programmable Computers | Summary | |
ICML’23 | Diffusion | Reinforcement learning for faster DDPM sampling | Summary | Github |
ACL’23 (Findings) | LLM | An LLM agent with memory for long-term conversation | Summary | Github |
EMNLP’22 (Findings) | LLM, CLIP | Unsupervised word translation (via connecting two CLIP models) | Summary | Github |
NeurIPS’22 | LLM | LIFT: Language-Interfaced Fine-Tuning for Non-Language Machine Learning Tasks | Summary | Github |
NeurIPS’22 | Diffusion, Theory | Score-based Generative Modeling Secretly Minimizes the Wasserstein Distance | Summary | Github |
ICMLW’23 | CLIP, Theory | Mini-Batch Optimization of Contrastive Loss | | Github |
ICMLW’23, NeurIPSW’23 | LLM | Teaching arithmetic to a small Transformer | Summary | Github |
ICMLW’23 | LLM | A compute-latency trade-off for language model decoding | | |
ICMLW’23 | LLM | A looped-Transformer architecture for efficient meta-learning | | |
ACLW’22 | LLM | Debiasing language models via parameter-efficient fine-tuning | | |
Selected Talks on Deep Learning with Foundation Models
- (Oct. 2023) Trust Perspectives in Machine Learning, Law, and Public Policy at the Institute for Data, Econometrics, Algorithms, and Learning (IDEAL) @ Northwestern University
- (Oct. 2023) Distinguished Lectures in Microbiology @ University of Wisconsin-Madison
- (May 2023) KSEA Distinguished Guest Series
- (Feb. 2023) Information Theory and Applications Workshop
- (Feb. 2023) The Coordinated Science Laboratory Student Conference @ UIUC
- (Jan. 2023) Information Theory and Data Science Workshop @ National University of Singapore
- (Jan. 2023) Systems, Information, Learning and Optimization (SILO) Seminar @ University of Wisconsin-Madison
- (Aug. 2022) Samsung Advanced Institute of Technology
News
- (Sep. 2023) One paper was accepted to NeurIPS’23
- (May 2023) One paper was accepted to ACL’23 (Findings)
- (Apr. 2023) Three papers were accepted to ICML’23