# Lee Lab @ UW Madison

Research focus: Theory and algorithms for *deep learning with foundation models*.

## Selected Work

Venue | Topic/Type | Title | Summary | GitHub |
---|---|---|---|---|
arXiv'24 | LLM/Theory | Dual Operating Modes of In-Context Learning | Summary | GitHub |
arXiv'24 | LLM/Algorithm | Can MLLMs Perform Text-to-Image In-Context Learning? | Summary | GitHub |
ICLR'24 | PEFT/Theory | The Expressive Power of Low-Rank Adaptation (LoRA) | Summary | GitHub |
ICLR'24 | LLM/Algorithm | Image Clustering Conditioned on Text Criteria | Summary | GitHub |
ICLR'24 | LLM/Algorithm | Teaching Arithmetic to a Small Transformer | Summary | GitHub |
ICLR'24 | LLM/Algorithm | A Looped-Transformer Architecture for Efficient Meta-learning | Summary | GitHub |
NeurIPS'23 | Diffusion/Algorithm | Reinforcement learning for improved text-to-image alignment | Summary | GitHub |
ICML'23 | LLM/Theory | Looped Transformers as Programmable Computers | Summary | GitHub |
ICML'23 | Diffusion/Algorithm | Reinforcement learning for faster DDPM sampling | Summary | GitHub |
NeurIPS'22 | LLM/Algorithm | LIFT: Language-Interfaced Fine-Tuning for Non-Language Machine Learning Tasks | Summary | GitHub |
NeurIPS'22 | Diffusion/Theory | Score-based Generative Modeling Secretly Minimizes the Wasserstein Distance | Summary | GitHub |

## Selected Talks on Deep Learning with Foundation Models

- (Apr. 2024) The Johns Hopkins University CIS/MINDS seminar

  Title: *Theoretical Exploration of Foundation Model Adaptation Methods*
- (Mar. 2024) The 58th Annual Conference on Information Sciences and Systems @ Princeton University

  Title: *A Probabilistic Framework for Understanding In-Context Task Learning and Retrieval*
- (Feb. 2024) 2024 Information Theory and Applications Workshop

  Title: *The Expressive Power of Low-Rank Adaptation (LoRA)*
- (Feb. 2024) Foundations of Data Science Virtual Talk Series @ UCSD/NSF TRIPODS Institute on Emerging CORE Methods in Data Science (EnCORE)

  Title: *Theoretical Exploration of Foundation Model Adaptation Methods*
- (Dec. 2023) CSP Seminar @ University of Michigan

  Title: *Towards a Theoretical Understanding of Parameter-Efficient Fine-Tuning (and Beyond)*
- (Nov. 2023) Efficient ML workshop @ Google Research New York

  Title: *The Expressive Power of Low-Rank Adaptation (LoRA)*

## News

- (Mar. 2024) **NSF CAREER Award**

  Our group will develop a unified theory and new algorithms with provable guarantees for learning with frozen pretrained models, also known as foundation models. Huge thanks to NSF and my amazing collaborators and students!
- (Feb. 2024) One paper is accepted to **TMLR**.
- (Jan. 2024) Four papers are accepted to **ICLR'24**.
- (Sep. 2023) One paper is accepted to **NeurIPS'23**.
- (May 2023) One paper is accepted to **ACL'23 (Findings)**.
- (Apr. 2023) Three papers are accepted to **ICML'23**.