
Lee Lab @ UW Madison

Our research focuses on developing more effective and efficient algorithms and systems for deep learning with foundation models. We pursue this goal through theoretical analysis and by devising principled approaches that improve existing algorithms and systems.

Selected Work on Deep Learning with Foundation Models

Venue | Topic | Title | Summary | Github
NeurIPSW’23 | Theory | The Expressive Power of Low-Rank Adaptation (LoRA) | Summary | Github
NeurIPSW’23 | LLM | Image Clustering Conditioned on Text Criteria | Summary | Github
NeurIPSW’23 | LLM | Coded Prompts for Large Language Models | – | –
NeurIPSW’23 | CLIP | Zero-shot Improvement of Object Counting with CLIP | – | –
NeurIPSW’23 | Diffusion | Super-Resolution Emulation of Large Cosmological Fields with a 3D Conditional Diffusion Model | – | –
NeurIPS’23 | Diffusion | Reinforcement learning for improved text-to-image alignment | Summary | Github
ICML’23 | LLM, Theory | Looped Transformers as Programmable Computers | Summary | –
ICML’23 | Diffusion | Reinforcement learning for faster DDPM sampling | Summary | Github
ACL’23 (Findings) | LLM | An LLM agent with memory for long-term conversation | Summary | Github
EMNLP’22 (Findings) | LLM, CLIP | Unsupervised word translation (via connecting two CLIP models) | Summary | Github
NeurIPS’22 | LLM | LIFT: Language-Interfaced Fine-Tuning for Non-Language Machine Learning Tasks | Summary | Github
NeurIPS’22 | Diffusion, Theory | Score-based Generative Modeling Secretly Minimizes the Wasserstein Distance | Summary | Github
ICMLW’23 | CLIP, Theory | Mini-Batch Optimization of Contrastive Loss | – | Github
ICMLW’23, NeurIPSW’23 | LLM | Teaching arithmetic to a small Transformer | Summary | Github
ICMLW’23 | LLM | A compute-latency trade-off for language model decoding | – | –
ICMLW’23 | LLM | A looped-Transformer architecture for efficient meta-learning | – | –
ACLW’22 | LLM | Debiasing language models via parameter-efficient fine-tuning | – | –

Selected Talks on Deep Learning with Foundation Models

  • (Oct. 2023) Trust Perspectives in Machine Learning, Law, and Public Policy at the Institute for Data, Econometrics, Algorithms, and Learning (IDEAL) @ Northwestern University
  • (Oct. 2023) Distinguished Lectures in Microbiology @ University of Wisconsin-Madison
  • (May 2023) KSEA Distinguished Guest Series
  • (Feb. 2023) Information Theory and Applications Workshop
  • (Feb. 2023) The Coordinated Science Laboratory Student Conference @ UIUC
  • (Jan. 2023) Information Theory and Data Science Workshop @ National University of Singapore
  • (Jan. 2023) Systems, Information, Learning and Optimization (SILO) Seminar @ University of Wisconsin-Madison
  • (Aug. 2022) Samsung Advanced Institute of Technology

News

  • (Sep. 2023) One paper accepted to NeurIPS’23
  • (May 2023) One paper accepted to ACL’23 (Findings)
  • (Apr. 2023) Three papers accepted to ICML’23