Institute of Information Science, Academia Sinica




Parameter-efficient Transfer Learning and Its Applications



  • Lecturer: Mr. Yi-Lin Sung (UNC-Chapel Hill, Computer Science)
    Host: Tyng-Luh Liu
  • Time: 2023-01-05 (Thu.) 10:00 – 12:00
  • Location: Auditorium 106 at IIS New Building
Foundation models, which are trained on large-scale data and adapted to many downstream tasks, have demonstrated great success in multiple domains. However, fine-tuning these models is becoming impractical because foundation models have grown rapidly in size over the past few years. Parameter-efficient transfer learning (PETL) is an approach that trains only a small number of parameters yet achieves performance competitive with fine-tuning all parameters, so it can largely reduce the storage cost, as well as the communication cost in distributed training. In this talk, I will introduce several popular PETL methods and their applications, and also present my recent work built upon this technique.
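To make the idea concrete, here is a minimal NumPy sketch of one popular PETL method, a LoRA-style low-rank update (an illustrative example under assumed layer sizes, not code from the talk): the large pretrained weight matrix is frozen, and only two small low-rank factors are trained, so the trainable-parameter count and storage footprint shrink dramatically.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sizes for illustration: a 768x768 Transformer projection, rank-8 update.
d_in, d_out, rank = 768, 768, 8

W = rng.standard_normal((d_out, d_in))        # frozen pretrained weight (never updated)
A = rng.standard_normal((rank, d_in)) * 0.01  # trainable low-rank factor
B = np.zeros((d_out, rank))                   # trainable; zero init so training starts at W

def adapted_forward(x):
    """Forward pass: frozen weight plus the trainable low-rank correction B @ A."""
    return x @ W.T + x @ (B @ A).T

# Only A and B are trained and stored per task, instead of all of W.
full_params = W.size
petl_params = A.size + B.size
print(f"trainable fraction: {petl_params / full_params:.3%}")  # prints: trainable fraction: 2.083%
```

With the zero initialization of `B`, the adapted layer is initially identical to the pretrained one, and each downstream task only needs to store the small `A` and `B` factors.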
Yi-Lin Sung is a Ph.D. student at UNC-Chapel Hill, advised by Mohit Bansal. Previously, he researched deep learning and computer vision at Academia Sinica and National Taiwan University. His research expertise and interests lie in parameter-efficient training and vision-and-language learning. His current research focuses on how to efficiently train and deploy large foundation models for various tasks with limited resources.