Institute of Information Science, Academia Sinica

Events

Parameter-efficient Transfer Learning and Its Applications

  • Speaker: Mr. Yi-Lin Sung (UNC-Chapel Hill, Computer Science)
    Host: 劉庭祿
  • Time: 2023-01-05 (Thu.) 10:00 – 12:00
  • Venue: Auditorium 106, New IIS Building
Abstract
Foundation models, which are trained on large-scale data and adapted to many downstream tasks, have achieved great success across multiple domains. However, because these models have grown rapidly over the past few years, fully fine-tuning them is becoming impractical. Parameter-efficient transfer learning (PETL) updates only a small number of trainable parameters yet achieves performance competitive with full fine-tuning, greatly reducing storage costs as well as communication costs in distributed training. In this talk, I will introduce several popular PETL methods and their applications, and present my recent work built on this technique.
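To illustrate the parameter savings PETL aims at, here is a minimal sketch (an illustration, not code from the talk) of one popular PETL method, a LoRA-style low-rank update: the pretrained weight W is kept frozen, and only two small factor matrices A and B are trained, so the trainable fraction shrinks from d*d to 2*d*r parameters. The dimensions and initialization below are illustrative assumptions.

```python
import numpy as np

d, r = 1024, 8  # hidden size and low rank (illustrative values)

rng = np.random.default_rng(0)
W = rng.standard_normal((d, d))          # frozen pretrained weight
A = rng.standard_normal((d, r)) * 0.01   # trainable low-rank factor
B = np.zeros((r, d))                     # trainable; zero init leaves W unchanged at start

def forward(x):
    # Effective weight is W + A @ B; during tuning only A and B receive gradients.
    return x @ (W + A @ B)

full_params = W.size            # d * d parameters if fully fine-tuned
petl_params = A.size + B.size   # 2 * d * r trainable parameters under PETL
print(f"trainable fraction: {petl_params / full_params:.2%}")
```

With d = 1024 and r = 8, the trainable fraction is about 1.6%, which is why only the small factors need to be stored or communicated per task.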
Bio
Yi-Lin Sung is a Ph.D. student at UNC-Chapel Hill, advised by Mohit Bansal. Previously, he researched deep learning and computer vision at Academia Sinica and National Taiwan University. His research expertise and interests lie in parameter-efficient training and vision-and-language learning. His current focus is on efficiently training and deploying large foundation models for various tasks under limited resources.