Foundation models, which are trained on large-scale data and adapted to many downstream tasks, have demonstrated great success in multiple domains. However, fine-tuning these models can be impractically expensive, since foundation models have grown rapidly in size over the past few years. Parameter-efficient transfer learning (PETL) trains only a small number of parameters yet achieves performance competitive with full fine-tuning, greatly reducing storage and communication costs in distributed training. In this talk, I will introduce several popular PETL methods and their applications, and present my recent work built on this technique.
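To make the idea concrete, here is a minimal sketch of the parameter-efficient principle behind low-rank methods such as LoRA: the pretrained weight stays frozen, and only a small low-rank update is trained. The sizes, initialization, and variable names below are illustrative assumptions, not details from the talk.

```python
import numpy as np

d, r = 768, 8          # hidden size and low-rank bottleneck (assumed values)
rng = np.random.default_rng(0)

W = rng.standard_normal((d, d))        # frozen pretrained weight: never updated
A = rng.standard_normal((d, r)) * 0.01 # trainable low-rank factor
B = np.zeros((r, d))                   # trainable low-rank factor (zero-init)

def forward(x):
    # Effective weight is W + A @ B; only A and B would receive gradients.
    return x @ (W + A @ B)

frozen = W.size
trainable = A.size + B.size
print(f"trainable parameters: {trainable} of {frozen + trainable} "
      f"({trainable / (frozen + trainable):.2%})")
```

Here only about 2% of the parameters are trainable, so per-task storage and the gradients exchanged in distributed training shrink accordingly.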
Yi-Lin Sung is a Ph.D. student at UNC-Chapel Hill, advised by Mohit Bansal. Previously, he researched deep learning and computer vision at Academia Sinica and National Taiwan University. His research expertise and interests lie in parameter-efficient training and vision-and-language learning. His current research focuses on how to efficiently train and deploy large foundation models for various tasks with limited resources.