Institute of Information Science, Academia Sinica

TIGP (SNHCC) -- Energy Management of a Shared Energy Storage System in Residential Community Using Reinforcement Learning

  • Lecturer: Prof. Wei-Yu Chiu (Department of Electrical Engineering, National Tsing Hua University)
    Host: TIGP (SNHCC)
  • Time: 2023-02-20 (Mon.) 14:00 ~ 16:00
  • Location: Auditorium 106 at IIS New Building
Abstract
An energy storage system (ESS) plays a significant role in modern smart grids. Utilizing an ESS is widely regarded as an effective way to balance renewable generation against user demand. In the traditional setting, each user in the residential sector is equipped with an independent ESS. However, owning an ESS imposes extra installation space, a high upfront investment, and maintenance fees on the user. To reduce the cost and increase the efficiency of owning an ESS, the concept of a shared ESS has emerged. Existing energy management methods applicable to a shared ESS include nonlinear programming (NLP), Q-learning, and deep Q-network (DQN) based control methods. However, the data uncertainty induced by renewable generation and power demand cannot be properly addressed by optimization methods such as NLP, and a scenario in which multiple active users share energy through a common ESS does not fit the single-agent framework assumed by conventional Q-learning and DQN based methods. This talk investigates a data-driven energy management method based on a multiagent DQN (MADQN) with prioritized experience replay and epsilon-greedy decrement to solve a charging-discharging scheduling problem with a shared ESS. The method enables active users to learn from important experiences and cooperatively reduce their costs.
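For illustration only, below is a minimal sketch of the kind of pipeline the abstract describes: multiple users cooperatively learning a charge/discharge schedule with epsilon-greedy decrement and prioritized experience replay. It is not the speaker's implementation; tabular Q-learning stands in for the deep Q-networks, and the toy time-of-use price, cost model, and hourly state discretization are invented assumptions used only to keep the example self-contained and runnable.

    import numpy as np

    # Minimal sketch (not the speaker's implementation): two "active users" share
    # an ESS and each learns a charge/discharge schedule with epsilon-greedy decay
    # and prioritized experience replay. Tabular Q-learning stands in for DQN; the
    # toy environment, cost model, and discretization are illustrative assumptions.

    rng = np.random.default_rng(0)

    N_AGENTS = 2          # users sharing the ESS
    N_STATES = 24         # hour of day as the (simplified) state
    N_ACTIONS = 3         # 0 = idle, 1 = charge, 2 = discharge
    EPS_START, EPS_MIN, EPS_DECAY = 1.0, 0.05, 0.995   # epsilon-greedy decrement
    ALPHA, GAMMA = 0.1, 0.95
    BUFFER_SIZE, BATCH = 500, 32

    Q = np.zeros((N_AGENTS, N_STATES, N_ACTIONS))
    buffer, priorities = [], []          # shared prioritized replay buffer

    def toy_cost(hour, actions):
        """Hypothetical cost: charging pays the grid price, discharging offsets demand."""
        price = 1.0 + 0.5 * np.sin(2 * np.pi * hour / 24)   # fake time-of-use price
        cost = 0.0
        for a in actions:
            if a == 1:       # charge: buy from the grid
                cost += price
            elif a == 2:     # discharge: avoid buying from the grid
                cost -= price
        return cost

    eps = EPS_START
    for episode in range(200):
        for hour in range(N_STATES):
            # each agent picks an action epsilon-greedily from its own Q-table
            actions = [
                rng.integers(N_ACTIONS) if rng.random() < eps
                else int(np.argmax(Q[i, hour]))
                for i in range(N_AGENTS)
            ]
            reward = -toy_cost(hour, actions)            # cooperative: shared reward
            next_hour = (hour + 1) % N_STATES

            # store transition with maximal priority so it is replayed at least once
            buffer.append((hour, tuple(actions), reward, next_hour))
            priorities.append(max(priorities, default=1.0))
            if len(buffer) > BUFFER_SIZE:
                buffer.pop(0); priorities.pop(0)

            # prioritized replay: sample transitions proportionally to |TD error|
            if len(buffer) >= BATCH:
                p = np.array(priorities); p /= p.sum()
                idx = rng.choice(len(buffer), size=BATCH, p=p)
                for j in idx:
                    s, acts, r, s2 = buffer[j]
                    td_abs = 0.0
                    for i, a in enumerate(acts):
                        target = r + GAMMA * np.max(Q[i, s2])
                        td = target - Q[i, s, a]
                        Q[i, s, a] += ALPHA * td
                        td_abs += abs(td)
                    priorities[j] = td_abs / N_AGENTS + 1e-6   # refresh priority

        eps = max(EPS_MIN, eps * EPS_DECAY)              # epsilon decrement per episode

    print("Learned greedy schedule per hour:",
          [[int(np.argmax(Q[i, h])) for h in range(N_STATES)] for i in range(N_AGENTS)])

In a full MADQN, each agent's Q-table would be replaced by a neural network trained on the prioritized minibatches, but the control flow of action selection, replay, and epsilon decay remains the same.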
Bio
Wei-Yu Chiu received his Ph.D. degree in communications engineering from National Tsing Hua University (NTHU), Taiwan. He was a Postdoctoral Research Fellow in the Department of Electrical Engineering at Princeton University, Princeton, NJ, USA, and a Visiting Scholar at Oklahoma State University. He is currently an Associate Professor of Electrical Engineering at NTHU. His research interests include multiobjective optimization, computational intelligence, and their applications to smart grids.