Adaptive reinforcement learning for energy management: A progressive approach to boost climate resilience and energy flexibility
(2025) In Advances in Applied Energy 17.
- Abstract
Energy management in urban areas is challenging due to diverse energy users, dynamic environmental conditions, and the added complexity and instability of extreme weather events. We incorporate adaptive reinforcement learning (ARL) into energy management (EM) and introduce a novel approach, called ARLEM. An online, value-based, model-free ARL engine is designed that updates its policy periodically and partially by replacing less favorable actions with those better adapted to evolving environmental conditions. Multiple policy update mechanisms are assessed, varying based on the frequency and length of updates and the action selection criteria. ARLEM is tested to control the energy performance of typical urban blocks in Madrid and Stockholm considering 17 future climate scenarios for 2040–2069. Each block contains 24 buildings of different types and ages. In Madrid, ARLEM is tested for a summer with two heatwaves and in Stockholm for a winter with two cold waves. Three performance indicators are defined to evaluate the effectiveness and resilience of different control approaches during extreme weather events. ARLEM demonstrates an ability to increase climate resilience in the studied blocks by increasing energy flexibility in the network and reducing both average and peak energy demands while affecting indoor thermal comfort only marginally. Since the approach does not require any information about the system dynamics, it can easily cope with the complexities of building systems and technologies, making it an affordable technology to control large urban areas with diverse types of buildings.
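
To make the described policy-update mechanism concrete, the sketch below shows one way an online, value-based, model-free agent with periodic, partial policy updates could be structured. It is a minimal illustration under assumed settings (tabular Q-learning, a placeholder environment, and hypothetical parameters such as update_period and update_fraction), not the ARLEM implementation from the paper.

# Illustrative sketch only (not the authors' code): a tabular, value-based,
# model-free agent that periodically and partially updates its policy by
# replacing the least favorable actions with better-adapted ones.
# n_states, n_actions, update_period and update_fraction are hypothetical.
import numpy as np

n_states, n_actions = 50, 5          # e.g. discretized temperature states, setpoint actions (assumed)
alpha, gamma, epsilon = 0.1, 0.95, 0.1
update_period = 24                   # re-evaluate the policy every 24 steps (assumed)
update_fraction = 0.2                # replace actions in the 20% least favorable states (assumed)

Q = np.zeros((n_states, n_actions))                     # action-value table
policy = np.random.randint(n_actions, size=n_states)    # current greedy policy

def step_env(state, action):
    """Placeholder environment: returns (next_state, reward)."""
    return np.random.randint(n_states), -np.random.rand()

state = 0
for t in range(1, 10_000):
    # epsilon-greedy action selection from the current policy
    action = policy[state] if np.random.rand() > epsilon else np.random.randint(n_actions)
    next_state, reward = step_env(state, action)

    # standard model-free Q-learning update (no system dynamics required)
    Q[state, action] += alpha * (reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state

    # periodic, partial policy update: only the states whose current policy
    # action has the lowest estimated value get that action replaced
    if t % update_period == 0:
        current_values = Q[np.arange(n_states), policy]
        worst = np.argsort(current_values)[: int(update_fraction * n_states)]
        policy[worst] = Q[worst].argmax(axis=1)

In this sketch, varying update_period and update_fraction corresponds to the different policy update mechanisms mentioned in the abstract (frequency and length of updates), while the selection of the "worst" states stands in for the action selection criteria.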
- author
- Nik, Vahid M. (LU) and Javanroodi, Kavan (LU)
- organization
- publishing date
- 2025-03
- type
- Contribution to journal
- publication status
- published
- subject
- keywords
- Adaptive reinforcement learning, Climate resilience, Decentralized control, Energy flexibility, Energy management
- in
- Advances in Applied Energy
- volume
- 17
- article number
- 100213
- publisher
- Elsevier
- external identifiers
- scopus:85215953317
- ISSN
- 2666-7924
- DOI
- 10.1016/j.adapen.2025.100213
- language
- English
- LU publication?
- yes
- additional info
- Publisher Copyright: © 2025
- id
- 27570bfc-5f56-4ec9-8f24-6838de8cf0fb
- date added to LUP
- 2025-02-09 13:18:07
- date last changed
- 2025-04-04 14:29:12
@article{27570bfc-5f56-4ec9-8f24-6838de8cf0fb,
  abstract  = {{Energy management in urban areas is challenging due to diverse energy users, dynamic environmental conditions, and the added complexity and instability of extreme weather events. We incorporate adaptive reinforcement learning (ARL) into energy management (EM) and introduce a novel approach, called ARLEM. An online, value-based, model-free ARL engine is designed that updates its policy periodically and partially by replacing less favorable actions with those better adapted to evolving environmental conditions. Multiple policy update mechanisms are assessed, varying based on the frequency and length of updates and the action selection criteria. ARLEM is tested to control the energy performance of typical urban blocks in Madrid and Stockholm considering 17 future climate scenarios for 2040–2069. Each block contains 24 buildings of different types and ages. In Madrid, ARLEM is tested for a summer with two heatwaves and in Stockholm for a winter with two cold waves. Three performance indicators are defined to evaluate the effectiveness and resilience of different control approaches during extreme weather events. ARLEM demonstrates an ability to increase climate resilience in the studied blocks by increasing energy flexibility in the network and reducing both average and peak energy demands while affecting indoor thermal comfort only marginally. Since the approach does not require any information about the system dynamics, it can easily cope with the complexities of building systems and technologies, making it an affordable technology to control large urban areas with diverse types of buildings.}},
  author    = {{Nik, Vahid M. and Javanroodi, Kavan}},
  issn      = {{2666-7924}},
  keywords  = {{Adaptive reinforcement learning; Climate resilience; Decentralized control; Energy flexibility; Energy management}},
  language  = {{eng}},
  publisher = {{Elsevier}},
  series    = {{Advances in Applied Energy}},
  title     = {{Adaptive reinforcement learning for energy management: A progressive approach to boost climate resilience and energy flexibility}},
  url       = {{http://dx.doi.org/10.1016/j.adapen.2025.100213}},
  doi       = {{10.1016/j.adapen.2025.100213}},
  volume    = {{17}},
  year      = {{2025}},
}