Aldahmashi, Jamal and Ma, Xiandong (2024) Real-time Energy Management in Smart Homes through Deep Reinforcement Learning. IEEE Access, 12, pp. 43155–43172. ISSN 2169-3536
Abstract
With the growing prevalence of distributed energy resources, energy storage systems (ESs), and electric vehicles (EVs) at the residential scale, home energy management (HEM) systems have become instrumental in increasing economic benefits for consumers. These systems have traditionally prioritized curtailing active power consumption, often at the expense of overlooking reactive power. A significant imbalance between active and reactive power can degrade the power factor at the home-to-grid interface. This research presents a strategy designed to optimize the performance of HEM systems so that they not only meet financial and operational goals but also improve the power factor. The approach schedules flexible loads, controls thermostatic loads in line with user preferences, and determines the active and reactive power set-points of both the ES and the EV, thereby maximizing cost savings while improving the power factor. To account for uncertainties in user behavior, renewable energy generation, and outdoor temperature fluctuations, the problem is formulated as a Markov decision process. On this basis, the research develops a model-free HEM system grounded in deep reinforcement learning, which handles the multifaceted nature of smart home settings and enables real-time optimal load scheduling. Comprehensive assessments using real-world datasets validate the approach. Notably, the proposed methodology can raise the power factor from 0.44 to 0.9 and achieve a 31.5% reduction in electricity bills, while upholding consumer satisfaction.
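The abstract's central quantity is the power factor at the home-to-grid interface, which follows from the net active and reactive power exchanged with the grid. The following is a minimal Python sketch of that relationship and of a cost-plus-power-factor reward term such a DRL agent might optimize. The class and function names (GridInterface, reward), the penalty weighting, the power-factor target, and the numerical examples are illustrative assumptions, not values or code taken from the paper; the DRL agent itself is not shown.

```python
import math
from dataclasses import dataclass


@dataclass
class GridInterface:
    """Aggregated power exchange at the home-to-grid connection point."""
    active_kw: float      # net active power P (kW), positive = import from grid
    reactive_kvar: float  # net reactive power Q (kVAr)

    def power_factor(self) -> float:
        """PF = |P| / sqrt(P^2 + Q^2); taken as 1.0 when no power flows."""
        apparent = math.hypot(self.active_kw, self.reactive_kvar)
        return 1.0 if apparent == 0.0 else abs(self.active_kw) / apparent


def reward(interface: GridInterface, price_per_kwh: float, dt_hours: float,
           pf_weight: float = 1.0, pf_target: float = 0.9) -> float:
    """Reward combining electricity cost with a power-factor shortfall penalty.

    pf_weight and pf_target are illustrative assumptions, not the paper's values.
    """
    cost = max(interface.active_kw, 0.0) * price_per_kwh * dt_hours
    pf_penalty = pf_weight * max(pf_target - interface.power_factor(), 0.0)
    return -(cost + pf_penalty)


if __name__ == "__main__":
    # A household drawing mostly reactive power has a poor power factor ...
    before = GridInterface(active_kw=1.0, reactive_kvar=2.0)
    # ... which improves once the ES/EV inverters supply most of the reactive power.
    after = GridInterface(active_kw=1.0, reactive_kvar=0.45)
    print(f"PF before compensation: {before.power_factor():.2f}")  # ~0.45
    print(f"PF after compensation:  {after.power_factor():.2f}")   # ~0.91
    print(f"Reward after: {reward(after, price_per_kwh=0.25, dt_hours=0.5):.3f}")
```

In this reading, the active/reactive set-points of the ES and EV inverters are part of the agent's action, and a reward of this shape would steer the policy toward both lower bills and a healthier power factor; the exact reward design used in the paper may differ.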