Real-time Energy Management in Smart Homes through Deep Reinforcement Learning

Aldahmashi, Jamal and Ma, Xiandong (2024) Real-time Energy Management in Smart Homes through Deep Reinforcement Learning. IEEE Access, 12. 43155 - 43172. ISSN 2169-3536

Text (Main Manuscript2b_accepted)
Main_Manuscript2b_accepted.pdf - Accepted Version
Available under License Creative Commons Attribution.

Download (1MB)

Abstract

In light of the growing prevalence of distributed energy resources, energy storage systems (ESs), and electric vehicles (EVs) at the residential scale, home energy management (HEM) systems have become instrumental in amplifying economic advantages for consumers. These systems traditionally prioritize curtailing active power consumption, often at the expense of overlooking reactive power. A significant imbalance between active and reactive power can detrimentally impact the power factor at the home-to-grid interface. This research presents an innovative strategy designed to optimize the performance of HEM systems, ensuring they not only meet financial and operational goals but also enhance the power factor. The approach involves the strategic operation of flexible loads, careful control of thermostatic loads in line with user preferences, and precise determination of active and reactive power setpoints for both the ES and the EV, thereby maximizing cost savings while improving the power factor. To account for uncertainties in user behaviour, renewable energy generation, and external temperature fluctuations, the problem is modelled as a Markov decision process. Moreover, the research advances a model-free HEM system grounded in deep reinforcement learning, which handles the multifaceted nature of smart home settings and ensures real-time optimal load scheduling. Comprehensive assessments using real-world datasets validate the approach. Notably, the proposed methodology can elevate the power factor from 0.44 to 0.9 and achieve a significant 31.5% reduction in electricity bills while upholding consumer satisfaction.
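For orientation, the power-factor figures quoted above follow directly from the definition PF = |P| / sqrt(P^2 + Q^2) at the home-to-grid interface: dispatching reactive power from the ES and EV inverters reduces the net reactive power Q exchanged with the grid and so raises the power factor. The Python sketch below illustrates this relationship and a hypothetical per-step reward that trades electricity cost against a power-factor shortfall; the specific P/Q figures and the `pf_penalty` weighting are illustrative assumptions, not values or code taken from the paper.

```python
import math

def power_factor(p_kw: float, q_kvar: float) -> float:
    """Power factor at the home-to-grid interface: PF = |P| / sqrt(P^2 + Q^2)."""
    s = math.hypot(p_kw, q_kvar)            # apparent power magnitude (kVA)
    return 1.0 if s == 0 else abs(p_kw) / s

def step_reward(cost: float, p_kw: float, q_kvar: float,
                pf_target: float = 0.9, pf_penalty: float = 0.5) -> float:
    """Hypothetical per-step reward for a DRL scheduling agent: pay the
    electricity cost and penalise any power-factor shortfall below the target."""
    shortfall = max(0.0, pf_target - power_factor(p_kw, q_kvar))
    return -cost - pf_penalty * shortfall

# Illustrative figures only: 2 kW of net active demand with 4.1 kvar of
# uncompensated reactive demand gives PF ~ 0.44; supplying ~3.1 kvar from the
# ES/EV inverters leaves ~1.0 kvar and lifts PF to roughly 0.9.
print(round(power_factor(2.0, 4.1), 2))   # ~0.44
print(round(power_factor(2.0, 1.0), 2))   # ~0.89
```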

Item Type:
Journal Article
Journal or Publication Title:
IEEE Access
Uncontrolled Keywords:
Subjects:
adaptation models; deep reinforcement learning; electricity; energy consumption; energy management; home appliances; optimization; power factor correction; reactive power; real-time systems; scheduling; smart homes; uncertainty; appliances scheduling; deep reinforcement learning; home …
ID Code:
216297
Deposited By:
Deposited On:
14 Mar 2024 10:00
Refereed?:
Yes
Published?:
Published
Last Modified:
04 May 2024 00:23