Explore Deep Reinforcement Learning to Energy-Efficient Data Synchronism in 5G Self-Powered Sensor Networks
Wu, Chunyi1
2023-09-15
Journal | IEEE SENSORS JOURNAL
ISSN | 1530-437X
EISSN | 1558-1748
Volume | 23
Issue | 18
Pages | 20586-20595
Abstract | Recently, explosive growth in data traffic and large-scale Internet of Things (IoT) device connectivity have posed many challenges to 5G mobile communication technology. 5G has relieved the communication delay of network services such as augmented reality, smart grid, disaster warning, and emergency communication, but these services still face a serious energy-conservation challenge. Most traditional optimization methods require complex operations and iterations to reach optimal results, making them unsuitable for communication systems with strict real-time requirements. Motivated by this, this article focuses on green communication in a mobile edge computing (MEC)-based self-powered sensor network and proposes a novel approach called mobile intelligent data synchronization (MIDS) based on deep reinforcement learning (DRL). It exploits the deep deterministic policy gradient (DDPG), continuously interacting with the environment and evaluating the feedback from trial actions to optimize future decision-making on the path-selection scheme for data transmission, and to balance the energy efficiency and energy consumption of mobile devices with self-powered sensors. Experiments show that our approach reduces the additional energy consumption of mobile devices and the base station (BS), and balances the energy consumption of mobile devices with sensor communication. Its superiority for energy conservation in data synchronization is demonstrated by comparison with state-of-the-art methods in an MEC-based self-powered sensor network.
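The DDPG approach summarized in the abstract rests on the deterministic policy gradient: a critic learns Q(s, a) from observed rewards, and the actor is nudged along the critic's action gradient. Below is a minimal sketch of that core update on a toy one-dimensional problem; the environment, the linear actor/critic, and all parameter names are illustrative assumptions for exposition, not the paper's MIDS implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def reward(s, a):
    # Toy bandit reward: maximized when the action cancels the state (a = -s).
    return -(a + s) ** 2

w_a = 0.5                 # linear actor: mu(s) = w_a * s
w_q = np.zeros(3)         # linear critic: Q(s, a) = w_q . [a*a, a*s, s*s]
lr_a, lr_q = 0.01, 0.05   # actor / critic learning rates

for _ in range(5000):
    s = rng.uniform(-1.0, 1.0)
    a = w_a * s + rng.normal(scale=0.2)   # deterministic policy + exploration noise
    r = reward(s, a)

    # Critic step: regress Q(s, a) toward the observed reward.
    phi = np.array([a * a, a * s, s * s])
    err = w_q @ phi - r
    w_q -= lr_q * err * phi

    # Actor step: ascend dQ/da * dmu/dw_a, the deterministic policy gradient.
    dq_da = 2.0 * w_q[0] * a + w_q[1] * s
    w_a += lr_a * dq_da * s

print(round(w_a, 1))   # converges near -1.0, the optimal policy for this toy reward
```

A full DDPG adds neural function approximators, a replay buffer, and slowly updated target networks for stability; the sketch keeps only the actor-critic update that drives the path-selection optimization described above.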
Keywords | 5G mobile communication systems; Augmented reality; Decision making; Deep learning; Energy efficiency; Energy utilization; Internet of things; Mobile edge computing; Real time systems; Sensor networks; Synchronization; Data synchronization; Data traffic growth; Deep reinforcement learning; Emergency communication; Energy efficient; Energy consumption; Reinforcement learning; Self-powered; Self-powered sensor; Sensor network
DOI | 10.1109/JSEN.2022.3221983
Indexed By | EI; SCIE
Language | English
WoS Research Areas | Engineering; Instruments & Instrumentation; Physics
WoS Categories | Engineering, Electrical & Electronic; Instruments & Instrumentation; Physics, Applied
WoS Accession Number | WOS:001090399700012
Publisher | Institute of Electrical and Electronics Engineers Inc.
EI Accession Number | 20225113274231
Original Document Type | Journal article (JA)
Document Type | Journal article
Identifier | https://ir.cqcet.edu.cn/handle/39TD4454/18054
Collections | School of Smart Health; School of Artificial Intelligence and Big Data; School of Electronics and IoT
Affiliations | 1. Big Data and Optimization Research Institute, Chongqing College of Electronic Engineering, Chongqing 401331, China; 2. Chongqing No.1 Middle School, Chongqing 401331, China
First Author Affiliation | Chongqing College of Electronic Engineering
Recommended Citation (GB/T 7714) | Wu, Chunyi, Zhao, Yanhua, Xiao, Shan, et al. Explore Deep Reinforcement Learning to Energy-Efficient Data Synchronism in 5G Self-Powered Sensor Networks[J]. IEEE SENSORS JOURNAL, 2023, 23(18): 20586-20595.
APA | Wu, Chunyi, Zhao, Yanhua, Xiao, Shan, & Gao, Chao. (2023). Explore Deep Reinforcement Learning to Energy-Efficient Data Synchronism in 5G Self-Powered Sensor Networks. IEEE SENSORS JOURNAL, 23(18), 20586-20595.
MLA | Wu, Chunyi, et al. "Explore Deep Reinforcement Learning to Energy-Efficient Data Synchronism in 5G Self-Powered Sensor Networks". IEEE SENSORS JOURNAL 23.18 (2023): 20586-20595.
Files in This Item | No files associated with this item.
Unless otherwise noted, all items in this repository are protected by copyright, with all rights reserved.