Explore Deep Reinforcement Learning to Energy-Efficient Data Synchronism in 5G Self-Powered Sensor Networks
Wu, Chunyi (1); Zhao, Yanhua (2); Xiao, Shan (1); Gao, Chao (1)
2023-09-15
Journal: IEEE SENSORS JOURNAL
ISSN: 1530-437X
EISSN: 1558-1748
Volume: 23; Issue: 18; Pages: 20586-20595
Abstract: Recently, explosive data traffic growth and large-scale Internet of Things (IoT) device connectivity have brought many challenges to 5G mobile communication technology. 5G has relieved the communication delay of network services such as augmented reality, smart grid, disaster warning, and emergency communication. However, these services still face the serious challenge of energy conservation. Most traditional optimization methods require complex operations and iterations to reach optimal results, which makes them unsuitable for communication systems with strict real-time requirements. Based on this argument, this article focuses on green communication in a mobile edge computing (MEC)-based self-powered sensor network and proposes a novel approach called mobile intelligent data synchronization (MIDS) based on deep reinforcement learning (DRL). It exploits the deep deterministic policy gradient (DDPG), continuously interacting with the environment and evaluating its feedback through trial actions to optimize future decision-making on the path-selection scheme for data transmission, as well as to achieve energy efficiency and energy-consumption balance for mobile devices with self-powered sensors. Experiments show the capability of the approach in reducing the additional energy consumption of mobile devices and the base station (BS), as well as balancing the energy consumption of mobile devices engaged in sensor communication. The superiority of the approach for energy conservation in data synchronization is demonstrated by comparison with state-of-the-art methods in an MEC-based self-powered sensor network.
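The mechanism named in the abstract is a DDPG agent that learns a path-selection policy from environment feedback. The sketch below is a minimal, generic DDPG update loop in PyTorch, not the paper's MIDS implementation: the state/action dimensions, network sizes, replay handling, and all numeric constants are assumptions chosen for illustration, and the paper's energy-aware reward and path-selection action space are not reproduced.

# Minimal DDPG sketch (assumed setup, not the paper's MIDS formulation).
import random
from collections import deque

import torch
import torch.nn as nn

STATE_DIM, ACTION_DIM = 8, 3      # hypothetical dimensions, not from the paper
GAMMA, TAU, BATCH = 0.99, 0.005, 64

def mlp(in_dim, out_dim, out_act=None):
    layers = [nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, out_dim)]
    if out_act is not None:
        layers.append(out_act)
    return nn.Sequential(*layers)

actor = mlp(STATE_DIM, ACTION_DIM, nn.Tanh())      # deterministic policy mu(s)
critic = mlp(STATE_DIM + ACTION_DIM, 1)            # action-value Q(s, a)
actor_t = mlp(STATE_DIM, ACTION_DIM, nn.Tanh())    # slowly-updated target networks
critic_t = mlp(STATE_DIM + ACTION_DIM, 1)
actor_t.load_state_dict(actor.state_dict())
critic_t.load_state_dict(critic.state_dict())
opt_a = torch.optim.Adam(actor.parameters(), lr=1e-4)
opt_c = torch.optim.Adam(critic.parameters(), lr=1e-3)
replay = deque(maxlen=100_000)

# Example of storing one transition (all values are dummies):
# replay.append((torch.randn(STATE_DIM), torch.randn(ACTION_DIM),
#                torch.tensor([0.0]), torch.randn(STATE_DIM)))

def update():
    """One DDPG gradient step from a random minibatch of stored transitions."""
    if len(replay) < BATCH:
        return
    s, a, r, s2 = map(torch.stack, zip(*random.sample(replay, BATCH)))
    # Critic: regress Q(s, a) toward the bootstrapped target r + gamma * Q'(s', mu'(s')).
    with torch.no_grad():
        target = r + GAMMA * critic_t(torch.cat([s2, actor_t(s2)], dim=1))
    critic_loss = nn.functional.mse_loss(critic(torch.cat([s, a], dim=1)), target)
    opt_c.zero_grad(); critic_loss.backward(); opt_c.step()
    # Actor: ascend Q(s, mu(s)), i.e. prefer actions the critic rates highly.
    actor_loss = -critic(torch.cat([s, actor(s)], dim=1)).mean()
    opt_a.zero_grad(); actor_loss.backward(); opt_a.step()
    # Polyak-average the target networks for stable bootstrapping.
    for net, net_t in ((actor, actor_t), (critic, critic_t)):
        for p, p_t in zip(net.parameters(), net_t.parameters()):
            p_t.data.mul_(1 - TAU).add_(TAU * p.data)

In the paper's setting, the state would encode quantities such as device energy levels and link conditions, the action a choice of transmission path, and the reward the resulting energy cost; those mappings are specific to the MIDS formulation and are only hinted at here.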
Keywords: 5G mobile communication systems; Augmented reality; Decision making; Deep learning; Energy efficiency; Energy utilization; Internet of things; Mobile edge computing; Real time systems; Sensor networks; Synchronization; Data synchronization; Data traffic growth; Deep reinforcement learning; Emergency communication; Energy efficient; Energy-consumption; Reinforcement learnings; Self-powered; Self-powered sensor; Sensors network
DOI: 10.1109/JSEN.2022.3221983
Indexed by: EI; SCIE
Language: English
WOS Research Areas: Engineering; Instruments & Instrumentation; Physics
WOS Categories: Engineering, Electrical & Electronic; Instruments & Instrumentation; Physics, Applied
WOS Record Number: WOS:001090399700012
Publisher: Institute of Electrical and Electronics Engineers Inc.
EI Accession Number: 20225113274231
Original Document Type: Journal article (JA)
Citation Statistics
Times Cited (WOS): 2
Document Type: Journal article
Identifier: https://ir.cqcet.edu.cn/handle/39TD4454/18054
Collections: School of Smart Health; School of Artificial Intelligence and Big Data; School of Electronics and Internet of Things
Affiliations: 1. Big Data and Optimization Research Institute, Chongqing College of Electronic Engineering, Chongqing 401331, China;
2. Chongqing No.1 Middle School, Chongqing 401331, China
First Author Affiliation: Chongqing College of Electronic Engineering
Recommended Citation
GB/T 7714: Wu, Chunyi, Zhao, Yanhua, Xiao, Shan, et al. Explore Deep Reinforcement Learning to Energy-Efficient Data Synchronism in 5G Self-Powered Sensor Networks[J]. IEEE SENSORS JOURNAL, 2023, 23(18): 20586-20595.
APA: Wu, Chunyi, Zhao, Yanhua, Xiao, Shan, & Gao, Chao. (2023). Explore Deep Reinforcement Learning to Energy-Efficient Data Synchronism in 5G Self-Powered Sensor Networks. IEEE SENSORS JOURNAL, 23(18), 20586-20595.
MLA: Wu, Chunyi, et al. "Explore Deep Reinforcement Learning to Energy-Efficient Data Synchronism in 5G Self-Powered Sensor Networks". IEEE SENSORS JOURNAL 23.18 (2023): 20586-20595.
Files in This Item: There are no files associated with this item.

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.