Abstract:
To achieve low-latency, low-energy offshore communication, a dynamic service cache update mechanism is introduced into a complex neural network. The network structure is carefully designed for the offshore communication scenario, and a mobile edge dynamic service caching policy (MEDSCP) based on the double deep Q network (DDQN) is proposed. The policy first obtains the optimal offloading decision set through a task offloading decision game among user terminals, and then uses mobile edge computing (MEC) and dynamic service cache updates to reduce the weighted delay and energy cost of task execution in the offshore communication environment, aiming to improve task processing efficiency in offshore communication and expand the industry's development potential. Simulation results show that, compared with existing work, the proposed MEDSCP strategy achieves fast algorithm convergence while preserving training performance, and effectively reduces the weighted sum of delay and energy consumption in offshore communication.
Keywords: offshore communication; deep reinforcement learning; mobile edge computing; game theory; dynamic caching
| DOI:10.20079/j.issn.1001-893x.240618004 |
|
Funding: Beijing Natural Science Foundation - Haidian Original Innovation Joint Fund (L212026); General Project of the Science and Technology Program of the Beijing Municipal Education Commission (KM202211232011)
|
| Dynamic Service Caching Strategy for Mobile Edge Computing to Aid Offshore Communications |
| ZHUANG Yingyu,PAN Chunyu,LI Xuehua |
| (Key Laboratory of Modern Measurement & Control Technology,Ministry of Information Industry,Beijing Information Science and Technology University,Beijing 100101,China) |
Abstract:
In order to achieve low-latency and low-energy offshore communication, a dynamic service cache update mechanism is introduced into a complex neural network, and a mobile edge dynamic service caching policy (MEDSCP) based on the double deep Q network (DDQN) is proposed by carefully designing the neural network structure for offshore communication scenarios. The policy first obtains the optimal offloading decision set through a task offloading decision game among user terminals, and then utilizes mobile edge computing (MEC) and dynamic service cache updates to reduce the delay and energy cost of task execution in the offshore communication environment, aiming to improve the efficiency of task processing in offshore communication and to expand the development potential of this industry. Simulation results show that, compared with existing work, the proposed MEDSCP strategy achieves fast convergence of the algorithm while guaranteeing the training effect, and effectively reduces the delay-energy weighted sum of offshore communications.
Key words: offshore communication; deep reinforcement learning; mobile edge computing; game theory; dynamic caching
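The abstract's core learning component is the double deep Q network, whose distinguishing feature is decoupling action selection (online network) from action evaluation (target network) to reduce Q-value overestimation. The sketch below is a minimal, illustrative computation of DDQN training targets for a batch of transitions; the function name, toy Q-values, and discount factor are assumptions for illustration, not the paper's implementation or parameters.

```python
import numpy as np

def ddqn_targets(rewards, next_q_online, next_q_target, gamma=0.9, dones=None):
    """Double DQN targets: the online network selects the next action,
    the target network evaluates it. Vanilla DQN would instead take
    max over the target network, which tends to overestimate values."""
    rewards = np.asarray(rewards, dtype=float)
    if dones is None:
        dones = np.zeros_like(rewards, dtype=bool)
    # Action selection by the online network
    best_actions = np.argmax(next_q_online, axis=1)
    # Action evaluation by the target network
    next_values = next_q_target[np.arange(len(rewards)), best_actions]
    # Terminal transitions contribute only the immediate reward
    return rewards + gamma * next_values * (~dones)

# Toy example: batch of 2 transitions, 3 caching/offloading actions
r = [1.0, 0.5]
q_online = np.array([[0.2, 0.9, 0.1], [0.3, 0.2, 0.8]])
q_target = np.array([[0.5, 0.4, 0.6], [0.1, 0.7, 0.2]])
targets = ddqn_targets(r, q_online, q_target, gamma=0.9)
# Online net picks actions [1, 2]; target net values them as [0.4, 0.2],
# so targets = [1.0 + 0.9*0.4, 0.5 + 0.9*0.2] = [1.36, 0.68]
```

In a caching context, each discrete action would correspond to an offloading or cache-update choice, and the (negative) weighted delay-energy cost would serve as the reward.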