Abstract:
To accurately estimate the channel in reconfigurable intelligent surface (RIS)-assisted communication systems and to address the problem of excessive channel estimation overhead, a channel estimation scheme based on the fast super-resolution convolutional neural network (FSRCNN) is proposed. In the initial stage of channel estimation, a subset of reflecting elements is switched off and the channel is estimated with only a small number of pilot signals; the estimate is treated as a low-precision, low-resolution (LR) image and expanded by linear interpolation into a high-resolution (HR) image that is still of low precision. The FSRCNN is then used to improve the precision of the estimate, and a convolutional neural network-based deep residual network (CDRN) denoising model further improves the estimation accuracy. Numerical results show that, compared with the baseline schemes, the proposed scheme obtains more accurate channel estimates while keeping the channel estimation overhead low.
Keywords: reconfigurable intelligent surface; channel estimation; super-resolution network; deep residual network
DOI:10.20079/j.issn.1001-893x.231116005 |
|
Funding: General Program of the Natural Science Foundation of Chongqing (cstc2021jcyj-msxmX0761); Key Research and Development Program of Guangxi (AB24010317)
|
An RIS Channel Estimation Method Based on Super-Resolution Networks |
GAN Chenquan a,b, GUO Yuhang
(a. School of Communication and Information Engineering; b. School of Cyber Security and Information Law, Chongqing University of Posts and Telecommunications, Chongqing 400065, China)
Abstract: |
To precisely estimate the channel in reconfigurable intelligent surface (RIS)-assisted communication systems and to solve the issue of high channel estimation overhead, the authors propose a channel estimation scheme based on the fast super-resolution convolutional neural network (FSRCNN). In the initial phase of channel estimation, a subset of reflecting elements is deactivated and a limited number of pilot signals are used to estimate the channel; the estimation outcomes are treated as low-precision, low-resolution (LR) images. These LR images are subsequently upscaled through linear interpolation to high-resolution (HR) images that remain of low precision. The precision of the estimation outcomes is then enhanced by the FSRCNN and further improved by the convolutional neural network-based deep residual network (CDRN). Numerical results demonstrate that the proposed scheme achieves more accurate estimation results than the baseline methods while keeping the channel estimation overhead low.
Key words: reconfigurable intelligent surface (RIS); channel estimation; super-resolution network; deep residual network
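As a rough illustration of the pipeline described in the abstract (coarse LR estimate → interpolation to HR → FSRCNN refinement → CDRN denoising), the following is a minimal PyTorch-style sketch. The stacking of the real and imaginary parts of the channel matrix as two image channels, all layer widths and depths, and the omission of FSRCNN's deconvolution head (interpolation to HR is performed beforehand, as described above) are assumptions made for illustration only, not the authors' configuration.

```python
# Illustrative sketch only: layer sizes and network details are assumptions,
# not the configuration used in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FSRCNNRefiner(nn.Module):
    """FSRCNN-style refinement of an interpolated (HR, low-precision) channel image.
    Real/imaginary parts of the channel matrix are stacked as 2 input channels."""
    def __init__(self, in_ch=2, d=56, s=12, m=4):
        super().__init__()
        layers = [nn.Conv2d(in_ch, d, 5, padding=2), nn.PReLU(d),   # feature extraction
                  nn.Conv2d(d, s, 1), nn.PReLU(s)]                  # shrinking
        for _ in range(m):                                          # non-linear mapping
            layers += [nn.Conv2d(s, s, 3, padding=1), nn.PReLU(s)]
        layers += [nn.Conv2d(s, d, 1), nn.PReLU(d),                 # expanding
                   nn.Conv2d(d, in_ch, 5, padding=2)]               # reconstruction
        self.net = nn.Sequential(*layers)

    def forward(self, x):
        return self.net(x)

class CDRNDenoiser(nn.Module):
    """CDRN-style residual denoiser: the CNN predicts the noise component,
    which is subtracted from its input (residual learning)."""
    def __init__(self, in_ch=2, width=64, depth=6):
        super().__init__()
        body = [nn.Conv2d(in_ch, width, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            body += [nn.Conv2d(width, width, 3, padding=1),
                     nn.BatchNorm2d(width), nn.ReLU(inplace=True)]
        body += [nn.Conv2d(width, in_ch, 3, padding=1)]
        self.body = nn.Sequential(*body)

    def forward(self, x):
        return x - self.body(x)    # estimate = noisy input minus predicted noise

def estimate_channel(h_lr, hr_size, refiner, denoiser):
    """h_lr: (batch, 2, N_lr, M_lr) coarse estimate from the active elements and
    few pilots; hr_size: (N, M) dimensions of the full channel matrix."""
    h_interp = F.interpolate(h_lr, size=hr_size, mode='bilinear',
                             align_corners=False)   # LR -> HR, still low precision
    h_refined = refiner(h_interp)                   # FSRCNN-style precision enhancement
    return denoiser(h_refined)                      # CDRN-style noise removal

if __name__ == "__main__":
    refiner, denoiser = FSRCNNRefiner(), CDRNDenoiser()   # untrained, demo only
    coarse = torch.randn(1, 2, 8, 16)                     # toy LR estimate
    h_hat = estimate_channel(coarse, (32, 64), refiner, denoiser)
    print(h_hat.shape)                                    # torch.Size([1, 2, 32, 64])
```

In practice the two networks would be trained (e.g. against full-pilot estimates as labels) before being used in this forward pass; the snippet only shows how the interpolation, refinement, and denoising stages compose.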