Cite this article
  • ZHAO Li-quan, CAI Bang-gui. Improved extended mutual information separation algorithm[J]. Telecommunication Engineering, 2013, 53(4).
  • ZHAO Li-quan, CAI Bang-gui. Extended mutual information separation algorithm based on multiple hidden layers[J]. Telecommunication Engineering, 2013, 53(4).



Improved Extended Mutual Information Separation Algorithm
ZHAO Li-quan, CAI Bang-gui
(School of Information Engineering, Northeast Electric Power University, Jilin 132012, China)
Abstract:
The extended mutual information separation algorithm uses a single-hidden-layer neural network to approximate the nonlinear function in the algorithm's cost function; its adjustable parameters are limited, and many iterations are needed to converge, resulting in a slow convergence speed. To address this problem, a double-hidden-layer neural network is used to approximate the nonlinear function, the minimum mutual information of the separation results is taken as the cost function, and the cost function is optimized by the gradient descent method, which increases the number of adjustable parameters. Simulation results show that, compared with the original algorithm, the improved algorithm converges faster and has a smaller error.
Key words:  nonlinear independent component analysis  extended mutual information separation algorithm  multilayer perceptron  double hidden layer neural network
DOI:
Foundation item: Jilin Province Science and Technology Development Plan Project (201101110)
Extended mutual information separation algorithm based on multiple hidden layers
ZHAO Li-quan, CAI Bang-gui
(School of Information Engineering, Northeast Electric Power University, Jilin 132012, China)
Abstract:
The extended mutual information separation (EMISEP) algorithm uses a single-hidden-layer neural network to approximate the nonlinear function in its cost function, so the adjustable parameters are limited and many iterations are needed to converge, which leads to a relatively slow convergence speed. To overcome this problem, this paper uses a double-hidden-layer perceptron to approximate the nonlinear function and takes the minimum mutual information of the separated signals as the cost function, which is optimized by the gradient descent method. This increases the number of adjustable parameters. Simulation results show that the improved algorithm has a faster convergence speed and a smaller error compared with the original algorithm.
Key words:  nonlinear independent component analysis  extended mutual information separation algorithm  multilayer perceptron  double hidden layer neural network
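
The page gives no source code, so the following is only a minimal sketch of the structure the abstract describes, under stated assumptions: a post-nonlinear mixing model, a shared scalar MLP with two hidden layers as the nonlinear stage of the demixing map, an Infomax-style negative log-likelihood used as a surrogate for the mutual-information cost, and plain gradient descent (SGD). The network sizes, source prior, and test signals are illustrative choices, not the authors' exact EMISEP formulation.

# Minimal sketch (not the paper's exact EMISEP algorithm): post-nonlinear ICA with a
# shared double-hidden-layer MLP as the nonlinear stage, trained by gradient descent
# on an Infomax-style surrogate of the mutual-information cost. Sizes and signal
# models below are illustrative assumptions.
import torch

torch.manual_seed(0)
N, d = 2000, 2

# Independent sources: a sine wave and uniform noise (illustrative choice).
t = torch.linspace(0, 20, N)
s = torch.stack([torch.sin(2 * t), torch.rand(N) * 2 - 1], dim=1)

# Post-nonlinear mixture: x = f(A s) with an element-wise distortion f.
A = torch.tensor([[1.0, 0.6], [0.5, 1.0]])
x = torch.tanh(s @ A.T)                       # observed signals, shape (N, d)

# Demixing: y = W g(x), where g is a scalar MLP with two hidden layers,
# applied element-wise and shared across channels.
g = torch.nn.Sequential(
    torch.nn.Linear(1, 8), torch.nn.Tanh(),
    torch.nn.Linear(8, 8), torch.nn.Tanh(),
    torch.nn.Linear(8, 1),
)
W = torch.nn.Parameter(torch.eye(d))
opt = torch.optim.SGD(list(g.parameters()) + [W], lr=0.05)

for step in range(2000):
    xv = x.clone().requires_grad_(True)
    u = g(xv.reshape(-1, 1)).reshape(N, d)    # element-wise nonlinear stage
    y = u @ W.T                               # linear unmixing stage

    # log|det Jacobian| of the demixing map = log|det W| + sum_i log|g'(x_i)|,
    # valid because g acts element-wise; g' is obtained from autograd.
    du_dx = torch.autograd.grad(u.sum(), xv, create_graph=True)[0]
    log_det = torch.slogdet(W)[1] + torch.log(du_dx.abs() + 1e-8).sum(dim=1)

    # Super-Gaussian source prior, log p(y) = -log cosh(y) up to a constant;
    # minimizing this negative log-likelihood is a mutual-information surrogate.
    log_prior = -torch.log(torch.cosh(y)).sum(dim=1)
    loss = -(log_det + log_prior).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 500 == 0:
        print(f"step {step:4d}  loss {loss.item():.4f}")

The double hidden layer only changes the capacity of g; the cost and the gradient-descent update are otherwise the same as in the single-hidden-layer case, which is the comparison the abstract draws.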