Journal Article Details
Computer Science and Information Systems
Superior Performance of Using Hyperbolic Sine Activation Functions in ZNN Illustrated via Time-Varying Matrix Square Roots Finding
Yunong Zhang [1]
[1] School of Information Science and Technology, Sun Yat-sen University
Keywords: Zhang neural network; global exponential convergence; hyperbolic sine activation functions; time-varying matrix square roots; implementation errors
DOI: 10.2298/CSIS120121043Z
Subject Classification: Social Sciences, Humanities and Arts (General)
Source: Computer Science and Information Systems
【 Abstract 】

A special class of recurrent neural network (RNN), termed the Zhang neural network (ZNN) and depicted in implicit dynamics, has recently been proposed for the online solution of time-varying matrix square roots. Such a ZNN model can be constructed by using monotonically increasing odd activation functions so as to obtain the theoretical time-varying matrix square roots in an error-free manner. Different choices of activation-function arrays may lead to different performance of the ZNN model. Generally speaking, the ZNN model using hyperbolic sine activation functions may achieve better performance than those using other activation functions. In this paper, to pursue superior convergence and robustness properties, hyperbolic sine activation functions are applied to the ZNN model for the online solution of time-varying matrix square roots. Theoretical analysis and computer-simulation results further demonstrate the superior performance of the ZNN model using hyperbolic sine activation functions in the presence of large model-implementation errors, in comparison with that using linear activation functions.
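To make the design formula behind the abstract concrete, the following minimal Python sketch simulates ZNN-style implicit dynamics for X(t)^2 = A(t) with an element-wise hyperbolic sine activation. The particular A(t), gain gamma, step size, initial state, Euler integration, and helper names (znn_sqrt_step, B) are illustrative assumptions, not the paper's actual simulation setup.

import numpy as np
from scipy.linalg import solve_sylvester

# ZNN design formula for X(t)^2 = A(t): define the error E(t) = X(t)^2 - A(t)
# and force dE/dt = -gamma * Phi(E(t)) with an element-wise odd activation Phi.
# Expanding dE/dt yields the implicit dynamics
#     Xdot(t) X(t) + X(t) Xdot(t) = Adot(t) - gamma * Phi(X(t)^2 - A(t)),
# which is a Sylvester equation in Xdot(t) at every instant.

def znn_sqrt_step(X, A_t, Adot_t, gamma=10.0, phi=np.sinh):
    """Return Xdot from the implicit ZNN dynamics (hypothetical helper).

    phi=np.sinh is the hyperbolic sine activation; phi=lambda e: e
    recovers the linear activation for comparison.
    """
    rhs = Adot_t - gamma * phi(X @ X - A_t)
    # Solve X @ Xdot + Xdot @ X = rhs for Xdot.
    return solve_sylvester(X, X, rhs)

# Illustrative target: A(t) = B(t) @ B(t) with B(t) known, so the residual
# ||X(t)^2 - A(t)|| can be monitored directly.
def B(t):
    return np.array([[3.0 + np.sin(t), np.cos(t)],
                     [np.cos(t),       3.0 - np.sin(t)]])

def A(t):
    return B(t) @ B(t)

def Adot(t, h=1e-6):
    return (A(t + h) - A(t - h)) / (2.0 * h)   # numerical time derivative

# Crude forward-Euler integration of the neural dynamics.
dt, T = 1e-3, 5.0
X = B(0.0) + 0.3 * np.eye(2)        # perturbed initial state
for k in range(int(T / dt)):
    t = k * dt
    X = X + dt * znn_sqrt_step(X, A(t), Adot(t))

print("residual ||X^2 - A(T)||_F =", np.linalg.norm(X @ X - A(T)))

Swapping phi=np.sinh for the identity function in this sketch gives a quick side-by-side impression of the hyperbolic sine versus linear activation behavior discussed in the abstract.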

【 License 】

CC BY-NC-ND   

【 Preview 】
Attachment List
File  Size  Format
RO201904021858799ZK.pdf  263KB  PDF
Literature Evaluation Metrics
Downloads: 2    Views: 0