Journal Article Details
Advances in Difference Equations
Convergence analysis of gradient-based iterative algorithms for a class of rectangular Sylvester matrix equations based on Banach contraction principle
article
Kittisopaporn, Adisorn [1]; Chansangiam, Pattrawut [1]; Lewkeeratiyutkul, Wicharn [2]
[1] Department of Mathematics, Faculty of Science, King Mongkut’s Institute of Technology Ladkrabang; [2] Department of Mathematics and Computer Science, Faculty of Science, Chulalongkorn University
Keywords: Generalized Sylvester matrix equation; Gradient; Linear difference vector equation; Banach contraction principle; Kronecker product; Matrix norms
DOI: 10.1186/s13662-020-03185-9
Subject classification: Aerospace science
Source: SpringerOpen
【 Abstract 】

We derive an iterative procedure for solving the generalized Sylvester matrix equation $AXB+CXD=E$, where $A,B,C,D,E$ are conforming rectangular matrices. Our algorithm is based on gradients and the hierarchical identification principle. We convert the matrix iteration process into a first-order linear difference vector equation with a matrix coefficient. The Banach contraction principle reveals that the sequence of approximate solutions converges to the exact solution for any initial matrix if and only if the convergence factor belongs to an open interval. The contraction principle also gives the convergence rate and the error analysis, both governed by the spectral radius of the associated iteration matrix. We obtain the fastest convergence factor, i.e., the one that minimizes the spectral radius of the iteration matrix. In particular, we obtain iterative algorithms for the matrix equation $AXB=C$, the Sylvester equation, and the Kalman–Yakubovich equation. We give numerical experiments to illustrate the applicability, effectiveness, and efficiency of the proposed algorithm.
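As a rough illustration of the setting described in the abstract (not a reproduction of the paper's exact algorithm, which is built on the hierarchical identification principle), the sketch below runs a plain gradient iteration for $AXB+CXD=E$ and picks the convergence factor from the singular values of the Kronecker-product matrix $S = B^{\mathsf{T}}\otimes A + C^{\mathsf{T}}? $ — here taken as $S = B^{\mathsf{T}}\otimes A + D^{\mathsf{T}}\otimes C$, which arises from vectorizing the equation. The matrix sizes, the $2/(\sigma_{\min}^2+\sigma_{\max}^2)$ step-size choice, and the helper name `gradient_sylvester` are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def gradient_sylvester(A, B, C, D, E, mu, iters=2000):
    """Gradient-type iteration X <- X + mu*(A^T R B^T + C^T R D^T),
    where R = E - A X B - C X D is the current residual (a sketch,
    not the paper's hierarchical-identification algorithm)."""
    X = np.zeros((A.shape[1], B.shape[0]))  # unknown X is m x n
    for _ in range(iters):
        R = E - A @ X @ B - C @ X @ D                 # residual
        X = X + mu * (A.T @ R @ B.T + C.T @ R @ D.T)  # gradient step
    return X

# Build a consistent test problem from a known solution X_true.
rng = np.random.default_rng(0)
A = np.eye(3, 2) + 0.1 * rng.standard_normal((3, 2))
B = np.eye(2, 3) + 0.1 * rng.standard_normal((2, 3))
C = np.eye(3, 2) + 0.1 * rng.standard_normal((3, 2))
D = np.eye(2, 3) + 0.1 * rng.standard_normal((2, 3))
X_true = rng.standard_normal((2, 2))
E = A @ X_true @ B + C @ X_true @ D

# Convergence factor from the Kronecker form: the vectorized iteration
# matrix is I - mu * S^T S, so mu must lie in (0, 2/sigma_max^2);
# mu = 2/(sigma_min^2 + sigma_max^2) minimizes its spectral radius.
S = np.kron(B.T, A) + np.kron(D.T, C)
s = np.linalg.svd(S, compute_uv=False)
mu = 2.0 / (s[0] ** 2 + s[-1] ** 2)

X = gradient_sylvester(A, B, C, D, E, mu)
residual = np.linalg.norm(A @ X @ B + C @ X @ D - E)
```

With this well-conditioned setup the iteration matrix has spectral radius well below 1, so the residual shrinks geometrically, matching the abstract's claim that the convergence rate is governed by that spectral radius.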

【 License 】

CC BY   

【 Preview 】
Attachment list
Files Size Format View
RO202108070004664ZK.pdf 2854 KB PDF download
Document metrics
Downloads: 2; Views: 0