Journal Article Details
International Journal of Image Processing
Comparison Between Levenberg-Marquardt And Scaled Conjugate Gradient Training Algorithms For Image Compression Using MLP
Devesh Batra
Keywords: Image Compression; Artificial Neural Network; Multilayer Perceptron; Training; Levenberg-Marquardt; Scaled Conjugate Gradient; Complexity
DOI:
Source: Computer Science Journals
【 Abstract 】

The Internet paved the way for information sharing all over the world decades ago, and its popularity as a means of distributing data has spread like wildfire ever since. Data in the form of images, sound, animation and video is gaining users' preference over plain text across the globe. Despite unprecedented progress in data storage, computing speed and data transmission speed, the demand for data and its growing size (due to increases in both quality and quantity) continues to outstrip the available resources. One reason for this may be how uncompressed data is compressed before being sent across the network. This paper compares the two most widely used training algorithms for multilayer perceptron (MLP) image compression: the Levenberg-Marquardt algorithm and the Scaled Conjugate Gradient algorithm. We test the performance of the two training algorithms by compressing the standard test image (Lena, or Lenna) and measuring accuracy and speed. Based on our results, we conclude that the two algorithms are comparable in both respects. However, the Levenberg-Marquardt algorithm showed slightly better accuracy (as reflected in the average training accuracy and mean squared error), whereas the Scaled Conjugate Gradient algorithm fared better in terms of speed (as reflected in the average number of training iterations) on a simple MLP structure with 2 hidden layers.
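To make the comparison concrete, below is a minimal, hypothetical Python sketch of block-based MLP image compression in the spirit of the abstract: a bottleneck network is fit to image blocks once with a Levenberg-Marquardt least-squares solver and once with a conjugate-gradient minimizer. The paper itself uses a 2-hidden-layer MLP and the Scaled Conjugate Gradient algorithm; this sketch assumes a single hidden layer, SciPy's 'lm' solver, a plain (not scaled) conjugate gradient as a stand-in, and a random array in place of the Lena image, so it only illustrates the setup, not the paper's exact procedure.

```python
# Hypothetical sketch: block-based MLP image compression, fitting the same
# bottleneck network with Levenberg-Marquardt and with conjugate gradient.
# Sizes are kept tiny so the sketch runs quickly; the paper's network differs.
import numpy as np
from scipy.optimize import least_squares, minimize

rng = np.random.default_rng(0)

def make_blocks(image, block=4):
    """Split a grayscale image into flattened block vectors scaled to [0, 1]."""
    h, w = image.shape
    blocks = [image[r:r + block, c:c + block].ravel()
              for r in range(0, h - block + 1, block)
              for c in range(0, w - block + 1, block)]
    return np.asarray(blocks, dtype=float) / 255.0

def unpack(theta, n_in, n_hid):
    """Unpack a flat parameter vector into encoder/decoder weights and biases."""
    i = 0
    W1 = theta[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = theta[i:i + n_hid]; i += n_hid
    W2 = theta[i:i + n_hid * n_in].reshape(n_hid, n_in); i += n_hid * n_in
    b2 = theta[i:i + n_in]
    return W1, b1, W2, b2

def forward(theta, X, n_hid):
    """Bottleneck MLP: hidden code = tanh(X W1 + b1), reconstruction = code W2 + b2."""
    W1, b1, W2, b2 = unpack(theta, X.shape[1], n_hid)
    return np.tanh(X @ W1 + b1) @ W2 + b2

def residuals(theta, X, n_hid):
    """Per-pixel reconstruction errors, flattened for the least-squares solver."""
    return (forward(theta, X, n_hid) - X).ravel()

def sse(theta, X, n_hid):
    """Scalar sum-of-squares objective for the conjugate-gradient minimizer."""
    r = residuals(theta, X, n_hid)
    return 0.5 * r @ r

# Toy data standing in for Lena blocks (replace with a real image load).
X = make_blocks(rng.integers(0, 256, size=(32, 32)))
n_in, n_hid = X.shape[1], 8
theta0 = rng.normal(scale=0.1, size=2 * n_in * n_hid + n_hid + n_in)

# Levenberg-Marquardt fit (finite-difference Jacobian; slow but short).
lm = least_squares(residuals, theta0, args=(X, n_hid), method='lm', max_nfev=2000)

# Plain nonlinear conjugate gradient as a stand-in for Scaled Conjugate Gradient.
cg = minimize(sse, theta0, args=(X, n_hid), method='CG', options={'maxiter': 100})

print("LM final MSE:", np.mean(residuals(lm.x, X, n_hid) ** 2))
print("CG final MSE:", np.mean(residuals(cg.x, X, n_hid) ** 2))
```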

【 License 】

Unknown   

【 Preview 】
Attachment list
File                      Size    Format
RO201912040511289ZK.pdf   183 KB  PDF