Journal Article Details
Journal of Computer Science
Adaptive Resonance Theory Training Parameters: Pretty Good Sets
Taisir M. Eldos¹, Abdulaziz S. Almazyad¹
Keywords: Adaptive resonance theory; pretty good set; artificial neural network; optimization
DOI: 10.3844/jcssp.2010.1443.1449
Subject Classification: Computer Science (General)
Source: Science Publications
【 Abstract 】

Problem statement: ART1 artificial neural networks offer good tools for text clustering, where no expert is needed if the system is well trained. However, having no output reference for the input patterns makes it hard to judge the quality of the training. Moreover, the performance depends to a great extent on a set of training parameters. Designers follow some recommendations or depend on their expertise in finding good sets, with no performance guarantees. Many methods have been proposed, from greedy methods offering quick and acceptable solutions to evolutionary algorithms offering suboptimal sets of parameters. While the evolutionary algorithms are a good choice for quality, their computational cost is large even for an offline process; after all, computing resources are not free. Approach: We introduced a method for selecting a set of parameters that yields comparable performance and robust operation at relatively low cost compared to the evolutionary methods. This method located a suitable set through repetitive partitioning of the range, carrying the best subinterval into the next iteration. Results: Tests showed that performance comparable to the computationally intensive evolutionary methods could be achieved in much less time. Conclusion: The repetitive partitioning method for finding a good set of training parameters is very cost effective and yields good performance.
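The repetitive-partitioning search described in the Approach can be sketched in a few lines: split each parameter's range into equal subintervals, score a representative point from each, keep the best subinterval, and repeat until the range is narrow enough. The following is a minimal Python sketch under stated assumptions; the coordinate-wise refinement, the parameter names (vigilance, beta), the split and iteration counts, and the toy objective are illustrative stand-ins, not details taken from the paper. A real evaluate() would train ART1 with the candidate parameters and return a clustering-quality score.

def partition_search(evaluate, bounds, splits=4, iterations=6):
    """Repeatedly partition each parameter's range, keep the best
    sub-range, and recurse on it (a sketch of the abstract's idea).

    evaluate: maps a dict of parameter values to a score (higher is better)
    bounds:   dict name -> (low, high) initial search range per parameter
    """
    current = dict(bounds)
    for _ in range(iterations):
        # Refine one parameter at a time (coordinate-wise) to avoid
        # evaluating an exponential grid of subinterval combinations.
        for name, (lo, hi) in current.items():
            width = (hi - lo) / splits
            best_score, best_lo = float("-inf"), lo
            for i in range(splits):
                sub_lo = lo + i * width
                candidate = {n: (l + h) / 2.0 for n, (l, h) in current.items()}
                candidate[name] = sub_lo + width / 2.0  # subinterval midpoint
                score = evaluate(candidate)
                if score > best_score:
                    best_score, best_lo = score, sub_lo
            current[name] = (best_lo, best_lo + width)  # keep best subinterval
    return {n: (l + h) / 2.0 for n, (l, h) in current.items()}

# Hypothetical usage with a stand-in objective; a real objective would run
# ART1 training and score the resulting clusters.
def toy_score(params):
    return -(params["vigilance"] - 0.7) ** 2 - (params["beta"] - 0.5) ** 2

best = partition_search(toy_score, {"vigilance": (0.0, 1.0), "beta": (0.0, 1.0)})
print(best)  # converges near vigilance = 0.7, beta = 0.5 for this toy objective

Each iteration shrinks every range by the split factor, so the number of evaluate() calls grows only linearly with the iteration count; this is the source of the cost advantage over evolutionary search that the abstract claims.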

【 License 】

Unknown   

【 Preview 】
Attachments
Files                    Size   Format
RO201911300796897ZK.pdf  133KB  PDF