Journal article details
SoftwareX
Symbolic DNN-Tuner: A Python and ProbLog-based system for optimizing Deep Neural Networks hyperparameters
Evelina Lamma [1]  Michele Fraccaroli [2]  Fabrizio Riguzzi [2]
[1] Corresponding author. Department of Engineering, University of Ferrara, Via Saragat 1, 44122 Ferrara, Italy
Keywords: Deep learning;    Probabilistic Logic Programming;    Hyper-parameter tuning;    Neural-symbolic integration;
DOI:
Source: DOAJ
【 Abstract 】

The application of deep learning models to increasingly complex contexts has led to a rise in the complexity of the models themselves. As a consequence, the number of hyper-parameters (HPs) to be set grows, and Hyper-Parameter Optimization (HPO) algorithms play a fundamental role in deep learning. Bayesian Optimization (BO) is the state-of-the-art HPO method for deep learning models. BO keeps track of past results and uses them to build a probabilistic model over the HPs. This work aims to improve BO applied to Deep Neural Networks (DNNs) through an analysis of the network's results on the training and validation sets. This analysis is obtained by applying symbolic tuning rules, implemented in Probabilistic Logic Programming (PLP). The resulting system, called Symbolic DNN-Tuner, logically evaluates the results obtained from the training and validation phases and, by applying symbolic tuning rules, fixes the network architecture and its HPs, leading to improved performance. In this paper, we present the general system and its implementation. We also show its graphical interface and a simple example of execution.
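To make the tuning-rule idea concrete, the following is a minimal, self-contained Python sketch of the kind of logic the abstract describes: training/validation results are mapped to symptoms (e.g. overfitting), and probabilistic rules propose corrective actions. All names, thresholds, and probability values here are hypothetical illustrations, not the actual rules or API of Symbolic DNN-Tuner.

```python
def diagnose(train_acc, val_acc, gap_threshold=0.1, low_threshold=0.7):
    """Map training/validation accuracies to symptom labels (illustrative)."""
    symptoms = []
    if train_acc - val_acc > gap_threshold:
        symptoms.append("overfitting")   # large train/val gap
    if train_acc < low_threshold:
        symptoms.append("underfitting")  # poor fit even on training data
    return symptoms

# Hypothetical "tuning rules": symptom -> (probability, action) pairs,
# mimicking probabilistic clauses of the form
#   0.7::action(add_dropout) :- problem(overfitting).
RULES = {
    "overfitting": [(0.7, "add_dropout"), (0.3, "increase_l2")],
    "underfitting": [(0.6, "add_layers"), (0.4, "raise_learning_rate")],
}

def suggest_actions(train_acc, val_acc):
    """Return candidate tuning actions, highest rule probability first."""
    candidates = []
    for symptom in diagnose(train_acc, val_acc):
        candidates.extend(RULES[symptom])
    return [action for _, action in sorted(candidates, reverse=True)]

print(suggest_actions(0.95, 0.75))  # overfitting case
# → ['add_dropout', 'increase_l2']
```

In the actual system these rules are ProbLog clauses evaluated by a PLP engine, and the selected actions modify the search space that BO explores in the next tuning iteration; the dictionary above only mimics that behaviour in plain Python.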

【 License 】

Unknown   

  Document metrics
  Downloads: 0  Views: 0