Journal Article Details
Frontiers in Robotics and AI
Statistical Relational Learning With Unconventional String Models
Herbert G. Tanner1  Michael Sebok1  Ashkan Zehfroosh1  Mai H. Vu2  Kristina Strother-Garcia2  Jeffrey Heinz3 
[1] Cooperative Robotics Lab, Department of Mechanical Engineering, University of Delaware, Newark, DE, United States; [2] Department of Linguistics and Cognitive Science, University of Delaware, Newark, DE, United States; [3] Department of Linguistics and Institute for Advanced Computational Science, Stony Brook University, Stony Brook, NY, United States
Keywords: statistical relational learning; Markov logic networks; grammatical inference; formal language theory; model theory; phonology
DOI: 10.3389/frobt.2018.00076
Source: DOAJ
【 Abstract 】

This paper shows how methods from statistical relational learning can be used to address problems in grammatical inference using model-theoretic representations of strings. These model-theoretic representations are the basis for representing formal languages logically. Conventional representations include a binary relation for order and unary relations describing mutually exclusive properties of each position in the string. This paper presents experiments on learning formal languages, and their stochastic counterparts, with unconventional models, which relax the mutual exclusivity condition. Unconventional models are motivated by domain-specific knowledge. A comparison of conventional and unconventional word models shows that, in the domains of phonology and robotic planning and control, Markov logic networks with unconventional models achieve better performance and lower runtime with smaller networks than Markov logic networks with conventional models.
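To make the contrast described in the abstract concrete, the sketch below builds both kinds of model-theoretic representation of a short string in Python. It is only an illustration of the general idea, a minimal sketch assuming a successor-based order relation; the phonological feature names and the dictionary encoding are hypothetical and are not the authors' implementation or the Markov logic network encoding used in the paper.

```python
# Sketch (not from the paper): conventional vs. unconventional word models.
# A word model has a domain of positions, a binary order relation, and
# unary relations over positions.

def conventional_model(word):
    """Conventional model: each position satisfies exactly one unary
    relation, namely the letter occupying it (mutually exclusive)."""
    domain = list(range(len(word)))
    order = {(i, i + 1) for i in domain[:-1]}   # binary successor relation
    unary = {}
    for i, symbol in enumerate(word):
        unary.setdefault(symbol, set()).add(i)
    return {"domain": domain, "order": order, "unary": unary}

# Hypothetical phonological features for illustration only; a position may
# satisfy several of them, so mutual exclusivity is relaxed.
FEATURES = {
    "b": {"consonant", "voiced", "labial"},
    "a": {"vowel", "low"},
    "d": {"consonant", "voiced", "coronal"},
}

def unconventional_model(word, features=FEATURES):
    """Unconventional model: unary relations are (possibly overlapping)
    domain-motivated properties of positions."""
    domain = list(range(len(word)))
    order = {(i, i + 1) for i in domain[:-1]}
    unary = {}
    for i, symbol in enumerate(word):
        for feature in features[symbol]:
            unary.setdefault(feature, set()).add(i)
    return {"domain": domain, "order": order, "unary": unary}

if __name__ == "__main__":
    print(conventional_model("bad")["unary"])    # {'b': {0}, 'a': {1}, 'd': {2}}
    print(unconventional_model("bad")["unary"])  # e.g. {'consonant': {0, 2}, 'voiced': {0, 2}, ...}
```

In a sketch like this, the unconventional model exposes shared structure (e.g., positions 0 and 2 both satisfying "voiced") that the mutually exclusive letter predicates hide, which is the kind of domain knowledge the paper argues lets smaller Markov logic networks perform better.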

【 License 】

Unknown   
