Journal Article Details
Volume: 317
Explainable AI tools for legal reasoning about cases: A study on the European Court of Human Rights
Article
Keywords: ARTIFICIAL-INTELLIGENCE;    ARGUMENTATION;    DIMENSIONS;    MODEL
DOI: 10.1016/j.artint.2023.103861
Source: SCIE
【 Abstract 】
In this paper we report on a significant research project undertaken to design, implement and evaluate explainable decision-support tools for deciding legal cases. We provide a model of a legal domain, Article 6 of the European Convention on Human Rights, constructed using a methodology from the field of computational models of argument. We describe how the formal model has been developed, extended and transformed into practical tools, which were then used in evaluation exercises to determine the effectiveness and usability of the tools. The underpinning AI techniques used yield a level of explanation that is firmly grounded in legal reasoning and is also digestible by the target end users, as demonstrated through our evaluation activities. The results of our experimental evaluation show that, on the first pass, our tool achieved an accuracy rate of 97% in matching the actual decisions of the cases, and the user studies conducted gave highly encouraging results with respect to usability. As such, our project demonstrates how trustworthy AI tools can be built for a real-world legal domain where the critical needs of the end users are accounted for. (c) 2023 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
【 License 】

Free   
