Dissertation details
Deep representation learning on hypersphere
Liu, Weiyang; Song, Le; Rehg, James Matthew; Chau, Duen Horng; Xie, Yao; Raj, Bhiksha
University: Georgia Institute of Technology
Department: Computational Science and Engineering
Keywords: Machine learning; Visual recognition; Deep learning; Representation learning; Neural networks; Hypersphere
Others: https://smartech.gatech.edu/bitstream/1853/63670/1/LIU-DISSERTATION-2020.pdf
United States | English
Source: SMARTech Repository
PDF
【 Abstract 】

How to efficiently learn discriminative deep features is arguably one of the core problems in deep learning, since it benefits many downstream tasks such as visual recognition, object detection, and semantic segmentation. In this dissertation, we present a unified deep representation learning framework on the hypersphere, which inherently introduces a novel hyperspherical inductive bias into deep neural networks. We show that the framework is well motivated by both empirical observations and theory. We discuss it from four distinct perspectives: (1) learning objectives on the hypersphere; (2) neural architectures on the hypersphere; (3) regularizations on the hypersphere; and (4) a hyperspherical training paradigm. From the first three perspectives, we explain how the idea of hyperspherical learning can be used to revisit and reinvent the corresponding components of deep learning. From the last perspective, we propose a general neural training framework that is heavily inspired by hyperspherical learning. We conduct comprehensive experiments on many applications and demonstrate that our deep hyperspherical learning framework yields better generalization, faster convergence, and stronger adversarial robustness than the standard deep learning framework.
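The central idea behind such hyperspherical learning is to constrain features and classifier weights to the unit hypersphere, so that learning depends on angles (cosine similarities) rather than unnormalized inner products. The sketch below is a minimal, generic cosine-similarity classification layer in PyTorch meant only to illustrate this idea; the class name, the scale parameter, and its default value are illustrative assumptions, not the dissertation's exact formulation.

    # Minimal sketch of a hyperspherical (cosine-similarity) classifier layer.
    # Both the input features and the class weight vectors are L2-normalized,
    # so the logits depend only on the angle between them.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class HypersphericalLinear(nn.Module):
        def __init__(self, in_features, out_features, scale=16.0):
            super().__init__()
            self.weight = nn.Parameter(torch.randn(out_features, in_features))
            self.scale = scale  # hypothetical rescaling factor for the cosine logits

        def forward(self, x):
            # Project features and weights onto the unit hypersphere,
            # then use (scaled) cosine similarities as the logits.
            x = F.normalize(x, dim=1)
            w = F.normalize(self.weight, dim=1)
            return self.scale * (x @ w.t())

    # Usage: replace a final nn.Linear classifier with this layer and train
    # with the usual cross-entropy loss on the cosine logits.
    logits = HypersphericalLinear(128, 10)(torch.randn(4, 128))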

【 Preview 】
File list
Files | Size | Format | View
Deep representation learning on hypersphere | 33669 KB | PDF | download
Document metrics
Downloads: 4   Views: 15