Dissertation Details
Stochastic nonlinear control: A unified framework for stability, dissipativity, and optimality
Author: Rajpurohit, Tanmay
Advisor: Haddad, Wassim M.
Committee: Vazirani, Vijay V.; Verriest, Erik I.; Theodorou, Evangelos; Prasad, J. V. R.
University:Georgia Institute of Technology
Department:Aerospace Engineering
Keywords: Stochastic dissipativity; Markov diffusion processes; Extended Kalman-Yakubovich-Popov conditions; Stochastic stability of feedback systems; Stochastic semistability; Lyapunov theory; Converse Lyapunov theorems; Stochastic finite-time stability; Partial stochastic stability; Finite-time stabilization; Partial-state stabilization; Lyapunov differential inequalities; Stochastic optimal control; Stochastic Hamilton-Jacobi-Bellman theory; Time-varying systems; Stochastic differential games; Inverse optimal control; Stochastic Hamilton-Jacobi-Isaacs equation; Polynomial cost functions; Multilinear forms
Others: https://smartech.gatech.edu/bitstream/1853/59856/1/RAJPUROHIT-DISSERTATION-2018.pdf
United States | English
Source: SMARTech Repository
PDF
【 Abstract 】

In this work, we develop connections between stochastic stability theory and stochastic optimal control. We first develop Lyapunov and converse Lyapunov theorems for stochastic semistable nonlinear dynamical systems. Semistability is the property whereby the solutions of a stochastic dynamical system almost surely converge to (not necessarily isolated) equilibrium points that are Lyapunov stable in probability and are determined by the system initial conditions. We then develop a unified framework for nonlinear analysis and optimal feedback control of nonlinear stochastic dynamical systems. Specifically, we provide a simplified and tutorial framework for stochastic optimal control that focuses on connections between stochastic Lyapunov theory and stochastic Hamilton-Jacobi-Bellman theory. In particular, we show that asymptotic stability in probability of the closed-loop nonlinear system is guaranteed by means of a Lyapunov function that is the solution to the steady-state form of the stochastic Hamilton-Jacobi-Bellman equation, thereby guaranteeing both stochastic stability and optimality. Moreover, extensions to stochastic finite-time and partial-state stability and optimal stabilization are also addressed. Finally, we extend dissipativity theory for deterministic dynamical systems to controlled Markov diffusion processes and show the utility of the general concept of dissipation for stochastic systems.
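For orientation, the following is a minimal sketch of the steady-state stochastic Hamilton-Jacobi-Bellman relation referred to above; the symbols F, D, L, V, and φ are generic placeholders and are not notation taken from the dissertation. For a controlled Itô diffusion
\[
\mathrm{d}x(t) = F(x(t),u(t))\,\mathrm{d}t + D(x(t),u(t))\,\mathrm{d}w(t)
\]
with infinite-horizon cost \( J(x_0,u(\cdot)) = \mathbb{E}\big[\int_0^\infty L(x(t),u(t))\,\mathrm{d}t\big] \), a sufficiently smooth value function V satisfies the stationary equation
\[
0 = \min_{u}\Big[\, L(x,u) + V'(x)\,F(x,u) + \tfrac{1}{2}\operatorname{tr}\!\big(D(x,u)^{\mathsf{T}} V''(x)\, D(x,u)\big) \Big],
\]
where \(V'\) denotes the (row) gradient and \(V''\) the Hessian of V. With the minimizing feedback \(u = \phi(x)\), the infinitesimal generator of the closed-loop diffusion applied to V equals \(-L(x,\phi(x)) \le 0\), so the same V acts as a Lyapunov function certifying asymptotic stability in probability, which is the stability-optimality connection the abstract describes.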

【 Preview 】
Attachments
Files | Size | Format | View
Stochastic nonlinear control: A unified framework for stability, dissipativity, and optimality | 3416 KB | PDF | download
Document Metrics
Downloads: 9    Views: 39