On the application of minimum principle for solving partially observable risk-sensitive control problems
Date: 1996
Source: Systems and Control Letters
Volume: 27
Issue: 3
Pages: 169-179
Abstract
This paper is concerned with the application of a minimum principle, derived for general nonlinear partially observable exponential-of-integral control problems, to the solution of linear-exponential-quadratic-Gaussian (LEQG) problems. This minimum principle is the stochastic analog of Pontryagin's minimum principle for deterministic systems. It consists of an information-state equation, an adjoint process governed by a stochastic partial differential equation with a terminal condition, and a Hamiltonian functional. Two methods are employed to obtain the optimal control law. The first method appeals to the well-known approach of completing the square, by first determining the optimal control law that minimizes the Hamiltonian functional. The second method provides significant insight into the relations with the Hamilton-Jacobi approach associated with completely observable exponential-of-integral control problems. These methods of solution are particularly attractive because they do not assume a certainty equivalence principle; hence they can be used to solve nonlinear problems as well.
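For orientation, the exponential-of-integral (risk-sensitive) cost underlying the LEQG problem can be sketched as follows. This is the standard formulation of such problems, not an excerpt from the paper; the weight matrices \(Q\), \(R\), \(M\) and the risk-sensitivity parameter \(\theta\) are illustrative names:

```latex
% Partially observed linear dynamics driven by Gaussian noise:
%   dx_t = (A x_t + B u_t)\,dt + G\,dw_t, \qquad dy_t = C x_t\,dt + dv_t
% Exponential-of-integral cost to be minimized over controls adapted
% to the observation filtration generated by y, with \theta > 0:
J(u) = \mathbb{E}\!\left[
  \exp\!\Big( \tfrac{\theta}{2} \int_0^T
    \big( x_t^{\top} Q\, x_t + u_t^{\top} R\, u_t \big)\,dt
  \;+\; \tfrac{\theta}{2}\, x_T^{\top} M\, x_T \Big) \right]
```

Because the exponential couples the running cost multiplicatively over time, the optimal control generally depends on the full conditional distribution of the state (the information state) rather than on the conditional mean alone, which is why certainty equivalence cannot be assumed; in the small-risk limit \(\theta \to 0\), the cost reduces to the familiar quadratic LQG criterion.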