Stochastic Controls - Hamiltonian Systems and HJB Equations (Paperback, Softcover reprint of the original 1st ed. 1999)


R5,387

Free Delivery
Delivery Advice: Ships in 10 - 15 working days



Product Description

As is well known, Pontryagin's maximum principle and Bellman's dynamic programming are the two principal and most commonly used approaches in solving stochastic optimal control problems. An interesting phenomenon one can observe from the literature is that these two approaches have been developed separately and independently. Since both methods are used to investigate the same problems, a natural question one will ask is the following: (Q) What is the relationship between the maximum principle and dynamic programming in stochastic optimal controls? There did exist some research (prior to the 1980s) on the relationship between these two. Nevertheless, the results were usually stated in heuristic terms and proved under rather restrictive assumptions, which were not satisfied in most cases. In the statement of a Pontryagin-type maximum principle there is an adjoint equation, which is an ordinary differential equation (ODE) in the (finite-dimensional) deterministic case and a stochastic differential equation (SDE) in the stochastic case. The system consisting of the adjoint equation, the original state equation, and the maximum condition is referred to as an (extended) Hamiltonian system. On the other hand, in Bellman's dynamic programming there is a partial differential equation (PDE), of first order in the (finite-dimensional) deterministic case and of second order in the stochastic case. This is known as a Hamilton-Jacobi-Bellman (HJB) equation.
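To make the objects named in the description concrete, the following is a sketch of the standard stochastic formulation; the controlled drift b, diffusion sigma, running cost f, and the control set U are generic placeholders (not taken from this page), and sign conventions vary from text to text:

\begin{align*}
\text{State equation:}\quad & dx(t) = b\big(t,x(t),u(t)\big)\,dt + \sigma\big(t,x(t),u(t)\big)\,dW(t),\\
\text{Hamiltonian:}\quad & H(t,x,u,p,q) = \langle p,\, b(t,x,u)\rangle + \operatorname{tr}\!\left[q^{\top}\sigma(t,x,u)\right] - f(t,x,u),\\
\text{Adjoint SDE:}\quad & dp(t) = -H_x\big(t,x(t),u(t),p(t),q(t)\big)\,dt + q(t)\,dW(t),\\
\text{Maximum condition:}\quad & H\big(t,\bar{x}(t),\bar{u}(t),p(t),q(t)\big) = \max_{u\in U} H\big(t,\bar{x}(t),u,p(t),q(t)\big),\\
\text{HJB equation:}\quad & -V_t(t,x) = \inf_{u\in U}\Big\{\tfrac{1}{2}\operatorname{tr}\!\left[\sigma\sigma^{\top}(t,x,u)\,V_{xx}(t,x)\right] + \langle b(t,x,u),\, V_x(t,x)\rangle + f(t,x,u)\Big\}.
\end{align*}

The state equation, adjoint SDE, and maximum condition together constitute the (extended) Hamiltonian system; setting sigma equal to zero removes the second-order trace term and the process q, recovering the first-order HJB equation and the ODE adjoint equation of the deterministic case mentioned above.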

Customer Reviews

No reviews or ratings yet.

Product Details

General

Imprint: Springer-Verlag New York
Country of origin: United States
Series: Stochastic Modelling and Applied Probability, 43
Release date: September 2012
Availability: Expected to ship within 10 - 15 working days
First published: 1999
Authors: Jiongmin Yong, Xun Yu Zhou
Dimensions: 235 x 155 x 24mm (L x W x T)
Format: Paperback
Pages: 439
Edition: Softcover reprint of the original 1st ed. 1999
ISBN-13: 978-1-4612-7154-3
Barcode: 9781461271543
LSN: 1-4612-7154-1


