Document Type

Article

Abstract

Some estimates for the approximation of optimal stochastic control problems by discrete-time problems are obtained. In particular, an estimate for the solutions of the continuous-time versus the discrete-time Hamilton-Jacobi-Bellman equations is given. The technique used is more analytic than probabilistic.
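
For orientation, a minimal sketch of the kind of comparison the abstract describes, written in the standard infinite-horizon discounted setting; the discount rate \(\lambda\), control set \(A\), coefficients \(b\) and \(a = \sigma\sigma^{\top}\), running cost \(f\), one-step chain \(X_h^{x,\alpha}\), and exponent \(\gamma\) below are generic placeholders, not the paper's specific formulation. The continuous-time value function \(u\) solves the Hamilton-Jacobi-Bellman equation

\[
\lambda u(x) - \inf_{\alpha \in A}\Big\{ \tfrac{1}{2}\,\mathrm{tr}\big(a(x,\alpha)\,D^{2}u(x)\big) + b(x,\alpha)\cdot Du(x) + f(x,\alpha) \Big\} = 0,
\]

while the discrete-time value function \(u_h\) with time step \(h\) solves the dynamic programming equation

\[
u_h(x) = \inf_{\alpha \in A}\Big\{ (1-\lambda h)\,\mathbb{E}\big[u_h(X_h^{x,\alpha})\big] + h\,f(x,\alpha) \Big\}.
\]

An estimate of the type announced bounds the gap between the two solutions, typically in the form \(\|u - u_h\|_{\infty} \le C\,h^{\gamma}\) for some \(\gamma > 0\) depending on the regularity assumptions.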

Disciplines

Numerical Analysis and Computation | Probability

Comments

Copyright © 1989 Society for Industrial and Applied Mathematics.
