Document Type
Article
Abstract
Some estimates for the approximation of optimal stochastic control problems by discrete-time problems are obtained. In particular, an estimate for the solutions of the continuous-time versus the discrete-time Hamilton-Jacobi-Bellman equations is given. The technique used is more analytic than probabilistic.
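To give a concrete sense of the kind of discrete-time approximation the abstract refers to, the following is a minimal illustrative sketch (not the scheme analyzed in the paper, and every parameter is hypothetical): a one-dimensional discounted control problem dx = a dt + σ dW on [0, 1] with reflecting ends, discretized by a monotone upwind finite-difference scheme and iterated to the fixed point of the discrete-time dynamic-programming operator, whose limit solves the discrete HJB equation λv = min_a [a v' + (σ²/2) v'' + f(x, a)].

```python
import numpy as np

# Illustrative sketch only -- not the scheme or estimates from the paper.
# Approximates a 1D discounted control problem dx = a dt + sigma dW on
# [0, 1] (reflected at the ends) with running cost f(x, a); all
# parameter choices below are hypothetical.

def solve_hjb(n=51, sigma=0.5, lam=2.0, actions=(-1.0, 0.0, 1.0),
              tol=1e-6, max_iter=20000):
    h = 1.0 / (n - 1)
    x = np.linspace(0.0, 1.0, n)
    v = np.zeros(n)
    for _ in range(max_iter):
        vp = np.r_[v[1:], v[-1]]   # v[i+1], reflecting right boundary
        vm = np.r_[v[0], v[:-1]]   # v[i-1], reflecting left boundary
        d2v = (vp - 2.0 * v + vm) / h**2
        candidates = []
        for a in actions:
            # Upwind first difference keeps the discrete operator monotone.
            dv = (vp - v) / h if a >= 0 else (v - vm) / h
            cost = (x - 0.5) ** 2 + 0.5 * a * a   # running cost f(x, a)
            # Largest explicit step allowed by the monotonicity (CFL) bound.
            dt = h**2 / (sigma**2 + abs(a) * h + lam * h**2)
            # One relaxation step toward lam*v = a v' + (sigma^2/2) v'' + f.
            candidates.append(v + dt * (a * dv + 0.5 * sigma**2 * d2v
                                        + cost - lam * v))
        v_new = np.min(candidates, axis=0)
        if np.max(np.abs(v_new - v)) < tol:
            return x, v_new
        v = v_new
    return x, v

x, v = solve_hjb()
```

The fixed point of this iteration satisfies the discrete HJB equation on the grid; refining the grid (larger `n`) is the kind of limit whose rate of convergence results like those in the article quantify.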
Disciplines
Numerical Analysis and Computation | Probability
Recommended Citation
J.-L. Menaldi, Some estimates for finite difference approximations, SIAM J. Control Optim., 27 (1989), pp. 579-607. doi: 10.1137/0327031
Comments
Copyright © 1989 Society for Industrial and Applied Mathematics.