Some estimates for the approximation of optimal stochastic control problems by discrete-time problems are obtained. In particular, an error estimate comparing the solutions of the continuous-time and discrete-time Hamilton-Jacobi-Bellman equations is given. The technique used is more analytic than probabilistic.
Numerical Analysis and Computation | Probability
J.-L. Menaldi, Some estimates for finite difference approximations, SIAM J. Control Optim., 27 (1989), pp. 579-607. doi: 10.1137/0327031
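To illustrate the kind of discrete-time approximation the abstract refers to, the sketch below implements a standard Kushner-type Markov chain approximation for a simple one-dimensional stationary HJB equation. All specifics here (the dynamics dx = a dt + sigma dW, the running cost x^2 + 0.1 a^2, the control set, the grid, and the parameter values) are illustrative assumptions, not taken from the paper; the scheme shown is the generic upwind finite-difference / value-iteration construction, not necessarily the one analyzed in the article.

```python
import numpy as np

# Hedged sketch (assumed model, not from the paper): approximate
#   rho*v = min_a { a v' + (sigma^2/2) v'' + x^2 + 0.1 a^2 },  x in [-1, 1],
# by a controlled Markov chain on a grid (Kushner-style upwind scheme),
# then solve the discrete-time dynamic programming equation by value iteration.
rho, sigma = 1.0, 0.5
N = 101
x = np.linspace(-1.0, 1.0, N)
h = x[1] - x[0]
controls = np.array([-1.0, 0.0, 1.0])
amax = np.max(np.abs(controls))
# Time step chosen so all transition probabilities are nonnegative
# (CFL-type condition: dt * (sigma^2 + h*|a|) <= h^2).
dt = h * h / (sigma**2 + h * amax)
beta = np.exp(-rho * dt)  # one-step discount factor

v = np.zeros(N)
for it in range(20000):
    v_new = np.full(N, np.inf)
    for a in controls:
        # Upwind transition probabilities of the approximating chain.
        p_up = (0.5 * sigma**2 * dt + h * dt * max(a, 0.0)) / h**2
        p_dn = (0.5 * sigma**2 * dt + h * dt * max(-a, 0.0)) / h**2
        p_stay = 1.0 - p_up - p_dn
        # Reflecting boundary: clamp the shifted values at the edges.
        vu = np.concatenate([v[1:], v[-1:]])
        vd = np.concatenate([v[:1], v[:-1]])
        q = (x**2 + 0.1 * a**2) * dt + beta * (p_up * vu + p_dn * vd + p_stay * v)
        v_new = np.minimum(v_new, q)  # minimize over controls pointwise
    if np.max(np.abs(v_new - v)) < 1e-9:
        v = v_new
        break
    v = v_new
```

As the mesh size h tends to zero, value functions produced by schemes of this type converge to the viscosity solution of the continuous-time HJB equation; the cited article is concerned with quantifying the rate of that convergence.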