We consider the solution of a stochastic integral control problem, and we study its regularity. In particular, we characterize the optimal cost as the maximum solution of

∀v ∈ V:  A(v)u ≤ f(v) in D′(O),  u = 0 on ∂O,  u ∈ W^{1,∞}(O),

where A(v) is a uniformly elliptic second-order operator and V is the set of values of the control.
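Formally, this maximum-subsolution characterization corresponds to the Hamilton-Jacobi-Bellman equation of dynamic programming. The display below is a sketch of the standard formulation only; the precise function spaces and assumptions are those of the paper:

```latex
% u is the maximum element of the set of subsolutions: for every
% admissible control value v in V, A(v)u \le f(v) holds in D'(O).
% Taking the supremum over v, u formally satisfies the HJB equation
\sup_{v \in V} \bigl[ A(v)u - f(v) \bigr] = 0 \quad \text{in } O,
\qquad u = 0 \quad \text{on } \partial O,
\qquad u \in W^{1,\infty}(O).
```

Each inequality A(v)u ≤ f(v) is obtained by fixing the control at the constant value v, and the optimal cost is the largest function satisfying all of them simultaneously.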
P.-L. Lions and J.-L. Menaldi, Optimal control of stochastic integrals and Hamilton-Jacobi-Bellman equations, II, SIAM J. Control Optim., 20 (1982), pp. 82-95. doi: 10.1137/0320007