Access Type

Open Access Dissertation

Date of Award

January 2015

Degree Type


Degree Name




First Advisor

Zhimin Zhang

Abstract
Recovery techniques are important post-processing methods for obtaining improved approximate solutions from primary data at reasonable cost. The practical usage of recovery techniques is not only to improve the quality of the approximation, but also to provide asymptotically exact a posteriori error estimators for adaptive methods. This dissertation presents recovery techniques for nonconforming finite element methods and high order derivatives, as well as applications of gradient recovery.

Our first target is to develop a systematic gradient recovery technique for the Crouzeix-Raviart element. The proposed method uses the finite element solution to build a better approximation of the exact gradient based on local least squares fittings. Due to the polynomial preserving property of least squares fitting, it is easy to show that the newly proposed method preserves quadratic polynomials. In addition, the proposed gradient recovery operator is linearly bounded. Numerical tests indicate that the recovered gradient is superconvergent to the exact gradient for both second order elliptic equations and the Stokes equation. The gradient recovery technique can be used in a posteriori error estimation for the Crouzeix-Raviart element, and the resulting estimator is relatively simple to implement and problem independent.
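As a rough illustration of the local least squares fitting idea (a minimal sketch, not the dissertation's actual recovery operator; the helper name `recover_gradient` and the sample patch are assumptions), one can fit a quadratic polynomial to values sampled on a local patch and differentiate the fit at the patch center. Because the fit reproduces quadratics exactly, the recovered gradient is exact for quadratic data:

```python
import numpy as np

def recover_gradient(points, values, center):
    """Least-squares quadratic fit on a local patch; gradient of the fit at `center`.

    Fits p(x, y) = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2
    (coordinates shifted so `center` is the origin), then returns
    grad p(0, 0) = (c1, c2).
    """
    x = points[:, 0] - center[0]
    y = points[:, 1] - center[1]
    # Vandermonde-type matrix for the quadratic basis on the shifted patch
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    c, *_ = np.linalg.lstsq(A, values, rcond=None)
    return np.array([c[1], c[2]])

# For u(x, y) = x^2 + 3xy the exact gradient at (1, 1) is (5, 3),
# and the quadratic fit recovers it exactly.
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 0], [0, 2], [2, 1]], dtype=float)
vals = pts[:, 0]**2 + 3 * pts[:, 0] * pts[:, 1]
grad = recover_gradient(pts, vals, np.array([1.0, 1.0]))
```

In practice the patch would consist of nodal values (or edge averages, for Crouzeix-Raviart data) drawn from elements surrounding the recovery point, and the patch must contain enough well-distributed points for the least squares system to have full rank.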

Our second target is to propose and analyze a new effective Hessian recovery for continuous finite elements of arbitrary order. The proposed Hessian recovery is based on polynomial preserving recovery. The proposed method preserves polynomials of degree (k + 1) on general unstructured meshes and polynomials of degree (k + 2) on translation invariant meshes. Based on its polynomial preserving property, we are able to prove superconvergence of the proposed method on mildly structured meshes. In addition, we establish an ultraconvergence result for the new Hessian recovery technique on translation invariant finite element spaces of arbitrary order.
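The lowest-order instance of a polynomial-preserving Hessian recovery can be sketched in the same spirit (again a hypothetical helper, `recover_hessian`, not the dissertation's operator, which handles arbitrary order k): fit a quadratic on a local patch by least squares and read off the constant Hessian of the fit. By construction this preserves quadratics, the degree-(k + 1) case for linear (k = 1) elements:

```python
import numpy as np

def recover_hessian(points, values, center):
    """Least-squares quadratic fit on a local patch; Hessian of the fit.

    Fits p(x, y) = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2
    around `center`; the Hessian of p is the constant matrix
    [[2*c3, c4], [c4, 2*c5]].
    """
    x = points[:, 0] - center[0]
    y = points[:, 1] - center[1]
    A = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    c, *_ = np.linalg.lstsq(A, values, rcond=None)
    return np.array([[2 * c[3], c[4]], [c[4], 2 * c[5]]])

# For u(x, y) = x^2 + 3xy + 2y^2 the exact Hessian is [[2, 3], [3, 4]],
# recovered exactly since u is quadratic.
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1], [2, 0], [0, 2], [2, 1]], dtype=float)
vals = pts[:, 0]**2 + 3 * pts[:, 0] * pts[:, 1] + 2 * pts[:, 1]**2
H = recover_hessian(pts, vals, np.array([1.0, 1.0]))
```

Higher-order versions fit polynomials of degree k + 1 (or k + 2 on translation invariant meshes) and differentiate the fit twice at the recovery point instead of returning a constant matrix.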

Our third target is to demonstrate the application of gradient recovery in eigenvalue computation. We propose two superconvergent two-grid methods for elliptic eigenvalue problems by taking advantage of the two-grid method, the two-space method, the shifted-inverse power method, and gradient recovery enhancement. Theoretical and numerical results reveal that the proposed methods provide superconvergent eigenfunction approximations and ultraconvergent eigenvalue approximations. In addition, two multilevel adaptive methods based on recovery type a posteriori error estimates are proposed.
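One building block named above, shifted-inverse power iteration, can be sketched for a generalized matrix eigenproblem A x = lambda B x (a dense toy version with assumed names; the two-grid schemes apply this on the fine grid with a shift taken from a coarse-grid eigenvalue):

```python
import numpy as np

def shifted_inverse_power(A, B, shift, x0, iters=20):
    """Shifted-inverse power iteration for A x = lambda B x.

    Each step solves (A - shift*B) y = B x and normalizes; the iteration
    converges to the eigenpair whose eigenvalue is closest to `shift`.
    Returns the Rayleigh-quotient eigenvalue estimate and the eigenvector.
    """
    x = x0 / np.linalg.norm(x0)
    for _ in range(iters):
        y = np.linalg.solve(A - shift * B, B @ x)
        x = y / np.linalg.norm(y)
    lam = (x @ (A @ x)) / (x @ (B @ x))  # Rayleigh quotient
    return lam, x

# Toy generalized problem with eigenvalues 2, 5, 9; a shift of 1.9
# drives the iteration to the eigenvalue nearest the shift.
A = np.diag([2.0, 5.0, 9.0])
B = np.eye(3)
lam, x = shifted_inverse_power(A, B, shift=1.9, x0=np.ones(3))
```

In the finite element setting A and B would be the (sparse) stiffness and mass matrices, the linear solve would use a sparse factorization, and the coarse-grid eigenvalue supplies the shift, so only one or two fine-grid solves are needed.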

Included in

Mathematics Commons