Theory of impurity-induced infrared absorption in cubic crystals

S. S. Jaswal, University of Nebraska
J. M. Wadehra, University of Nebraska

Abstract

A method for calculating the infrared absorption due to a very low concentration of defects in a diatomic cubic crystal is developed directly from the basic quantum-mechanical absorption equation, for an impurity that changes both the mass and the short-range force constants. It is shown that the absorption arises from the modes of T1u symmetry about the defect and is proportional to the square of the projection of the ionic amplitudes in the defect space, as determined by the perturbation, onto the transverse-optic modes at the zone center. A procedure for calculating the amplitudes of the ions in the defect space for a given mode is outlined. The present method gives more physical insight into the problem than most of the Green's-function formalisms used in the field.
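As a reader's aid, the stated proportionality can be written schematically as below; the notation is an illustrative assumption (alpha for the absorption coefficient, u_f for the defect-space amplitudes of the perturbed mode f, chi_TO for the zone-center transverse-optic eigenvector, omega_f for the mode frequency) and is not the paper's own.

% Schematic sketch of the proportionality stated in the abstract;
% all symbols are assumed for illustration, not taken from the paper.
\[
  \alpha(\omega) \;\propto\; \sum_{f \,\in\, T_{1u}}
  \bigl|\langle \chi_{\mathrm{TO}} \mid u_f \rangle\bigr|^{2}\,
  \delta(\omega - \omega_{f})
\]

Only the T1u modes contribute to the sum, and in practice the delta function would be replaced by a broadened line shape.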