2022 journal article

Newton's Method in Mixed Precision

SIAM REVIEW, 64(1), 191–211.

By: C. Kelley

author keywords: Newton's method; mixed precision arithmetic; backward error; probabilistic rounding analysis
TL;DR: The important ideas in the paper are O notation, floating point precision, backward error in linear solvers, and Newton's method. (via Semantic Scholar)
Source: ORCID
Added: February 3, 2022

We investigate the use of reduced precision arithmetic to solve the linear equation for the Newton step. If one neglects the backward error in the linear solve, then well-known convergence theory implies that using single precision in the linear solve has very little negative effect on the nonlinear convergence rate. However, if one considers the effects of backward error, then the usual textbook estimates are very pessimistic and even the state-of-the-art estimates using probabilistic rounding analysis do not fully conform to experiments. We report on experiments with a specific example. We store and factor Jacobians in double, single, and half precision. In the single precision case we observe that the convergence rates for the nonlinear iteration do not degrade as the dimension increases and that the nonlinear iteration statistics are essentially identical to the double precision computation. In half precision we see that the nonlinear convergence rates, while poor, do not degrade as the dimension increases.
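As an illustration of the idea in the abstract, here is a minimal Python sketch (not the author's code; the test problem, function names, and tolerances are placeholders) of a mixed precision Newton iteration: the residual and the iterate are kept in double precision, while the Jacobian is stored, factored, and used to solve for the Newton step in single precision.

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

def F(x):
    # Illustrative diagonal nonlinear system F(x) = x - cos(x); not the paper's example.
    return x - np.cos(x)

def J(x):
    # Analytic Jacobian of F (diagonal: 1 + sin(x_i)).
    return np.eye(x.size) + np.diag(np.sin(x))

def newton_mixed(x0, low=np.float32, tol=1e-10, maxit=20):
    # Residual and iterate in double precision; Jacobian storage, LU factorization,
    # and step solve in the reduced precision `low`. float16 is not shown because
    # LAPACK (via SciPy) has no half-precision factorization; that case would need
    # a custom or simulated half-precision solver.
    x = np.asarray(x0, dtype=np.float64)
    for k in range(maxit):
        f = F(x)                                  # residual in double precision
        if np.linalg.norm(f) <= tol:
            break
        Jl = J(x).astype(low)                     # store Jacobian in reduced precision
        lu, piv = lu_factor(Jl)                   # factor in reduced precision
        s = lu_solve((lu, piv), f.astype(low))    # Newton step in reduced precision
        x = x - s.astype(np.float64)              # update iterate in double precision
    return x, k

x, its = newton_mixed(np.full(8, 1.0))
print(f"iterations: {its}, final residual norm: {np.linalg.norm(F(x)):.2e}")
```

In this sketch only the linear algebra for the step is demoted to single precision, which is the setting in which the paper reports nonlinear convergence essentially identical to the all-double computation.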