Works (4)

Updated: July 5th, 2023 15:57

2018 article

Iterative Solution of Sparse Linear Least Squares using LU Factorization

PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING IN ASIA-PACIFIC REGION (HPC ASIA 2018), pp. 47–53.

By: G. Howell & M. Baboulin

TL;DR: This paper explores a further preconditioning by inv(L1), where L1 is the n × n upper part of the lower trapezoidal m × n factor L, and examines whether the iteration is effective and whether further preconditioning is required. (via Semantic Scholar)
Source: Web Of Science
Added: January 21, 2019
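The idea in this paper can be sketched as follows. This is an illustrative dense-matrix version (the paper targets sparse factorizations), all variable names are ours, and the setup is an assumption, not the authors' code: after A = P L U, iterate LSQR with M = L inv(L1), whose leading n × n block is the identity, then undo both preconditioners.

```python
import numpy as np
from scipy.linalg import lu, solve_triangular
from scipy.sparse.linalg import LinearOperator, lsqr

rng = np.random.default_rng(0)
m, n = 200, 40
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# A = P @ L @ U with L lower trapezoidal (m x n, unit diagonal)
# and U upper triangular (n x n).
P, L, U = lu(A)
L1 = L[:n, :n]          # leading n x n block of L
c = P.T @ b

# Iterate with M = L @ inv(L1); its top n x n block is the identity,
# so M tends to be much better conditioned than A itself.
M = LinearOperator(
    (m, n),
    matvec=lambda v: L @ solve_triangular(L1, v, lower=True),
    rmatvec=lambda w: solve_triangular(L1, L.T @ w, lower=True, trans='T'),
)
z = lsqr(M, c, atol=1e-12, btol=1e-12)[0]

# Undo both preconditioners: y = inv(L1) z, then x = inv(U) y.
y = solve_triangular(L1, z, lower=True)
x = solve_triangular(U, y, lower=False)
```

Here min ||A x - b|| = min ||L y - P^T b|| with y = U x, and the extra change of variables z = L1 y makes the iteration matrix's leading block exactly the identity.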

2016 article

LU Preconditioning for Overdetermined Sparse Least Squares Problems

PARALLEL PROCESSING AND APPLIED MATHEMATICS, PPAM 2015, PT I, Vol. 9573, pp. 128–137.

By: G. Howell & M. Baboulin

author keywords: Sparse linear least squares; Iterative methods; Preconditioning; Conjugate gradient algorithm; lsqr algorithm
TL;DR: This work investigates how to use an LU factorization with the classical lsqr routine for solving overdetermined sparse least squares problems: usually L is much better conditioned than A, so iterating with L instead of A yields faster convergence. (via Semantic Scholar)
Source: Web Of Science
Added: August 6, 2018
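A minimal sketch of the approach described above, assuming a dense test matrix for simplicity (the paper's setting is sparse, and the variable names here are ours): factor A = P L U, run LSQR on the better-conditioned trapezoidal factor L, then back-solve with U.

```python
import numpy as np
from scipy.linalg import lu, solve_triangular
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(1)
m, n = 300, 50
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

# A = P @ L @ U; L is m x n lower trapezoidal with unit diagonal,
# and is typically much better conditioned than A.
P, L, U = lu(A)

# min ||A x - b|| equals min ||L y - P^T b|| with y = U x,
# since the permutation P preserves the 2-norm of the residual.
y = lsqr(L, P.T @ b, atol=1e-12, btol=1e-12)[0]
x = solve_triangular(U, y, lower=False)
```

Because LSQR's convergence rate depends on the condition number of the iteration matrix, swapping A for L can cut the iteration count substantially when A is ill conditioned.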

2008 journal article

Cache efficient bidiagonalization using BLAS 2.5 operators

ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 34(3).

By: G. Howell, J. Demmel, C. Fulton, S. Hammarling & K. Marmol

author keywords: algorithms; performance; BLAS 2.5; bidiagonalization; cache-efficient; Householder reflections; matrix factorization; singular values; SVD
TL;DR: This paper reorganizes the sequence of operations for Householder bidiagonalization of a general m × n matrix, so that two vector-matrix (_GEMV) multiplications can be done with one pass of the unreduced trailing part of the matrix through cache. (via Semantic Scholar)
Source: Web Of Science
Added: August 6, 2018
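For context, a textbook Householder (Golub-Kahan) bidiagonalization looks like the sketch below, with one matrix-vector product per reflector application; the paper's contribution is reorganizing this so that pairs of _GEMV operations share a single pass of the trailing matrix through cache, which this simple version does not attempt. Function names are ours.

```python
import numpy as np

def householder(x):
    """Return v, beta so that (I - beta v v^T) x has zeros below entry 0."""
    v = x.copy()
    sigma = np.linalg.norm(x)
    if sigma == 0.0:
        return v, 0.0
    sign = 1.0 if x[0] >= 0 else -1.0
    v[0] += sign * sigma        # avoid cancellation in v[0]
    beta = 2.0 / (v @ v)
    return v, beta

def bidiagonalize(A):
    """Reduce m x n A (m >= n) to upper bidiagonal form by two-sided reflectors."""
    B = A.astype(float)
    m, n = B.shape
    for j in range(n):
        # Left reflector zeros B[j+1:, j].
        v, beta = householder(B[j:, j])
        B[j:, j:] -= beta * np.outer(v, v @ B[j:, j:])
        if j < n - 2:
            # Right reflector zeros B[j, j+2:].
            v, beta = householder(B[j, j+1:])
            B[j:, j+1:] -= beta * np.outer(B[j:, j+1:] @ v, v)
    return B

rng = np.random.default_rng(2)
A0 = rng.standard_normal((6, 4))
B = bidiagonalize(A0)
```

Since B = U^T A V for orthogonal U and V, B has the same singular values as A, which is what makes bidiagonalization the standard first phase of the SVD.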

2005 journal article

Algorithm 841: BHESS: Gaussian reduction to a similar banded Hessenberg form

ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 31(1), 166–185.

By: G. Howell & N. Diaa

author keywords: algorithms; Gaussian similarity transformations; matrix eigenvalues; spectra; Hessenberg form; cache-efficient; Sylvester equation
TL;DR: BHESS uses Gaussian similarity transformations to reduce a general real square matrix to a similar upper Hessenberg form, determining a complete spectrum in about one-fifth the time required for orthogonal reduction to Hessenberg form followed by QR iterations. (via Semantic Scholar)
Source: Web Of Science
Added: August 6, 2018
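To illustrate the core idea, the sketch below reduces a matrix to upper Hessenberg form by pivoted Gaussian similarity transformations, each eliminating a column below the subdiagonal and applying the matching inverse on the columns so the spectrum is preserved. This is a simplification of our own: BHESS itself produces a *banded* Hessenberg form with controlled multiplier growth, which this plain version does not do.

```python
import numpy as np

def gauss_hessenberg(A):
    """Reduce square A to similar upper Hessenberg form via Gauss transforms."""
    H = A.astype(float)
    n = H.shape[0]
    for k in range(n - 2):
        # Pivot: bring the largest entry of H[k+1:, k] to the subdiagonal
        # (a symmetric row/column swap is itself a similarity transform).
        p = k + 1 + np.argmax(np.abs(H[k + 1:, k]))
        if p != k + 1:
            H[[k + 1, p], :] = H[[p, k + 1], :]
            H[:, [k + 1, p]] = H[:, [p, k + 1]]
        if H[k + 1, k] == 0.0:
            continue
        mults = H[k + 2:, k] / H[k + 1, k]
        # M A: eliminate below the subdiagonal in column k.
        H[k + 2:, :] -= np.outer(mults, H[k + 1, :])
        # (M A) M^{-1}: the matching column update restores similarity.
        H[:, k + 1] += H[:, k + 2:] @ mults
    return H

rng = np.random.default_rng(3)
A0 = rng.standard_normal((8, 8))
H = gauss_hessenberg(A0)
```

Because each step is a similarity transform, eigvals(H) equals eigvals(A0) up to rounding; the non-orthogonal transforms are cheaper than Householder reflections, which is the source of the speedup claimed above.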

Citation Index includes data from a number of different sources. If you have questions about the sources of data in the Citation Index or need a set of data which is free to re-distribute, please contact us.

Certain data included herein are derived from the Web of Science© and InCites© (2024) of Clarivate Analytics. All rights reserved. You may not copy or re-distribute this material in whole or in part without the prior written consent of Clarivate Analytics.