Dear Participants and Speakers in the CBMS Short Course on Parallel Numerical Linear Algebra:
In preparation for this course, Peter Pacheco, the local organizer, sent a survey to all the participants. We asked detailed questions about many topics, asking both about prior knowledge (from 1 = "a lot" to 4 = "none") and interest (from 1 = "a lot" to 5 = "not sure"), as well as about their expectations. Below we enclose two documents:
1) The survey form
2) Summarized results
Thanks to Peter Pacheco for the hard work in summarizing the responses!
The purpose of this survey is to help me prepare the lectures to better match your backgrounds. I expect students of possibly widely varying backgrounds, such as mathematics, computer science, engineering, physics, etc. The course material is quite interdisciplinary, touching on all these fields, so it is likely students will know a great deal about some parts of the course and less about others. The more I know about your backgrounds the better I can tune the lectures.
Thanks,

Jim Demmel
UC Berkeley

---------------------------------------------------------------------------

Name:
Institution:
Mailing address:
Status: (Ex: Faculty in dept X, 3rd year grad student in dept Y, etc.)
Phone:
Email:
FAX:

Relevant computing background (machines, languages used):

Relevant mathematics background (numerical analysis, engineering, modeling, physics, etc.):

Briefly describe your most ambitious (or parallel) programming project.

Why do you want to take this class?

Do you have a particular problem/application you'd like to parallelize?

Please fill in the following table, indicating your familiarity with the listed topics, as well as your interest in learning more about them. We will use or cover some or all of these topics during the class, depending on class background and interest. Feel free to add short comments if my categories are too restrictive. In particular, if there is a missing topic you definitely want to hear about, please say so.

Under Familiarity, put a number from 1 to 4 indicating:
    Quite familiar    - 1
    Somewhat familiar - 2
    Know what it is   - 3
    Unfamiliar        - 4

Under Desire to know more, put a number from 1 to 5 indicating:
    Definitely want to know more          - 1
    Interesting, but not highest priority - 2
    Ok if you have time                   - 3
    Not interesting                       - 4
    Not sure                              - 5

                                            Familiarity   Desire to know more
-----------------------------------------------------------------------------
UNIX
Use of X-Windows
Mosaic or Netscape
-----------------------------------------------------------------------------
Computer block diagram
Pipelining
Vectorization
Memory Hierarchy
Cache
Floating point arithmetic
Race Condition
-----------------------------------------------------------------------------
Graph algorithms, like DFS
    parallel versions
Sorting algorithms
    parallel versions
Parallel Prefix
-----------------------------------------------------------------------------
Fortran
C
Matlab
CM-Fortran or other data parallel language
PVM or other message passing system
Split-C
Other parallel languages (which ones?)
-----------------------------------------------------------------------------
CM-2
CM-5
Intel Paragon
Cray C90 (or XMP or YMP)
Cray T3D
SGI, Sun or other parallel shared memory machines
Other parallel machine (which ones?)
-----------------------------------------------------------------------------
Abstract Models of parallel machines
    PRAM
    LogP
    others
-----------------------------------------------------------------------------
Linpack
Eispack
BLAS
LAPACK
ScaLAPACK
Netlib
Parallelization tools (LPARx, PETSc, etc.; which ones?)
-----------------------------------------------------------------------------
Numerical Stability
Matrix Multiplication
    Blocking for the memory hierarchy
    Strassen's method
-----------------------------------------------------------------------------
Gaussian Elimination
    partial pivoting
    Cholesky
    Gauss. Elim. for band matrices
    Gauss. Elim. for sparse matrices
        elimination tree
        supernodal algorithms
        (multi)frontal algorithms
    Parallel algorithms for any of the above (which ones?)
-----------------------------------------------------------------------------
Linear least squares problems
    QR decomposition
    Householder transformations
    Givens transformations
    normal equations
    Gram-Schmidt process
    Modified Gram-Schmidt
-----------------------------------------------------------------------------
Iterative Methods for Ax=b
    Jacobi
    Successive Overrelaxation
    Krylov subspace methods
        Conjugate Gradient Method
        GMRES
        Other (which ones?)
    Preconditioning
    Parallel algorithms for any of the above (which ones?)
-----------------------------------------------------------------------------
Eigenvalues and Eigenvectors
    Of symmetric matrices
        Rayleigh quotient
        Tridiagonal reduction
        QR iteration
        Courant-Fischer Minimax Theorem
        Interlace Theorem
        Lanczos algorithm
        Bisection
        Inverse iteration
        Cuppen's method
        Trace minimization
        Jacobi's method
    Of nonsymmetric matrices
        Hessenberg reduction
        HQR algorithm
        Arnoldi algorithm
        Nonsymmetric Lanczos algorithm
        Sign-function
    Of pairs of matrices (or more general problems)
Singular value decomposition (SVD)
SVD of pairs of matrices (or more general problems)
-----------------------------------------------------------------------------
Laplace's or Poisson's equation
    Discretization using finite difference method
    Discretization using finite element method
    Solution using Jacobi or SOR
    Solution using Domain decomposition
    Solution using Multigrid
    Solution using Fast Fourier Transform (FFT)
Fast Multipole Method, or Barnes-Hut
-----------------------------------------------------------------------------
We did a quick review of the summary to see which (broad) topics the participants were especially interested in. The topics to which most of them responded, and for which the average interest was < 2 ("most interesting"), follow.
Parallel sorting
PVM or other message passing system
Cray T3D
Sun, SGI shared memory systems
ScaLAPACK
Parallelization tools
Numerical stability
Blocking for the memory hierarchy in matrix multiplication
Sparse Gaussian elimination
Parallel Gaussian elimination
Preconditioning
Parallel algorithms for iterative methods
Symmetric and nonsymmetric eigenvalue problems
Solution of the Poisson equation using domain decomposition, multigrid, FFT, and the fast multipole method
Applying the same criterion to familiarity yielded the following topics (i.e., most respondents are "quite familiar" or "somewhat familiar" with them):
UNIX, X-Windows, Mosaic
Pipelining, Vectorization, Memory Hierarchy, Cache, Floating Point
Fortran, C, Matlab
Numerical Stability
Partial Pivoting
Cholesky factorization
Gaussian elimination for band matrices
Jacobi's method and SOR for iterative solution of Ax=b
QR iteration
Solution of the Poisson equation using finite differences
We received a total of 21 responses from the 34 participants (those who are not speaking). Most of the respondents didn't answer all the questions.

Participants by Institution Type (all participants other than speakers):
    University
        Mathematics: 12
        Computer Science: 8
        Engineering: 5
        Physics: 4
        Earth Science: 1
    Government: 2
    Industry: 2

Status (respondents only):
    Graduate student: 10
    Undergraduate: 1
    Faculty: 8
    Post-doc: 1
    Software Engineer: 1

Computing Backgrounds (I've omitted information available elsewhere in the survey):
    UNIX Workstations: 19
    PCs and/or Macs: 7
    Assembler: 2

Mathematics Backgrounds. This is tough to summarize; many of the responses didn't provide enough information to identify coursework definitely.
    Undergraduate Numerical Analysis: 3
    Graduate Numerical Analysis: 14
    Graduate Discrete Math: 1
    Engineering Numerical Analysis: 2
    PDEs: 1
    Mathematical Physics: 1
    Linear Algebra: 4
    Applied Mathematics/Modeling: 2
    Operations Research: 1
    Numerical Linear Algebra: 2
    Numerical Solution of PDEs: 3
    Computational Fluid Dynamics: 2
    Graduate Physics: 4
    Sparse Matrix Computations: 1

Most ambitious programming project:
    Developed communications system for a parallel computer.
    Postscript interpreter for a shared memory machine.
    Port of a preconditioner to a parallel environment.
    Parallel software for linear algebra over finite fields.
    Added diagnostic routines to a 3D chemical tracer model.
    Developing boundary layer physics subroutine.
    Parallelized 1D and 2D wavelet transforms using PVM.
    Developed parallel preconditioner for elliptic PDE solver.
    Code for generating random Wigner matrices.
    Interior point method for linear programming.
    Fortran simulation of free boundary fluid flow.
    Developed a programming environment for irregular computations on parallel machines.
    Implementation of fast, low-overhead distributed priority locks on an nCUBE 2.
    Ported general purpose 3D finite element code to MasPar and CM-5.
    Parallelization of PDE solvers, domain decomposition.
    CG codes for networks of workstations.
    Parallel numerical linear algebra package.
    Parallel iterative solver.
    3D adaptive domain decomposition.
    Monte Carlo simulation.
    Regression software.

Why do you want to take this class?
    Learn about parallel applications.
    Help decide on course of graduate study.
    Adapt ideas from numerical linear algebra to a discrete setting.
    Learn more about parallel computation.
    Develop improved representation of climate system.
    Solve systems of linear equations arising from PDEs on parallel machines.
    My main research interest is parallel numerical linear algebra.
    Upgrade my expertise in parallel computing and matrix computations.
    Work on wafer-scale parallel algorithms.
    Learn about parallel numerical linear algebra.
    Need a "starting point" to get abreast of the current state of the art in numerical mathematics.
    To reduce the computational costs of the numerical models I'm using.
    Widen my knowledge of parallel linear algebra.
    Learn the state of the art in numerical linear algebra and parallel processing; improve my teaching of a similar course.
    Continuing education.
    Help with current programming project.
    Help develop ideas for courses.

Do you have a particular problem/application you'd like to parallelize?
    No. -- 9 responses
    Jacobi method for eigenvalue and singular value problems and generalizations.
    Parallelize chemical tracer model.
    Estimate eigenspace associated with largest eigenvalues of self-adjoint matrices.
    Generate large normal random matrices.
    LU/Cholesky. Pivoting.
    Alternating direction implicit methods on shared and distributed memory machines.
    Eigenvalue problems.
    3D technology CAD simulation software.
    Determinant Monte Carlo codes.

======================================================================
The means for "Interest" don't include the 5's -- the "not sure" responses.
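The rule above -- interest means are computed after discarding the 5 = "not sure" responses -- can be sketched as follows. This is a minimal illustration of the stated convention, not the actual tabulation script; the sample responses are hypothetical.

```python
def interest_mean(responses):
    """Mean of interest ratings, excluding 5 ("not sure"),
    per the survey's convention for the "Interest" means."""
    counted = [r for r in responses if r != 5]
    # If everyone answered "not sure", there is no mean to report.
    return sum(counted) / len(counted) if counted else None

# Hypothetical responses: 1, 1, 2, 3, and one "not sure" (5).
# The 5 is dropped, so the mean is (1 + 1 + 2 + 3) / 4 = 1.75.
print(interest_mean([1, 1, 2, 3, 5]))
```

By the "most interesting" criterion used above, a topic with this response pattern (mean 1.75 < 2) would make the high-interest list.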
======================================================================
[In the original, each topic below was accompanied by ASCII histograms of the
response distributions; the mean scores are tabulated here.]

Familiarity: 1 = quite familiar ... 4 = unfamiliar
Interest:    1 = definitely want to know more ... 4 = not interesting

Topic                                                Familiarity  Interest
                                                       (mean)      (mean)
==========================================================================
UNIX                                                    1.05        3.22
Use of X-Windows                                        1.43        3.10
Mosaic or Netscape                                      1.24        3.21
==========================================================================
Computer block diagram                                  2.26        2.26
Pipelining                                              1.90        2.21
Vectorization                                           1.86        2.00
Memory Hierarchy                                        1.95        2.11
Cache                                                   1.95        2.00
Floating point arithmetic                               1.48        2.26
Race Condition                                          2.76        2.15
==========================================================================
Graph algorithms                                        2.86        2.15
Parallel graph algorithms                               3.45        2.05
Sorting algorithms                                      1.95        2.30
Parallel sorting algorithms                             3.15        1.84
Parallel prefix                                         3.33        2.55
==========================================================================
Fortran                                                 1.14        3.32
C                                                       1.52        3.00
Matlab                                                  1.95        3.00
CM-Fortran or other data parallel language              3.15        2.30
PVM or other message-passing system                     2.81        1.95
Split-C                                                 3.90        2.45
Other parallel languages                                3.08        2.42
    (languages mentioned: ZPL: 1, Overview: 1, Fortran 90: 1, HPF: 2, PICL: 1)
==========================================================================
CM-2                                                    3.10        2.65
CM-5                                                    2.89        2.37
Intel Paragon                                           3.00        2.05
Cray C90 (or XMP or YMP)                                2.75        2.25
Cray T3D                                                3.11        1.74
SGI, Sun or other parallel shared memory machines       2.56        1.72
Other parallel machine                                  2.36        2.00
    (machines mentioned: Proteus: 1, IBM SP/x: 5, Workstation Cluster: 1,
     Convex: 1, Meiko CS2: 1, nCUBE: 1, KSR: 2)
==========================================================================
Abstract Models of parallel machines                    4.00        1.67
PRAM                                                    3.22        2.50
LogP                                                    3.56        2.33
Other abstract models                                   3.75        2.50
    (other models mentioned: BSP: 1)
==========================================================================
Linpack                                                 2.30        2.63
Eispack                                                 2.45        2.74
BLAS                                                    2.25        2.44
LAPACK                                                  2.20        2.50
ScaLAPACK                                               2.90        1.80
Netlib                                                  2.10        2.72
Parallelization tools                                   3.39        1.72
    (tools mentioned: PETSc: 3, PYRROS: 1)
==========================================================================
Numerical Stability                                     1.68        1.59
Matrix Multiplication                                   1.58        1.55
Blocking for the memory hierarchy                       2.58        1.65
Strassen's method                                       2.95        2.06
==========================================================================
Gaussian Elimination                                    1.40        1.89
partial pivoting                                        1.56        2.53
Cholesky                                                1.50        2.59
Gauss. Elim. for band matrices                          1.89        2.50
Gauss. Elim. for sparse matrices                        2.12        1.88
elimination tree                                        2.83        1.94
supernodal algorithms                                   3.06        2.06
(multi)frontal algorithms                               2.83        2.00
Parallel algorithms for any of the above                2.94        1.25
    (algorithms mentioned: Gaussian Elimination/Cholesky: 2,
     Sparse Gaussian Elimination: 5, Dense and banded Gauss. Elim.: 1)
==========================================================================
Linear least squares problems                           2.31        1.75
    (one respondent expressed interest in sparse least squares problems)
QR decomposition                                        2.00        2.24
Householder transformations                             2.06        2.35
Givens transformations                                  2.47        2.24
normal equations                                        2.24        2.44
Gram-Schmidt process                                    2.12        2.47
Modified Gram-Schmidt                                   2.53        2.47
==========================================================================
Iterative Methods for Ax=b                              1.82        1.80
Jacobi                                                  1.58        2.28
Successive Overrelaxation                               1.79        2.26
Krylov subspace methods                                 2.20        2.07
Conjugate Gradient Method                               2.12        2.06
GMRES                                                   2.29        2.12
Other Krylov subspace methods                           2.29        1.14
    (other methods mentioned: QMR: 2, CGS: 2, BiCGSTAB: 1)
Preconditioning                                         2.00        1.72
Parallel algorithms for iterative methods               2.88        1.35
    (algorithms mentioned: GMRES: 1, Krylov subspace methods: 3,
     Preconditioners: 2, All: 2, Jacobi: 1, SOR: 1, Domain decomp. precond.: 1)
==========================================================================
Eigenvalues and Eigenvectors                            2.00        1.50
  Of symmetric matrices                                 2.27        1.91
    Rayleigh quotient                                   2.33        2.11
    Tridiagonal reduction                               2.06        1.94
    QR iteration                                        1.94        2.06
    Courant-Fischer Minimax Theorem                     2.78        2.11
    Interlace Theorem                                   2.78        2.11
    Lanczos algorithm                                   2.28        1.89
    Bisection                                           2.78        2.11
    Inverse iteration                                   2.50        2.17
    Cuppen's method                                     3.39        2.22
    Trace minimization                                  3.28        2.28
    Jacobi's method                                     2.61        2.06
  Of nonsymmetric matrices                              2.78        1.56
    Hessenberg reduction                                2.17        2.06
    HQR algorithm                                       2.89        2.00
    Arnoldi algorithm                                   2.74        1.94
    Nonsymmetric Lanczos algorithm                      2.72        1.89
    Sign-function                                       3.56        2.33
  Of pairs of matrices (or more general problems)       3.00        1.94
Singular value decomposition (SVD)                      2.35        2.00
SVD of pairs of matrices (or more general problems)     3.27        2.13
==========================================================================
Laplace's or Poisson's equation                         1.90        1.70
Discretization using finite difference method           1.79        2.37
Discretization using finite element method              2.21        2.11
Solution using Jacobi or SOR                            2.17        2.06
Solution using Domain decomposition                     2.74        1.47
Solution using Multigrid                                2.74        1.58
Solution using Fast Fourier Transform (FFT)             2.84        1.95
Fast Multipole Method, or Barnes-Hut                    3.26        1.89
==========================================================================