Overview
The module focuses on numerical methods for matrix computations. We look at the efficient solution of square linear systems, least squares problems, and the eigenvalue problem. We consider direct as well as iterative methods, with special attention to sparse matrices and structured matrices. Pitfalls such as numerical instability and ill-conditioning are pointed out. Model problems are drawn from partial differential equations and image processing. Theory, algorithmic aspects, and applications are emphasised in equal measure.
Test information
- The second test covers A&G Ch 6, 7 and the notes by Shewchuk.
- Suggested problems for Ch 6: A+G, Ex 6.4, # 0, 4, 9(b)
- Suggested problems for Ch 7: A+G, Ex 7.7, # 0, 3, 4, 6, 7, 8, 9, 10, 11, 16, 19.
- Hint for #9: Solved in Saad (link on the right), p. 116, Example 4.1. Note it is called Richardson iteration.
- Hint for #11: Damped Jacobi → M = D / ω
- Some problems from Trefethen & Bau.
- Additional problems from Leon's book.
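The damped Jacobi hint above (M = D / ω) can be sketched in a few lines of Python. This is a minimal illustration, not assignment code; the 1-D Poisson test matrix is our own choice of example.

```python
import numpy as np

def damped_jacobi(A, b, omega=2/3, tol=1e-8, maxiter=10000):
    """Damped (weighted) Jacobi: splitting M = D/omega, so each step is
    x <- x + omega * D^{-1} (b - A x)."""
    x = np.zeros_like(b, dtype=float)
    d = np.diag(A)                     # D, stored as a vector
    bnorm = np.linalg.norm(b)
    for _ in range(maxiter):
        r = b - A @ x                  # residual
        if np.linalg.norm(r) <= tol * bnorm:
            break
        x = x + omega * r / d
    return x

# Illustrative test problem: 1-D Poisson matrix tridiag(-1, 2, -1)
n = 20
A = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = damped_jacobi(A, b)
```

With ω = 2/3 the method damps the high-frequency error modes most strongly, which is why this value appears in the multigrid context.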
Lecturers
Lecturer | Office | Email |
---|---|---|
Prof. André Weideman | A315 | weideman at sun |
Dr Stéfan van der Walt | A314 | stefan at sun |
Lectures
Week | Topic (All section numbers refer to the book of Ascher & Greif) |
---|---|
1 | Review of undergraduate linear algebra (4.1). |
2 | Vector and matrix norms (4.2), special classes of matrices (4.3), singular values (4.4). |
3 | Gaussian Elimination and back substitution (5.1), LU decomposition (5.2) |
4 | Pivoting strategies (5.3), Cholesky decomposition (5.5) |
5 | Sparse matrices (5.6), permutations and ordering strategies (5.7), estimating errors and the condition number (5.8). |
6 | Least squares and the normal equations (6.1), orthogonal transformations and QR (6.2) |
7 | Test week |
8 | Vacation |
9 | Householder transformations and Gram-Schmidt orthogonalization (6.3) |
10 | Stationary iterative methods: Jacobi, Gauss-Seidel, SOR (7.1-7.3) |
11 | Method of Steepest Descent (7.4) |
12 | Methods of Conjugate Directions and Conjugate Gradients (7.4 + Shewchuk). See also the paper by Hestenes & Stiefel. |
13 | Convergence of Conjugate Gradients. |
14 | Other Krylov iterations (Arnoldi, Lanczos). |
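As a companion to weeks 11-13, here is a minimal sketch of the Conjugate Gradients iteration in Shewchuk's notation (r for the residual, d for the search direction). The 1-D Poisson test problem is illustrative, not from the course materials.

```python
import numpy as np

def conjugate_gradients(A, b, tol=1e-10, maxiter=None):
    """Plain CG for a symmetric positive definite A."""
    n = len(b)
    maxiter = maxiter or 10 * n
    x = np.zeros(n)
    r = b - A @ x                      # initial residual
    d = r.copy()                       # initial search direction
    rs_old = r @ r
    bnorm = np.linalg.norm(b)
    for _ in range(maxiter):
        Ad = A @ d
        alpha = rs_old / (d @ Ad)      # step length along d
        x += alpha * d
        r -= alpha * Ad                # recursive residual update
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * bnorm:
            break
        d = r + (rs_new / rs_old) * d  # new A-conjugate direction
        rs_old = rs_new
    return x

# Illustrative test problem: 1-D Poisson matrix tridiag(-1, 2, -1)
n = 50
A = 2*np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x = conjugate_gradients(A, b)
```

In exact arithmetic CG terminates in at most n steps; in practice the convergence rate is governed by the condition number, which motivates the preconditioners discussed later.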
Assignments
Nr | Topic | Hand-in |
---|---|---|
1 | Properties of the eigenvalues & vectors of square matrices, vector and matrix norms. (Solutions) | 17 February |
2 | Gaussian elimination, properties of matrices arising from the discretisation of PDEs (Solutions | sptools.py). Note that there are two 9-point discretisations of the Laplacian; both are outlined in the textbook by Saad and shown here, and you can use either one. | 1 March |
3 | Cholesky factorisation, instability in Gaussian elimination, re-ordering strategies, ill-conditioning | LU with Complete Pivoting: (MATLAB | Python). The MATLAB code is provided by Nick Henderson, Stanford University. | 28 March |
Test | Memo | |
4 | Least squares, Gram-Schmidt, Householder (Solutions) | 7 April |
5 | Stationary Iterative Methods, Optimal SOR Parameter, Preconditioners | 21 April |
6 | Conjugate Gradients, Preconditioned Conjugate Gradients, SSOR | 16 May (hand in at A314) |
The following Python code may come in handy for specifying a pre-conditioner, P, since the argument to GMRES, PCG, etc. must be the inverse of the pre-conditioner (the operator M below):

```python
import scipy.sparse.linalg as spla

P_inv = lambda x: spla.spsolve(P, x)
M = spla.LinearOperator((n, n), P_inv)
```
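A hedged end-to-end usage sketch of the pre-conditioner snippet above, here with `scipy.sparse.linalg.cg` (PCG) rather than GMRES; the 1-D Poisson system and the Jacobi (diagonal) pre-conditioner are illustrative assumptions, not assignment data.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Illustrative sparse system: 1-D Poisson matrix tridiag(-1, 2, -1)
n = 100
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csc')
b = np.ones(n)

# Jacobi (diagonal) pre-conditioner P, wrapped so that M applies P^{-1}
P = sp.diags(A.diagonal(), format='csc')
P_inv = lambda x: spla.spsolve(P, x)
M = spla.LinearOperator((n, n), matvec=P_inv)

x, info = spla.cg(A, b, M=M)   # info == 0 signals convergence
```

The same `M` can be passed to `spla.gmres(A, b, M=M)`; SciPy's Krylov solvers all expect the operator that applies the *inverse* of the pre-conditioner.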
Golden Rules of Numerical Linear Algebra
- Never compute the matrix inverse.
- Never compute a determinant using co-factors.
- Never compute eigenvalues using the characteristic polynomial.
- Never compute QR using straight Gram-Schmidt (i.e., without modification).
- Never compute A^{T}A when solving least squares problems.
- Never compute the eigenvalues of A to determine whether it is symmetric positive definite. Rather compute the Cholesky factorisation; if successful, A is s.p.d.
- Never compute det(A) to decide if A is approximately singular.
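The second-to-last rule translates directly into code. A minimal NumPy sketch (the helper name `is_spd` is ours): attempt the Cholesky factorisation and treat success as the certificate of positive definiteness, never computing eigenvalues.

```python
import numpy as np

def is_spd(A):
    """Test whether A is symmetric positive definite by attempting a
    Cholesky factorisation (O(n^3/3) work, no eigenvalues computed)."""
    if not np.allclose(A, A.T):
        return False
    try:
        np.linalg.cholesky(A)   # raises LinAlgError if A is not p.d.
        return True
    except np.linalg.LinAlgError:
        return False

A_spd = np.array([[4.0, 1.0], [1.0, 3.0]])   # s.p.d.
A_ind = np.array([[1.0, 2.0], [2.0, 1.0]])   # symmetric but indefinite
```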