Iterative Linear System Solvers

Solve large linear systems \( Ax = b \) using iterative approximation methods. These methods are memory-efficient and well suited to large sparse matrices.

Jacobi Method

Simple iterative method using simultaneous updates

Convergence: Conditionally convergent
Parallelization: Excellent
Features:
  • Simultaneous updates
  • Simple implementation
  • Memory efficient
Update Rule:
x_i^{(k+1)} = (b_i - Σ_{j≠i} a_{ij} x_j^{(k)}) / a_{ii}
Convergence Condition:

Strictly diagonally dominant matrices
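The update rule above can be sketched in pure Python (the function name and tolerance defaults here are illustrative, not part of any particular library):

```python
def jacobi(A, b, x0=None, tol=1e-10, max_iter=500):
    """Solve Ax = b with the Jacobi method.

    A is a list of rows. Convergence is guaranteed when A is
    strictly diagonally dominant.
    """
    n = len(A)
    x = list(x0) if x0 is not None else [0.0] * n
    for _ in range(max_iter):
        # Every component of the new iterate uses only the OLD iterate,
        # which is why Jacobi parallelizes so well.
        x_new = [
            (b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
            for i in range(n)
        ]
        if max(abs(x_new[i] - x[i]) for i in range(n)) < tol:
            return x_new
        x = x_new
    return x

# Example: a strictly diagonally dominant system with solution (1, 1, 1).
A = [[4.0, 1.0, 1.0],
     [1.0, 5.0, 2.0],
     [0.0, 1.0, 3.0]]
b = [6.0, 8.0, 4.0]
x = jacobi(A, b)
```

Note that `x_new` is built as a whole before replacing `x`: the simultaneous update is what distinguishes Jacobi from Gauss-Seidel.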

Gauss-Seidel Method

Improved Jacobi method using sequential updates

Convergence: Typically faster than Jacobi
Parallelization: Limited
Features:
  • Sequential updates
  • Faster convergence
  • Uses latest values
Update Rule:
x_i^{(k+1)} = (b_i - Σ_{j<i} a_{ij} x_j^{(k+1)} - Σ_{j>i} a_{ij} x_j^{(k)}) / a_{ii}
Convergence Condition:

Symmetric positive definite or strictly diagonally dominant matrices
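The sequential update rule above can be sketched the same way (again, the function name and defaults are illustrative):

```python
def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=500):
    """Solve Ax = b with the Gauss-Seidel method.

    Unlike Jacobi, each component update immediately uses the
    newest values computed earlier in the same sweep.
    """
    n = len(A)
    x = list(x0) if x0 is not None else [0.0] * n
    for _ in range(max_iter):
        diff = 0.0
        for i in range(n):
            # x[j] for j < i already holds the (k+1) value from this
            # sweep; x[j] for j > i still holds the (k) value.
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new = (b[i] - s) / A[i][i]
            diff = max(diff, abs(x_new - x[i]))
            x[i] = x_new
        if diff < tol:
            return x
    return x

# Same example system as before, with solution (1, 1, 1).
A = [[4.0, 1.0, 1.0],
     [1.0, 5.0, 2.0],
     [0.0, 1.0, 3.0]]
b = [6.0, 8.0, 4.0]
x = gauss_seidel(A, b)
```

Because `x` is overwritten in place, the inner loop carries a dependency from component to component; this is the same reason the method is hard to parallelize.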

Method Comparison
Method        Memory  Convergence Rate      Parallelizable  Best For
Jacobi        Low     Linear                Yes             Large sparse systems
Gauss-Seidel  Low     Linear (faster rate)  Limited         Sequential processing
About Iterative Methods

Iterative methods solve linear systems by generating a sequence of approximations that converge to the exact solution. They are particularly useful for:

  • Large sparse systems: Memory efficient storage
  • Approximate solutions: Iteration can stop once the desired accuracy is reached
  • Parallel computing: Some methods parallelize well
  • Preconditioned systems: Can be combined with preconditioning
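The first two points can be made concrete with two small helpers: a check for strict diagonal dominance (a common sufficient condition for convergence of both methods) and a residual-based stopping test. The helper names below are illustrative:

```python
def is_strictly_diagonally_dominant(A):
    # Sufficient (not necessary) condition for Jacobi/Gauss-Seidel
    # convergence: each diagonal entry dominates, in absolute value,
    # the sum of the other entries in its row.
    n = len(A)
    return all(
        abs(A[i][i]) > sum(abs(A[i][j]) for j in range(n) if j != i)
        for i in range(n)
    )

def residual_norm(A, b, x):
    # Infinity norm of the residual r = b - Ax; iteration can stop
    # once this drops below the desired accuracy.
    n = len(A)
    return max(
        abs(b[i] - sum(A[i][j] * x[j] for j in range(n)))
        for i in range(n)
    )

A = [[4.0, 1.0, 1.0],
     [1.0, 5.0, 2.0],
     [0.0, 1.0, 3.0]]
b = [6.0, 8.0, 4.0]
ok = is_strictly_diagonally_dominant(A)
```

The residual test is often preferred over comparing successive iterates, since a small step between iterates does not by itself guarantee a small error.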