Iterative Methods - Jacobi, Gauss-Seidel, Power, and SOR

In many computer and communication system models, the probability transition matrix is very large but also sparse. In this case, iterative methods are particularly attractive, since they preserve the sparsity of the matrix. In these methods, a sequence of approximate values $\pi^{(k)}$ for the stationary probability vector is generated, which should converge to the solution $\pi$. Each iteration of a method has a cost approximately equal to that of multiplying a vector by a (sparse) matrix. Therefore, the number of iterations is crucial in determining the total cost of the algorithm.
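For concreteness, the sketch below (not taken from the tool itself) shows how a single iteration reduces to a sparse vector-matrix product, with the transition matrix stored as a dictionary of nonzero entries per row; the 3-state chain and the function name are illustrative assumptions only.

\begin{verbatim}
# Illustrative sketch: one iteration step pi^(k+1) = pi^(k) * P,
# with P stored sparsely as {row: {column: probability}}.
# The 3-state chain below is a made-up example.
P = {
    0: {0: 0.5, 1: 0.5},
    1: {0: 0.25, 1: 0.5, 2: 0.25},
    2: {1: 0.5, 2: 0.5},
}

def sparse_vec_mat(pi, P):
    """Compute pi * P, touching only the nonzero entries of P."""
    new_pi = [0.0] * len(pi)
    for i, row in P.items():
        for j, p in row.items():
            new_pi[j] += pi[i] * p
    return new_pi

pi = [1.0 / 3] * 3          # initial guess pi^(0)
pi = sparse_vec_mat(pi, P)  # one iteration; cost ~ number of nonzeros
\end{verbatim}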

The Jacobi, Gauss-Seidel, and Successive Over-Relaxation (SOR) methods are classical methods for solving linear systems. These methods, together with the Power method, are used to solve the system $\pi=\pi P$, where $P$ is a stochastic matrix. The interface for the iterative methods is shown in Figure [*].

Figure: The Stationary Iterative Methods.
\includegraphics[width=4in]{figuras/analyticalstationaryiterative.eps}
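To illustrate how these methods differ from the plain Power iteration, the sketch below performs one Gauss-Seidel sweep for $\pi=\pi P$: each component is recomputed from $\pi_j = \sum_{i \neq j} \pi_i p_{ij} / (1 - p_{jj})$, immediately reusing the components already updated in the same sweep. The column-wise storage and the final renormalization are assumptions of this sketch, not a description of the tool's implementation.

\begin{verbatim}
# Illustrative Gauss-Seidel sweep for pi = pi * P (not the tool's code).
# P is stored column-wise: incoming[j] = {i: p_ij}, so that each
# pi[j] can be recomputed from the states that lead into state j.
incoming = {
    0: {0: 0.5, 1: 0.25},
    1: {0: 0.5, 1: 0.5, 2: 0.5},
    2: {1: 0.25, 2: 0.5},
}

def gauss_seidel_sweep(pi, incoming):
    """One sweep: pi[j] = sum_{i != j} pi[i]*p_ij / (1 - p_jj)."""
    for j, col in incoming.items():
        p_jj = col.get(j, 0.0)                # assumes p_jj < 1
        s = sum(pi[i] * p for i, p in col.items() if i != j)
        pi[j] = s / (1.0 - p_jj)              # uses already-updated pi[i]
    total = sum(pi)
    return [x / total for x in pi]            # renormalize to sum to 1

pi = gauss_seidel_sweep([1.0 / 3] * 3, incoming)
\end{verbatim}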

Two parameters must be specified for these methods: the precision and the maximum number of iterations. The maximum number of iterations serves as a stopping condition when convergence is slow: if the method has not converged after ``max number of iterations'' iterations, it stops running and the user is notified.
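A minimal sketch of how these two parameters could drive the iteration loop is given below. The use of the maximum absolute difference between successive iterates as the convergence test is an assumption made for illustration, as is the \texttt{step} argument standing for one iteration of any of the methods above.

\begin{verbatim}
def iterate(pi, step, precision, max_iterations):
    """Repeat `step` until successive iterates differ by less than
    `precision` (max-norm), or until `max_iterations` is reached."""
    for k in range(max_iterations):
        new_pi = step(pi)
        if max(abs(a - b) for a, b in zip(new_pi, pi)) < precision:
            return new_pi, k + 1, True       # converged
        pi = new_pi
    return pi, max_iterations, False         # not converged: notify the user

# Example: Power method on a hypothetical 2-state chain.
P = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}

def power_step(pi):
    new_pi = [0.0] * len(pi)
    for i, row in P.items():
        for j, p in row.items():
            new_pi[j] += pi[i] * p
    return new_pi

pi, iterations, converged = iterate([0.5, 0.5], power_step, 1e-9, 1000)
\end{verbatim}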
