\end{array}
\label{ch12:eq:13}
\end{equation}
GMRES uses the Arnoldi iterations~\cite{ch12:ref5}\index{iterative method!Arnoldi iterations} to construct an
orthonormal basis $V_k$ for the Krylov subspace $\mathcal{K}_k$ and an upper Hessenberg matrix\index{Hessenberg matrix}
$\bar{H}_k$ of order $(k+1)\times k$:
\begin{equation}
A V_k = V_{k+1} \bar{H}_k.
\label{ch12:eq:14}
\end{equation}
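As an illustration of this construction, the following plain C sketch builds $V_{k+1}$ and
$\bar{H}_k$ with the modified Gram--Schmidt variant of the Arnoldi iterations. It uses a dense
matrix and sequential loops purely for readability, whereas the chapter targets sparse matrices
and GPU kernels; the routine names (\texttt{arnoldi}, \texttt{dot}) and the storage conventions
are illustrative and do not come from the chapter's code.
\begin{verbatim}
/* A minimal dense sketch of the Arnoldi iterations (modified Gram-Schmidt).
 * It fills V (n x (k+1), stored column by column) with an orthonormal basis
 * and H ((k+1) x k, row-major, assumed zero-initialized by the caller) with
 * the upper Hessenberg matrix, so that A*V_k = V_{k+1}*H_k by construction.
 * The chapter works with sparse matrices and GPU kernels; this sequential
 * version only illustrates the recurrence. */
#include <math.h>

static double dot(int n, const double *x, const double *y) {
    double s = 0.0;
    for (int i = 0; i < n; i++) s += x[i] * y[i];
    return s;
}

/* A: n x n dense (row-major), r0: initial residual, k: subspace size. */
void arnoldi(int n, const double *A, const double *r0, int k,
             double *V, double *H) {
    double beta = sqrt(dot(n, r0, r0));
    for (int i = 0; i < n; i++) V[i] = r0[i] / beta;    /* v_1 = r0/||r0|| */

    for (int j = 0; j < k; j++) {
        double *w = V + (j + 1) * n;                    /* w = A*v_{j+1} */
        for (int i = 0; i < n; i++) {
            double s = 0.0;
            for (int l = 0; l < n; l++) s += A[i * n + l] * V[j * n + l];
            w[i] = s;
        }
        for (int i = 0; i <= j; i++) {                  /* orthogonalize w */
            double h = dot(n, w, V + i * n);
            H[i * k + j] = h;
            for (int l = 0; l < n; l++) w[l] -= h * V[i * n + l];
        }
        double h = sqrt(dot(n, w, w));                  /* subdiagonal entry */
        H[(j + 1) * k + j] = h;
        if (h > 0.0)
            for (int l = 0; l < n; l++) w[l] /= h;      /* next basis vector */
    }
}
\end{verbatim}
At step $j$ the routine fills column $j+1$ of \texttt{V} and column $j$ of \texttt{H}, so the
relation~(\ref{ch12:eq:14}) holds by construction.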
Algorithm~\ref{ch12:alg:02} shows the key points of the GMRES method with restarts.
It solves the left-preconditioned\index{sparse linear system!preconditioned} sparse linear
system~(\ref{ch12:eq:11}), where $M$ is the preconditioning matrix. At each iteration
$k$, GMRES uses the Arnoldi iterations\index{iterative method!Arnoldi iterations} (defined from
line~$7$ to line~$17$) to construct a basis $V_m$ of $m$ orthonormal vectors and an upper
Hessenberg matrix\index{Hessenberg matrix} $\bar{H}_m$ of size $(m+1)\times m$. Then, it
solves the linear least-squares problem of size $m$ to find the vector $y\in\mathbb{R}^{m}$
which minimizes the residual norm $\|\beta e_{1}-\bar{H}_{m}y\|_{2}$, where $\beta$ is the
norm of the preconditioned residual and $e_{1}$ is the first vector of the canonical basis.
Finally, it computes the new approximate solution $x_{m}=x_{0}+V_{m}y$.
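To connect the restart mechanism with this least-squares step, the sketch below reuses the
\texttt{arnoldi} routine above inside a restart loop. For this illustration, the preconditioned
operator $M^{-1}A$ and right-hand side $M^{-1}b$ are assumed to be formed explicitly by the
caller (affordable only because $M$ is diagonal), and the small $(m+1)\times m$ least-squares
problem is solved with Givens rotations; the chapter instead applies the preconditioner and the
sparse matrix-vector product as separate kernels, and all names here are again illustrative only.
\begin{verbatim}
/* A structural sketch of GMRES(m) with restarts and left preconditioning,
 * reusing the arnoldi() routine above.  pA = M^{-1}A and pb = M^{-1}b are
 * assumed to be formed explicitly by the caller. */
#include <math.h>
#include <string.h>

void arnoldi(int n, const double *A, const double *r0, int k,
             double *V, double *H);

/* Workspaces: V is n*(m+1), H is (m+1)*m, r is n, y is m. */
void gmres_restart(int n, const double *pA, const double *pb, double *x,
                   int m, double eps, int maxiter,
                   double *V, double *H, double *r, double *y)
{
    for (int iter = 0; iter < maxiter; iter++) {     /* outer restart loop */
        /* preconditioned residual r = pb - pA*x = M^{-1}(b - A*x) */
        double beta = 0.0;
        for (int i = 0; i < n; i++) {
            double s = pb[i];
            for (int j = 0; j < n; j++) s -= pA[i * n + j] * x[j];
            r[i] = s;
            beta += s * s;
        }
        beta = sqrt(beta);
        if (beta < eps) return;                      /* converged */

        memset(H, 0, (size_t)(m + 1) * m * sizeof(double));
        arnoldi(n, pA, r, m, V, H);                  /* build V_{m+1}, H_m */

        /* Solve min ||beta*e1 - H*y||_2 with Givens rotations
         * (assumes no breakdown, i.e. the pivots stay nonzero). */
        double g[m + 1];                             /* C99 VLA; m is small */
        g[0] = beta;
        for (int i = 1; i <= m; i++) g[i] = 0.0;
        for (int j = 0; j < m; j++) {
            double hjj = H[j * m + j], hsub = H[(j + 1) * m + j];
            double d = sqrt(hjj * hjj + hsub * hsub);
            double c = hjj / d, s = hsub / d;
            for (int l = j; l < m; l++) {            /* rotate rows j, j+1 */
                double t1 = H[j * m + l], t2 = H[(j + 1) * m + l];
                H[j * m + l] = c * t1 + s * t2;
                H[(j + 1) * m + l] = -s * t1 + c * t2;
            }
            double t1 = g[j], t2 = g[j + 1];
            g[j] = c * t1 + s * t2;
            g[j + 1] = -s * t1 + c * t2;
        }
        for (int i = m - 1; i >= 0; i--) {           /* back substitution */
            double s = g[i];
            for (int l = i + 1; l < m; l++) s -= H[i * m + l] * y[l];
            y[i] = s / H[i * m + i];
        }

        for (int j = 0; j < m; j++)                  /* x = x + V_m * y */
            for (int i = 0; i < n; i++)
                x[i] += V[j * n + i] * y[j];
    }
}
\end{verbatim}
The least-squares problem stays small ($(m+1)\times m$, with $m=16$ in the experiments below),
so its cost is negligible compared to that of the sparse matrix-vector products.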
All tests are carried out in double-precision floating-point arithmetic. The parameters of both linear
solvers are initialized as follows: the residual tolerance threshold $\varepsilon=10^{-12}$, the
maximum number of iterations $maxiter=500$, the right-hand side $b$ is filled with $1.0$, and the
initial guess $x_0$ is filled with $0.0$. In addition, we limited the Arnoldi iterations\index{iterative method!Arnoldi iterations}
used in the GMRES method to $16$ iterations ($m=16$). For the sake of simplicity, we have chosen
the preconditioner $M$ as the main diagonal of the sparse matrix $A$. This choice allows the
required inverse matrix $M^{-1}$ to be computed easily, and it provides relatively good preconditioning for