The GMRES method uses the Arnoldi process~\cite{Arn51} to build an orthonormal basis $V_k$
and an upper Hessenberg matrix $\bar{H}_k$ of order $(k+1)\times k$:
\begin{equation}
\begin{array}{lll}
 V_{k}=\{v_{1},v_{2},...,v_{k}\}, & \text{such that} & \text{span}\{v_{1},...,v_{k}\}=\text{span}\{v_{1},Av_{1},...,A^{k-1}v_{1}\},
\end{array}
\end{equation}
and
\begin{equation}
 AV_{k} = V_{k+1}\bar{H}_{k}.
\label{eq:02}
\end{equation}
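As a concrete illustration, the Arnoldi process and the relation $AV_{k} = V_{k+1}\bar{H}_{k}$ can be sketched in Python as follows. This is a minimal NumPy sketch using modified Gram-Schmidt orthogonalization; the function name and the test matrices are illustrative, and no breakdown handling is included.

```python
import numpy as np

def arnoldi(A, v1, k):
    """Build an orthonormal basis V of k+1 vectors and the (k+1) x k
    upper Hessenberg matrix H such that A @ V[:, :k] == V @ H."""
    n = len(v1)
    V = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    V[:, 0] = v1 / np.linalg.norm(v1)
    for j in range(k):
        w = A @ V[:, j]
        # Modified Gram-Schmidt: orthogonalize w against v_1, ..., v_{j+1}
        for i in range(j + 1):
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        V[:, j + 1] = w / H[j + 1, j]
    return V, H
```

The columns of $V_{k+1}$ are orthonormal by construction, and the subdiagonal entries $h_{j+1,j}$ are the norms of the orthogonalized vectors, which gives $\bar{H}_k$ its Hessenberg shape.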
Then at each iteration $k$, a solution $x_k$ is computed in the Krylov sub-space $\mathcal{K}_{k}$
spanned by $V_k$ as follows:
\begin{equation}
\begin{array}{ll}
 x_{k} = x_{0} + V_{k}y, & y\in\mathbb{R}^{k}.
\end{array}
\label{eq:03}
\end{equation}
From both equations~(\ref{eq:02}) and~(\ref{eq:03}), we can deduce:
\begin{equation}
\begin{array}{lll}
 r_{k} & = & b - A (x_{0} + V_{k}y) \\
 & = & r_{0} - AV_{k}y \\
 & = & \beta v_{1} - V_{k+1}\bar{H}_{k}y \\
 & = & V_{k+1}(\beta e_{1} - \bar{H}_{k}y),
\end{array}
\end{equation}
where $\beta=\|r_{0}\|_{2}$ and $e_{1}=(1,0,...,0)$ is the first vector of the canonical basis of
$\mathbb{R}^{k+1}$. The vector $y$ is computed in $\mathbb{R}^{k}$ so as to minimize the Euclidean
norm of the residual $r_{k}$. This means that the following least-squares problem of size $k$ must
be solved:
\begin{equation}
 \min_{y\in\mathbb{R}^{k}}\|r_{k}\|_{2}=\min_{y\in\mathbb{R}^{k}}\|\beta e_{1}-\bar{H}_{k}y\|_{2}.
\end{equation}
The solution of this problem is obtained from the QR factorization of the Hessenberg matrix
$\bar{H}_{k}$, computed using Givens rotations~\cite{Saa03,Saa86}, such that:
\begin{equation}
\begin{array}{lll}
 \bar{H}_{k}=Q_{k}R_{k}, & Q_{k}\in\mathbb{R}^{(k+1)\times (k+1)}, & R_{k}\in\mathbb{R}^{(k+1)\times k},
\end{array}
\end{equation}
where $Q_{k}Q_{k}^{T}=I_{k+1}$ and $R_{k}$ is an upper triangular matrix.
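The Givens-rotation solve of this least-squares problem can be sketched as follows. This is a minimal NumPy sketch; the function name and test matrices are illustrative, and the near-zero subdiagonal case (breakdown) is not handled.

```python
import numpy as np

def solve_gmres_lsq(H, beta):
    """Solve min_y || beta*e1 - H @ y ||_2 for a (k+1) x k upper
    Hessenberg matrix H via a QR factorization by Givens rotations."""
    H = H.astype(float).copy()
    k = H.shape[1]
    g = np.zeros(k + 1)
    g[0] = beta                        # right-hand side beta * e1
    for j in range(k):
        # Givens rotation annihilating the subdiagonal entry H[j+1, j]
        a, b = H[j, j], H[j + 1, j]
        r = np.hypot(a, b)
        c, s = a / r, b / r
        G = np.array([[c, s], [-s, c]])
        H[j:j + 2, j:] = G @ H[j:j + 2, j:]
        g[j:j + 2] = G @ g[j:j + 2]
    # H[:k, :k] is now upper triangular; solve R y = g
    return np.linalg.solve(H[:k, :k], g[:k])
```

Each rotation zeroes one subdiagonal entry of $\bar{H}_{k}$, so after $k$ rotations the matrix is upper triangular and the minimizer is obtained by a triangular solve.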

In exact arithmetic, the GMRES method finds the exact solution in at most $n$ iterations, so the
Krylov sub-space dimension can grow up to $n$, the size of the linear system. In order to avoid
the large storage cost of the orthonormal basis $V_{k}$, the Arnoldi process is restarted after
$m\ll n$ iterations, with the last iterate $x_{m}$ used as the initial guess for the next restart.
Moreover, GMRES sometimes does not converge, or takes too many iterations to satisfy the convergence
criterion. Therefore, in most cases, GMRES includes a preconditioning step to improve its convergence.
The preconditioning technique replaces the system~(\ref{eq:01}) with the modified system:
\begin{equation}M^{-1}Ax=M^{-1}b.\end{equation}

Algorithm~\ref{alg:01} outlines the main steps of the restarted GMRES method.
The linear system solved by this algorithm is left-preconditioned, where $M$ is the preconditioning
matrix. At each iteration, the Arnoldi process is used (from line~$7$ to line~$17$ of Algorithm~\ref{alg:01})
to construct an orthonormal basis $V_m$ and an upper Hessenberg matrix $\bar{H}_m$ of order $(m+1)\times m$.
Then, the least-squares problem is solved (line~$18$) to find the vector $y\in\mathbb{R}^m$ that minimizes
the residual. Finally, the solution $x_m$ is computed in the Krylov sub-space spanned by $V_m$ (line~$19$).
In practice, the GMRES algorithm stops when the Euclidean norm of the residual is small enough and/or
the maximum number of iterations is reached.
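The steps above can be combined into a sketch of left-preconditioned restarted GMRES. This is a minimal NumPy illustration, not the paper's implementation: the function name and parameters are assumptions, the inner least-squares problem is solved with a dense solver rather than Givens rotations for brevity, and the preconditioner is passed as a function applying $M^{-1}$.

```python
import numpy as np

def gmres_restarted(A, b, M_inv, m=10, tol=1e-8, max_restarts=50, x0=None):
    """Sketch of left-preconditioned GMRES(m): solve M^{-1} A x = M^{-1} b,
    restarting the Arnoldi process every m iterations."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(max_restarts):
        r = M_inv(b - A @ x)              # preconditioned residual
        beta = np.linalg.norm(r)
        if beta < tol:
            break
        V = np.zeros((n, m + 1))
        H = np.zeros((m + 1, m))
        V[:, 0] = r / beta
        k = m
        for j in range(m):                # Arnoldi process on M^{-1} A
            w = M_inv(A @ V[:, j])
            for i in range(j + 1):
                H[i, j] = V[:, i] @ w
                w -= H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            if H[j + 1, j] < 1e-14:       # happy breakdown: basis is complete
                k = j + 1
                break
            V[:, j + 1] = w / H[j + 1, j]
        e1 = np.zeros(k + 1)
        e1[0] = beta
        # Least-squares solve min_y || beta*e1 - H y ||_2 (dense, for brevity)
        y, *_ = np.linalg.lstsq(H[:k + 1, :k], e1, rcond=None)
        x = x + V[:, :k] @ y              # x_m = x_0 + V_m y
    return x
```

A simple usage example is a diagonally dominant system with a Jacobi preconditioner, i.e. `M_inv = lambda v: v / np.diag(A)`, for which a small restart parameter already suffices.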