From: lilia
Date: Fri, 10 Oct 2014 09:08:23 +0000 (+0200)
Subject: 10-10-2014 05
X-Git-Url: https://bilbo.iut-bm.univ-fcomte.fr/and/gitweb/GMRES2stage.git/commitdiff_plain/61de3472de119515c917449ea4aea0a3240c8cb0?ds=sidebyside

10-10-2014 05
---

diff --git a/paper.tex b/paper.tex
index 2372b61..4cd16c3 100644
--- a/paper.tex
+++ b/paper.tex
@@ -626,8 +626,8 @@ inner solver. The current approximation of the Krylov method is then stored insi
 $S$ composed by the successive solutions that are computed during inner
 iterations.
 At each $s$ iterations, the minimization step is applied in order to
-compute a new solution $x$. For that, the previous residuals are computed with
-$(b-AS)$. The minimization of the residuals is obtained by
+compute a new solution $x$. For that, the residuals of $Ax=b$ associated with the
+solutions produced by the inner iterations are computed as $(b-AS)$. The minimization of these residuals is obtained by
 \begin{equation}
   \underset{\alpha\in\mathbb{R}^{s}}{min}\|b-R\alpha\|_2
   \label{eq:01}
@@ -654,7 +654,7 @@ appropriate than a single direct method in a parallel context.
   \State $S_{k \mod s}=x^k$ \label{algo:store}
   \If {$k \mod s=0$ {\bf and} error$>\epsilon_{kryl}$}
     \State $R=AS$ \Comment{compute dense matrix} \label{algo:matrix_mul}
-    \State Solve least-square problem $\underset{\alpha\in\mathbb{R}^{s}}{min}\|b-R\alpha\|_2$ \label{algo:}
+    \State $\alpha=Solve\_Least\_Squares(R,b,max\_iter_{ls})$ \label{algo:}
     \State $x^k=S\alpha$ \Comment{compute new solution}
   \EndIf
 \EndFor
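
For illustration only, below is a minimal Python/NumPy sketch of the minimization step that the patched lines describe: every $s$ inner iterations, the stored solutions $S$ give $R=AS$, the small least-squares problem $\min_{\alpha}\|b-R\alpha\|_2$ is solved, and the new iterate is $x=S\alpha$. Only the names Solve_Least_Squares and max_iter_ls come from the pseudocode; the function minimization_step, the use of SciPy's lsqr as the least-squares solver, and the random test data are assumptions made here and are not the authors' implementation.

# Illustrative sketch of the minimization step; not the authors' code.
import numpy as np
from scipy.sparse import identity, random as sparse_random
from scipy.sparse.linalg import lsqr

def minimization_step(A, S, b, max_iter_ls=20):
    """R = A S, alpha = argmin ||b - R alpha||_2, x = S alpha.
    lsqr with iter_lim plays the role of Solve_Least_Squares(R, b, max_iter_ls)."""
    R = A @ S                                    # dense n x s matrix (algo:matrix_mul)
    alpha = lsqr(R, b, iter_lim=max_iter_ls)[0]  # iterative least-squares solve
    return S @ alpha                             # new solution x = S * alpha

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, s = 200, 8
    A = sparse_random(n, n, density=0.05, random_state=0, format="csr")
    A = A + A.T + 4.0 * identity(n)              # well-conditioned test matrix (assumed)
    b = rng.standard_normal(n)
    S = rng.standard_normal((n, s))              # stand-in for the stored inner solutions
    x = minimization_step(A, S, b)
    print("residual norm:", np.linalg.norm(b - A @ x))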