The present paper is organized as follows. Section~\ref{sec:02} presents some
related work. Section~\ref{sec:03} presents our two-stage algorithm using a
least-squares residual minimization. Section~\ref{sec:04} describes some
convergence results on this method. Section~\ref{sec:05} shows some experimental
results obtained on large clusters with our algorithm, using routines of the
PETSc toolkit. Finally, Section~\ref{sec:06} concludes and gives some
perspectives.
%%%*********************************************************
\item $\epsilon_{ls}$ the threshold used to stop the least-squares method (see the sketch after this list)
\end{itemize}
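To make the role of $\epsilon_{ls}$ concrete, below is a minimal sketch of how
such a threshold could be passed to a PETSc least-squares solver (here LSQR,
one possible choice). The helper name \texttt{setup\_ls\_solver}, the operator
\texttt{S}, and the parameter names are illustrative assumptions, not the
actual implementation; the sketch also assumes a recent PETSc release for the
\texttt{PetscCall} macro.
\begin{verbatim}
#include <petscksp.h>

/* Hypothetical helper: configure an LSQR solver so that it stops
 * once the relative residual falls below eps_ls.  All names here
 * (setup_ls_solver, S, eps_ls, max_it) are illustrative only. */
PetscErrorCode setup_ls_solver(Mat S, PetscReal eps_ls,
                               PetscInt max_it, KSP *ksp)
{
  PetscFunctionBeginUser;
  PetscCall(KSPCreate(PETSC_COMM_WORLD, ksp));
  PetscCall(KSPSetOperators(*ksp, S, S));
  PetscCall(KSPSetType(*ksp, KSPLSQR));
  /* eps_ls acts as the relative tolerance that
   * stops the least-squares iteration. */
  PetscCall(KSPSetTolerances(*ksp, eps_ls, PETSC_DEFAULT,
                             PETSC_DEFAULT, max_it));
  PetscFunctionReturn(PETSC_SUCCESS);
}
\end{verbatim}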
The parallelization of TSARM relies on the parallelization of each of its
parts. More precisely, except for the least-squares step, all the other parts
consist of classical operations: dot products, norms, and vector
multiplications and additions. All these operations are easy to implement in
PETSc or similar environments.
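As a minimal sketch, the hypothetical fragment below shows how these classical
kernels map onto PETSc routines (the names \texttt{A}, \texttt{x}, \texttt{b}
and \texttt{r} are ours, not the actual code). Each routine operates on
distributed vectors, so it is already parallel without any extra
communication code.
\begin{verbatim}
#include <petsc.h>

/* Hypothetical fragment: the classical operations used outside
 * the least-squares step, expressed with PETSc routines that are
 * parallel over distributed vectors by construction. */
PetscErrorCode tsarm_kernels(Mat A, Vec x, Vec b, Vec r)
{
  PetscScalar dot;
  PetscReal   norm;
  PetscFunctionBeginUser;
  PetscCall(MatMult(A, x, r));           /* r = A x                */
  PetscCall(VecAYPX(r, -1.0, b));        /* r = b - A x (residual) */
  PetscCall(VecNorm(r, NORM_2, &norm));  /* norm of the residual   */
  PetscCall(VecDot(r, b, &dot));         /* a dot product          */
  PetscCall(VecAXPY(x, 1.0, r));         /* vector add: x = x + r  */
  PetscFunctionReturn(PETSC_SUCCESS);
}
\end{verbatim}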

%%%*********************************************************
%%%*********************************************************

\section{Convergence results}
\label{sec:04}

%%%*********************************************************
%%%*********************************************************
\section{Experiments using PETSc}
\label{sec:05}
In order to study the behavior of our algorithm with only one processor, we first
%%%*********************************************************
%%%*********************************************************
\section{Conclusion}
\label{sec:06}
%%%*********************************************************
%%%*********************************************************