$P$ has a stationary distribution $\pi$, then we define
$$d(t)=\max_{X\in\Bool^{\mathsf{N}}}\tv{P^t(X,\cdot)-\pi}$$
and
$$t_{\rm mix}(\varepsilon)=\min\{t \mid d(t)\leq \varepsilon\}.$$
Intuitively speaking, $t_{\rm mix}(\varepsilon)$ is a mixing time,
\textit{i.e.}, the time until the distribution $P^t(X,\cdot)$ of the Markov
chain, whatever its starting state $X\in\Bool^{\mathsf{N}}$, is
$\varepsilon$-close to the stationary distribution $\pi$.
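To make these definitions concrete, the following sketch (in Python with
NumPy; the lazy walk on $\Bool^{\mathsf{N}}$ and the threshold
$\varepsilon=1/4$ are illustrative assumptions, not the chains studied in
this paper) computes $d(t)$ by brute force and finds $t_{\rm mix}$ for a
small $\mathsf{N}$:
\begin{verbatim}
import numpy as np

# Illustrative chain (an assumption, not this paper's chains):
# lazy random walk on {0,1}^N, states encoded as integers 0..2^N-1.
N = 3
S = 1 << N
P = np.zeros((S, S))
for x in range(S):
    P[x, x] += 0.5                     # lazy: hold with probability 1/2
    for i in range(N):
        P[x, x ^ (1 << i)] += 0.5 / N  # otherwise flip a uniform bit

pi = np.full(S, 1.0 / S)               # uniform stationary distribution

def d(t):
    # max over starting states X of ||P^t(X,.) - pi||_TV
    Pt = np.linalg.matrix_power(P, t)
    return 0.5 * np.abs(Pt - pi).sum(axis=1).max()

def t_mix(eps=0.25):
    t = 0
    while d(t) > eps:
        t += 1
    return t

print(t_mix())                         # smallest t with d(t) <= 1/4
\end{verbatim}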
\subsection{Upper Bound on the Stopping Time}\label{sub:stop:bound}
A stopping time $\tau$ is a \emph{strong stationary time} if $X_{\tau}$ is
distributed according to the stationary distribution $\pi$ and is independent
of $\tau$.
\end{algorithm}
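A classical example, given here only for intuition (the chain below is the
coordinate-refreshing walk on $\Bool^{\mathsf{N}}$, an assumption unrelated
to the functions of section~\ref{sec:hamilton}): at each step, pick a
coordinate uniformly at random and replace it with a fresh uniform bit; the
first time $\tau$ at which every coordinate has been refreshed is a strong
stationary time, since $X_{\tau}$ is then exactly uniform regardless of the
value of $\tau$. A minimal simulation of this $\tau$:
\begin{verbatim}
import random

def strong_stationary_time(N):
    # Each step refreshes one uniformly chosen coordinate with a
    # fresh uniform bit (so the walk stays put half the time).
    # tau = first step at which all N coordinates have been
    # refreshed; X_tau is then uniform, independently of tau.
    x, refreshed, t = [0] * N, set(), 0
    while len(refreshed) < N:
        i = random.randrange(N)
        x[i] = random.randint(0, 1)
        refreshed.add(i)
        t += 1
    return t

# Monte Carlo estimate of E[tau]; coupon collecting gives N * H_N.
samples = [strong_stationary_time(8) for _ in range(10000)]
print(sum(samples) / len(samples))
\end{verbatim}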
Practically speaking, for each number $\mathsf{N}$, $3 \le \mathsf{N} \le 16$,
10 functions have been generated according to the method presented in
section~\ref{sec:hamilton}. For each of them, the computation of the
approximation of $E[\ts]$ is executed 10000 times with a random seed.
Figure~\ref{fig:stopping:moy} summarizes these results: each circle represents
the approximation of $E[\ts]$ for a given $\mathsf{N}$.
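The experimental loop can be summarized by the following sketch, where
\texttt{generate\_function} and \texttt{approximate\_expected\_tau} are
hypothetical placeholders (the actual generation method is the one of
section~\ref{sec:hamilton}, and the estimator itself is not shown here):
\begin{verbatim}
import statistics

def run_protocol(generate_function, approximate_expected_tau):
    # generate_function(N) and approximate_expected_tau(f) are
    # hypothetical stand-ins for the construction of the section
    # on Hamiltonian cycles and for the estimator of E[tau_s].
    results = {}
    for N in range(3, 17):                  # 3 <= N <= 16
        estimates = []
        for _ in range(10):                 # 10 functions per N
            f = generate_function(N)
            runs = [approximate_expected_tau(f)
                    for _ in range(10000)]  # 10000 seeded executions
            estimates.append(statistics.mean(runs))
        results[N] = estimates              # one circle per estimate
    return results
\end{verbatim}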