+%and
+% $$t_{\rm mix}(\varepsilon)=\min\{t \mid d(t)\leq \varepsilon\}.$$
+% One can prove that \JFc{Where has this been done?}
+% $$t_{\rm mix}(\varepsilon)\leq \lceil\log_2(\varepsilon^{-1})\rceil t_{\rm mix}(\frac{1}{4})$$
+
+
+
+Let $(X_t)_{t\in \mathbb{N}}$ be a sequence of $\Bool^n$-valued random
+variables. An $\mathbb{N}$-valued random variable $\tau$ is a {\it stopping
+ time} for the sequence $(X_t)$ if for each $t$ there exists $B_t\subseteq
+(\Bool^n)^{t+1}$ such that $\{\tau=t\}=\{(X_0,X_1,\ldots,X_t)\in B_t\}$.
+In other words, the event $\{\tau = t \}$ depends only on the values of
+$(X_0,X_1,\ldots,X_t)$, not on any $X_k$ with $k > t$.
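+A standard example may help here (it is not taken from the text above, but is
+a classical fact):

```latex
For instance, for $A \subseteq \Bool^n$, the hitting time
$\tau_A = \min\{t \mid X_t \in A\}$ is a stopping time: it suffices to take
$B_t = (\Bool^n \setminus A)^{t} \times A$, so that
$\{\tau_A = t\} = \{(X_0,X_1,\ldots,X_t) \in B_t\}$.
By contrast, the time of the \emph{last} visit to $A$ is not a stopping
time, since deciding whether it equals $t$ requires knowing $X_k$ for $k>t$.
```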
+
+
+\JFC{I do not understand the definition of randomized stopping time. Can we expand it?}
+
+Let $(X_t)_{t\in \mathbb{N}}$ be a Markov chain and let
+$X_t = f(X_{t-1},Z_t)$ be a random mapping representation of it, where
+$(Z_t)_{t\in\mathbb{N}}$ is a sequence of i.i.d. random variables and $f$ is
+a deterministic function. A {\it randomized
+ stopping time} for the Markov chain is a stopping time for the sequence
+$(Z_t)_{t\in\mathbb{N}}$: whether $\{\tau = t\}$ occurs depends only on the
+updates $(Z_1,\ldots,Z_t)$ that drive the chain, not on future ones. If the
+Markov chain is irreducible and has $\pi$ as stationary distribution, then a
+{\it stationary time} $\tau$ is a randomized stopping time (possibly
+depending on the starting position $x$), such that the distribution of
+$X_\tau$ is $\pi$:
+$$\P_x(X_\tau=y)=\pi(y).$$
+It is a {\it strong stationary time} if, in addition, $X_\tau$ is
+independent of $\tau$, \textit{i.e.},
+$\P_x(X_\tau=y,\tau=t)=\pi(y)\,\P_x(\tau=t)$.
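+As a concrete illustration (the specific chain is an assumption made for this
+sketch, not fixed by the text above), consider the lazy random-scan walk on
+$\Bool^n$: each update $Z_t=(i_t,b_t)$ picks a uniform coordinate $i_t$ and a
+uniform bit $b_t$, and $f(x,(i,b))$ sets bit $i$ of $x$ to $b$. The first
+time $\tau$ at which every coordinate has been picked depends only on
+$(Z_1,\ldots,Z_t)$, so it is a randomized stopping time; since every bit has
+then been replaced by a fresh uniform bit, $X_\tau$ is exactly uniform, so
+$\tau$ is a stationary time. A quick sanity check in Python:

```python
import random
from itertools import product

random.seed(0)
n, trials = 3, 20000

def f(x, z):
    """Random mapping representation: set coordinate i of x to bit b, z = (i, b)."""
    i, b = z
    y = list(x)
    y[i] = b
    return tuple(y)

counts = {s: 0 for s in product((0, 1), repeat=n)}
for _ in range(trials):
    x = (0,) * n                 # fixed starting position
    seen = set()                 # coordinates refreshed so far
    while len(seen) < n:         # tau = first time all n coordinates were picked
        z = (random.randrange(n), random.randrange(2))
        x = f(x, z)
        seen.add(z[0])
    counts[x] += 1               # x is now X_tau

# X_tau is exactly uniform on Bool^n: each empirical frequency near 1/2^n
for c in counts.values():
    assert abs(c / trials - 1 / 2 ** n) < 0.02
```

+Note that $\tau$ here does not even look at the trajectory $(X_t)$, only at
+which coordinates the updates $Z_t$ touched, which is exactly what makes it a
+stopping time for $(Z_t)$.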
+
+
+\JFC{Where has this been proved?}
+\begin{Theo}
+If $\tau$ is a strong stationary time, then $d(t)\leq \max_{x\in\Bool^n}
+\P_x(\tau > t)$.
+\end{Theo}
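+The bound can be checked numerically (the standard proof conditions on
+$\{\tau \le t\}$, on which strong stationarity makes $X_t$ exactly
+$\pi$-distributed). The sketch below again uses the lazy random-scan walk on
+$\Bool^n$ as an assumed example, with $\tau$ the first time every coordinate
+has been refreshed; it computes $d(t)$ exactly from the transition matrix and
+compares it with the coupon-collector tail $\P_x(\tau>t)$:

```python
from itertools import product
from math import comb

n = 4
states = list(product((0, 1), repeat=n))
idx = {s: k for k, s in enumerate(states)}
N = len(states)

# Lazy random-scan walk: pick a coordinate uniformly, set it to a fresh bit.
# Hence P(x, x) = 1/2 and P(x, x ^ e_i) = 1/(2n); pi is uniform.
P = [[0.0] * N for _ in range(N)]
for x in states:
    a = idx[x]
    P[a][a] += 0.5
    for i in range(n):
        y = list(x)
        y[i] ^= 1
        P[a][idx[tuple(y)]] += 1 / (2 * n)

def d(t):
    """Total-variation distance to uniform after t steps, started at 0...0.

    By symmetry of this walk, the distance is the same from every start,
    so this equals the maximum over x in Bool^n."""
    mu = [0.0] * N
    mu[idx[(0,) * n]] = 1.0
    for _ in range(t):
        mu = [sum(mu[a] * P[a][b] for a in range(N)) for b in range(N)]
    return 0.5 * sum(abs(p - 1 / N) for p in mu)

def tail(t):
    """P(tau > t): some coordinate not yet refreshed (inclusion-exclusion)."""
    return sum((-1) ** (k + 1) * comb(n, k) * ((n - k) / n) ** t
               for k in range(1, n + 1))

# The theorem's inequality d(t) <= max_x P_x(tau > t), checked exactly.
for t in (0, 1, 5, 10, 20, 40):
    assert d(t) <= tail(t) + 1e-12
```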
+
+
+%Let $\Bool^n$ be the set of words of length $n$.
+Let $E=\{(x,y)\mid
+x\in \Bool^n, y\in \Bool^n,\ x=y \text{ or } x\oplus y \in 0^*10^*\}$.
+In other words, $E$ is the set of all the edges of the classical
+$n$-cube, together with a self-loop at each node.
+Let $h$ be a function from $\Bool^n$ into $\llbracket 1, n \rrbracket$.
+Intuitively, $h$ records, for each node
+$x \in \Bool^n$, which edge is removed in the Hamiltonian cycle,
+\textit{i.e.}, which bit in $\llbracket 1, n \rrbracket$
+cannot be switched at $x$.
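+As a small illustration (the reflected Gray code and the convention that
+$h(x)$ labels the cycle edge \emph{leaving} $x$ are assumptions made for this
+sketch, not fixed by the text above), one can build a Hamiltonian cycle on
+the $3$-cube and read off such a function $h$:

```python
n = 3
gray = [i ^ (i >> 1) for i in range(2 ** n)]     # reflected Gray code:
                                                 # a Hamiltonian cycle on Bool^n

# h(x): 1-based index of the bit flipped on the cycle edge leaving x
# (labelling the *outgoing* edge is one possible convention).
h = {}
for k, x in enumerate(gray):
    y = gray[(k + 1) % 2 ** n]                   # successor of x on the cycle
    diff = x ^ y
    assert diff != 0 and diff & (diff - 1) == 0  # exactly one bit flips
    h[x] = diff.bit_length()                     # value in [1, n]

assert sorted(h) == list(range(2 ** n))          # h defined on all of Bool^3
assert set(h.values()) <= set(range(1, n + 1))   # h maps into [1, n]
```

+Each node thus gets exactly one forbidden bit, which corresponds to removing
+one incident edge of the cycle per node.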