This section considers functions $f: \Bool^{\mathsf{N}} \rightarrow \Bool^{\mathsf{N}}$
obtained from a hypercube where a Hamiltonian cycle has been removed,
as described in the previous section.
Notice that the iteration graph is always a subgraph of the
${\mathsf{N}}$-cube augmented with all the self-loops, \textit{i.e.}, all the
edges $(v,v)$ for any $v \in \Bool^{\mathsf{N}}$.
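As a small illustration of this inclusion, the following Python sketch checks,
for a toy function, that every arc of $\Gamma(f)$ is either a self-loop or an
edge of the ${\mathsf{N}}$-cube; the definition of $\Gamma(f)$ used here (an arc
from $x$ to the configuration obtained by replacing bit $i$ of $x$ with
$f_i(x)$, for each $i$) is an assumption recalling the asynchronous iteration
graph, and the chosen $f$ is only an example.
\begin{verbatim}
# Sketch (assumption): Gamma(f) has an arc from x to x with bit i
# replaced by f_i(x), for every i in {1, ..., N}.
from itertools import product

N = 3

def gamma_arcs(f):
    """Arcs of Gamma(f), for f given as a dict mapping each
    configuration x (tuple of bits) to its image f(x)."""
    arcs = set()
    for x in f:
        for i in range(N):
            y = list(x)
            y[i] = f[x][i]          # replace bit i of x by f_i(x)
            arcs.add((x, tuple(y)))
    return arcs

# Toy example: the bitwise negation function on {0,1}^N.
f = {x: tuple(1 - b for b in x) for x in product((0, 1), repeat=N)}

for (x, y) in gamma_arcs(f):
    hamming = sum(a != b for a, b in zip(x, y))
    # each arc is a self-loop (distance 0) or an N-cube edge (distance 1)
    assert hamming <= 1
print("Gamma(f) is a subgraph of the N-cube with self-loops")
\end{verbatim}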
\begin{xpl}
Let us consider for instance
the graph $\Gamma(f)$ defined
in Figure~\ref{fig:iteration:f*} and
the probability function $p$ defined on the set of edges as follows:
$$
p(e) \left\{
A specific random walk in this modified hypercube is first
introduced (see Section~\ref{sub:stop:formal}). We further
study this random walk in a theoretical way to
provide an upper bound on the time needed to obtain fair sequences
(see Section~\ref{sub:stop:bound}).
We finally complete this study with experimental
results that reduce this bound (see Section~\ref{sub:stop:exp}).
For a general reference on Markov chains,
see~\cite{LevinPeresWilmer2006},
and particularly Chapter~5 on stopping times.
%% \textit{i.e.}, is the time until the matrix $X$ of a Markov chain
%% is $\epsilon$-close to a stationary distribution.
Intuitively speaking, $t_{\rm mix}(\varepsilon)$ is the number of steps required
to guarantee that the chain is $\varepsilon$-close to the stationary distribution,
whatever its starting point.
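Formally, following~\cite{LevinPeresWilmer2006}, if $\pi$ denotes the
stationary distribution and $P^t(X,\cdot)$ the distribution of the chain after
$t$ steps when started at $X$, then
$$
t_{\rm mix}(\varepsilon) = \min\left\{ t \;\middle|\;
\max_{X \in \Bool^{\mathsf{N}}} \left\| P^t(X,\cdot) - \pi \right\|_{TV}
\leq \varepsilon \right\},
$$
where $\|\cdot\|_{TV}$ denotes the total variation distance.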
${\mathsf{N}}$-cube.
Let $h$ be a function from $\Bool^{\mathsf{N}}$ into $\llbracket 1, {\mathsf{N}} \rrbracket$.
Intuitively speaking, $h$ aims at memorizing, for each node
$X \in \Bool^{\mathsf{N}}$, which edge is removed in the Hamiltonian cycle,
\textit{i.e.}, which bit in $\llbracket 1, {\mathsf{N}} \rrbracket$
cannot be switched.
We denote by $\ov{h} : \Bool^{\mathsf{N}} \rightarrow \Bool^{\mathsf{N}}$ the function
such that for any $X \in \Bool^{\mathsf{N}} $,
$(X,\ov{h}(X)) \in E$ and $X\oplus\ov{h}(X)=0^{{\mathsf{N}}-h(X)}10^{h(X)-1}$.
The function $\ov{h}$ is said to be {\it square-free} if for every $X\in \Bool^{\mathsf{N}}$,
$\ov{h}(\ov{h}(X))\neq X$.
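As a toy illustration, the following Python sketch implements $\ov{h}$ as the
switch of bit $h(X)$ (with bit $1$ being the rightmost one) and tests the
square-free property; the table defining $h$ below is only a hypothetical
two-bit example and is not claimed to come from the construction of the
previous section.
\begin{verbatim}
# Sketch: ov_h(x) switches bit h(x) of x, which matches
# x xor ov_h(x) = 0^(N-h(x)) 1 0^(h(x)-1), bit 1 being the rightmost bit.
N = 2

# Hypothetical example of h for N = 2 (one forbidden direction per node).
h = {0b00: 1, 0b01: 2, 0b11: 1, 0b10: 2}

def ov_h(x):
    return x ^ (1 << (h[x] - 1))

# ov_h is square-free iff applying it twice never comes back to x.
print(all(ov_h(ov_h(x)) != x for x in range(2 ** N)))   # True here
\end{verbatim}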
\begin{lmm}\label{lm:h}
An integer $\ell\in \llbracket 1,{\mathsf{N}} \rrbracket$ is said to be {\it fair}
at time $t$ if there
exists $0\leq j <t$ such that $Z_{j+1}=(\ell,\cdot)$ and $h(X_j)\neq \ell$.
In other words, there exists a date $j$ before $t$ where
the first element of the random variable $Z$ is exactly $\ell$
(\textit{i.e.}, $\ell$ is the strategy at date $j$)
and where the configuration $X_j$ allows the edge $\ell$ to be crossed.
Let $\ts$ be the first time all the elements of $\llbracket 1, {\mathsf{N}} \rrbracket$
are fair. The integer $\ts$ is a randomized stopping time for this random walk.
\end{lmm}
\begin{proof}
For every $X$ and every $\ell$, one has $\P(S_{X,\ell}\leq 2)\geq
\frac{1}{4{\mathsf{N}}^2}$.
Let $X_0= X$.
Assume that the last unfair bit is $\ell$. One has
$\ts=\ts^\prime+S_{X_\tau,\ell}$, and therefore $E[\ts] =
E[\ts^\prime]+E[S_{X_\tau,\ell}]$. Theorem~\ref{prop:stop} then follows
directly from Lemmas~\ref{prop:lambda} and~\ref{lm:stopprime}.
\end{proof}
Now, using the Markov inequality, one has $\P_X(\tau > t)\leq \frac{E[\tau]}{t}$.
With $t_n=32{\mathsf{N}}^2+16{\mathsf{N}}\ln ({\mathsf{N}}+1)$, one obtains $\P_X(\tau > t_n)\leq \frac{1}{4}$.
Therefore, using the definition of $t_{\rm mix}$ and
Theorem~\ref{thm-sst}, it follows that
$t_{\rm mix}(\frac{1}{4})\leq 32{\mathsf{N}}^2+16{\mathsf{N}}\ln ({\mathsf{N}}+1)=O({\mathsf{N}}^2)$.
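Spelling out this Markov step, and assuming that the bound provided by
Theorem~\ref{prop:stop} is $E[\tau]\leq 8{\mathsf{N}}^2+4{\mathsf{N}}\ln({\mathsf{N}}+1)$
(so that $t_n$ is four times this bound), one gets
$$
\P_X(\tau > t_n)\leq \frac{E[\tau]}{t_n}
\leq \frac{8{\mathsf{N}}^2+4{\mathsf{N}}\ln({\mathsf{N}}+1)}
          {32{\mathsf{N}}^2+16{\mathsf{N}}\ln({\mathsf{N}}+1)}
= \frac{1}{4}.
$$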
Notice that the computation of the stationary time upper bound is obtained
are more regular and more binding than this constraint.
Moreover, the bound
is obtained using the coarse Markov inequality. For the
classical (lazy) random walk on the $\mathsf{N}$-cube, without removing any
Hamiltonian cycle, the mixing time is in $\Theta({\mathsf{N}}\ln {\mathsf{N}})$.
We conjecture that in our context, the mixing time is also in
$\Theta({\mathsf{N}}\ln {\mathsf{N}})$.
In this latter context, we claim that the upper bound for the stopping time
should be reduced. This fact is studied in the next section.
\subsection{Practical Evaluation of Stopping Times}\label{sub:stop:exp}
Let a function $f: \Bool^{\mathsf{N}} \rightarrow \Bool^{\mathsf{N}}$
and an initial seed $x^0$ be given.
The pseudo code given in Algorithm~\ref{algo:stop} returns the smallest
number of iterations such that all elements $\ell\in \llbracket 1,{\mathsf{N}} \rrbracket$ are fair. An approximation of $E[\ts]$ can then be deduced
by calling this code many times, with many instances of functions and many
seeds; a Python sketch of this estimation loop is given after the algorithm.
}
\Return{$\textit{nbit}$}\;
%\end{scriptsize}
\caption{Pseudo code of the stopping time computation}
\label{algo:stop}
\end{algorithm}
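The following Python sketch illustrates this estimation loop. The update of
the configuration (a lazy move in the drawn direction $\ell$, forbidden when
$\ell = h(X)$) and the signature of the parameter \texttt{h} are assumptions
made for the sake of illustration; only the fairness bookkeeping, which follows
the definition given above, determines the returned value.
\begin{verbatim}
import random

def stopping_time(N, h, x0):
    """Iterate the walk from seed x0 and return the first time at which
    every direction l in {1, ..., N} is fair, i.e. has been drawn at a
    step where the current configuration allowed crossing the edge l."""
    x = x0
    fair = set()
    nbit = 0
    while len(fair) < N:
        l = random.randint(1, N)      # strategy drawn uniformly
        b = random.randint(0, 1)      # lazy bit
        if h(x) != l:                 # edge l can be crossed from x
            fair.add(l)
            if b == 1:
                x ^= 1 << (l - 1)     # switch bit l of the configuration
        nbit += 1
    return nbit

def estimate_expected_stopping_time(N, h, runs=10000):
    """Average the stopping time over many random seeds."""
    return sum(stopping_time(N, h, random.getrandbits(N))
               for _ in range(runs)) / runs

# Hypothetical usage with the two-bit example of h given earlier.
h2 = {0b00: 1, 0b01: 2, 0b11: 1, 0b10: 2}
print(estimate_expected_stopping_time(2, lambda x: h2[x]))
\end{verbatim}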
Practically speaking, for each number $\mathsf{N}$, $ 3 \le \mathsf{N} \le 16$,
10 functions have been generated according to the method presented in Section~\ref{sec:hamilton}. For each of them, the computation of the approximation of $E[\ts]$
is executed 10000 times with a random seed. Figure~\ref{fig:stopping:moy}
summarizes these results. A circle represents the
approximation of $E[\ts]$ for a given $\mathsf{N}$.
The line is the graph of the function $x \mapsto 2x\ln(2x+8)$.
It can first
be observed that the approximation is much
smaller than the upper bound given in Theorem~\ref{prop:stop}.
It can be further deduced that the conjecture of the previous section
is consistent with the graph of $x \mapsto 2x\ln(2x+8)$.
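As a rough numerical illustration of this gap, and assuming as above that the
bound of Theorem~\ref{prop:stop} is $8{\mathsf{N}}^2+4{\mathsf{N}}\ln({\mathsf{N}}+1)$,
the two quantities can be evaluated at the largest experimented size,
$\mathsf{N}=16$:
\begin{verbatim}
from math import log

N = 16
fit   = 2 * N * log(2 * N + 8)            # fitted curve 2x ln(2x + 8)
bound = 8 * N**2 + 4 * N * log(N + 1)     # assumed bound of Theorem prop:stop
print(round(fit), round(bound))           # roughly 118 versus 2229
\end{verbatim}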