From: Sylvain C-V
Date: Mon, 27 Jun 2016 19:02:31 +0000 (+0200)
Subject: Merge changes
X-Git-Url: https://bilbo.iut-bm.univ-fcomte.fr/and/gitweb/16dcc.git/commitdiff_plain/c069ead2bda7032833d2d7f3d0851906ec4d22f0?hp=-c

Merge changes
---

c069ead2bda7032833d2d7f3d0851906ec4d22f0
diff --combined main.pdf
index c5051f5,49fc144..a73e383
Binary files differ
diff --combined stopping.tex
index 7514341,aa13c9a..a6b821a
--- a/stopping.tex
+++ b/stopping.tex
@@@ -33,7 -33,7 +33,7 @@@ P=\dfrac{1}{6} \left
  0&0&0&0&1&0&4&1 \\
  0&0&0&1&0&1&0&4
  \end{array}
- \right)
+ \right).
  \]
  \end{xpl}
@@@ -65,14 -65,17 +65,18 @@@ distribution induced by the $X$-th row
  $P$ has a stationary distribution $\pi$, then we define
  $$d(t)=\max_{X\in\Bool^{\mathsf{N}}}\tv{P^t(X,\cdot)-\pi}.$$
 +\ANNOT{inconsistent notation for $X$: an integer or an element of $B^N$?}
  and
  $$t_{\rm mix}(\varepsilon)=\min\{t \mid d(t)\leq \varepsilon\}.$$
- Intuitively speaking, $t_{\rm mix}$ is a mixing time
- \textit{i.e.}, is the time until the matrix $X$ \ANNOT{shouldn't it be $P$?} of a Markov chain
- is $\epsilon$-close to a stationary distribution.
+ %% Intuitively speaking, $t_{\rm mix}$ is a mixing time
+ %% \textit{i.e.}, is the time until the matrix $X$ of a Markov chain
+ %% is $\epsilon$-close to a stationary distribution.
+
+ Intuitively speaking, $t_{\rm mix}(\varepsilon)$ is the number of steps required
 -to be sure to be $\varepsilon$-close to the staionary distribution, wherever
++to guarantee being $\varepsilon$-close to the stationary distribution, wherever
+ the chain starts.
@@@ -114,9 -117,9 +118,8 @@@ $$\P_X(X_\tau=Y)=\pi(Y).$
  \subsection{Upper bound of Stopping Time}\label{sub:stop:bound}
--
- A stopping time $\tau$ is a \emph{strong stationary time} if $X_{\tau}$ is
- independent of $\tau$.
+ A stopping time $\tau$ is a \emph{strong stationary time} if $X_{\tau}$ is
+ independent of $\tau$.
  The following result will be useful~\cite[Proposition~6.10]{LevinPeresWilmer2006}.
  \begin{thrm}\label{thm-sst}
@@@ -232,7 -235,8 +235,8 @@@ This probability is independent of the
  Moving forward in the chain, at each step the $\ell$-th bit is switched from $0$ to $1$ or from $1$ to $0$, each with the same probability. Therefore, for $t\geq \tau_\ell$, the
- $\ell$-th bit of $X_t$ is $0$ or $1$ with the same probability, proving the
+ $\ell$-th bit of $X_t$ is $0$ or $1$ with the same probability, and
+ independently of the values of the other bits, proving the
  lemma.\end{proof}
  \begin{thrm} \label{prop:stop}
@@@ -346,7 -350,7 +350,7 @@@ direct application of lemma~\ref{prop:l
  \end{proof}
  Now, using Markov's inequality, one has $\P_X(\tau > t)\leq \frac{E[\tau]}{t}$.
- With $t=32N^2+16N\ln (N+1)$, one obtains: $\P_X(\tau > t)\leq \frac{1}{4}$.
+ With $t_n=32N^2+16N\ln (N+1)$, one obtains $\P_X(\tau > t_n)\leq \frac{1}{4}$.
  Therefore, using the definition of $t_{\rm mix}$ and Theorem~\ref{thm-sst}, it follows that
  $t_{\rm mix}\leq 32N^2+16N\ln (N+1)=O(N^2)$.
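% A hedged reconstruction of the Markov-inequality step above (our reading;
% the excerpt does not show the expectation bound explicitly). The choice of
% $t_n$ makes the ratio exactly $1/4$, which assumes the preceding lemma
% yields $E[\tau]\leq 8N^2+4N\ln(N+1)$:
% \[
% \P_X(\tau > t_n) \;\leq\; \frac{E[\tau]}{t_n}
%   \;\leq\; \frac{8N^2+4N\ln(N+1)}{32N^2+16N\ln(N+1)} \;=\; \frac{1}{4},
%   \qquad t_n = 32N^2+16N\ln(N+1).
% \]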
@@@ -355,11 -359,11 +359,11 @@@
  Notice that the upper bound on the stationary time is obtained
  under the following constraint: for each vertex in the $\mathsf{N}$-cube
  one ingoing arc and one outgoing arc are removed.
- The calculus does not consider (balanced) Hamiltonian cycles, which
+ The computation does not consider (balanced) Hamiltonian cycles, which
  are more regular and more binding than this constraint.
  Moreover, the bound
- is obtained using Markov Inequality which is frequently coarse. For the
- classical random walkin the $\mathsf{N}$-cube, without removing any
+ is obtained using Markov's inequality, which is often coarse. For the
+ classical (lazy) random walk on the $\mathsf{N}$-cube, without removing any
  Hamiltonian cycle, the mixing time is in $\Theta(N\ln N)$. We conjecture
  that in our context, the mixing time is also in $\Theta(N\ln N)$.
@@@ -402,7 -406,7 +406,7 @@@ $\textit{fair}\leftarrow\emptyset$\
  \end{algorithm}
  Practically speaking, for each $\mathsf{N}$, $3 \le \mathsf{N} \le 16$,
 -10 functions have been generaed according to method presented in section~\ref{sec:hamilton}. For each of them, the calculus of the approximation of $E[\ts]$
 +10 functions have been generated according to the method presented in section~\ref{sec:hamilton}. For each of them, the computation of the approximation of $E[\ts]$
  is executed 10000 times with a random seed. Figure~\ref{fig:stopping:moy} summarizes these results.
  In this figure, a circle represents the approximation of $E[\ts]$ for a given
  $\mathsf{N}$.
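To make the estimation loop concrete, here is a minimal Monte Carlo sketch
(our illustration, not the authors' code; the function name and parameters
are ours). It estimates $E[\ts]$ for the plain lazy random walk on the
$\mathsf{N}$-cube by tracking the set of fair coordinates, mirroring the
$\textit{fair}\leftarrow\emptyset$ initialization visible in the algorithm
above. The removal of a balanced Hamiltonian cycle, which the generated
functions of section~\ref{sec:hamilton} implement, is deliberately not
modelled here, so the sketch reproduces the coupon-collector regime
$N H_N = \Theta(N \ln N)$ that the conjecture above refers to.

    import random

    def estimate_expected_tau(n, runs=10000):
        # Monte Carlo estimate of E[tau], where tau is the first step at
        # which every coordinate of the lazy walk on the n-cube has been
        # redrawn at least once (a redrawn bit is uniform, hence "fair").
        # Simplification: no Hamiltonian cycle is removed, unlike in the
        # chain studied in the paper.
        total = 0
        for _ in range(runs):
            fair = set()                       # coordinates made fair so far
            steps = 0
            while len(fair) < n:
                steps += 1
                fair.add(random.randrange(n))  # coordinate updated this step
            total += steps
        return total / runs

    if __name__ == "__main__":
        for n in range(3, 17):                 # same range as the experiments
            print(n, estimate_expected_tau(n))

Under this simplification the printed values grow like $N H_N \approx N \ln N$,
which is the behaviour the experiments above probe for the constrained chain.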