From: guyeux
Date: Mon, 10 Oct 2011 11:18:31 +0000 (+0200)
Subject: Proofreading
X-Git-Url: https://bilbo.iut-bm.univ-fcomte.fr/and/gitweb/chaos1.git/commitdiff_plain/486652fa99fdbcf0992d32f910b729e648e33644

Proofreading
---

diff --git a/main.tex b/main.tex
index 82e99bf..016243a 100644
--- a/main.tex
+++ b/main.tex
@@ -154,7 +154,7 @@ iterations and a class of globally recurrent MLP. The
 second contribution is a study of the converse problem: indeed, we
 investigate the ability of classical multilayer perceptrons to learn a
 particular family of discrete chaotic dynamical systems. This family is defined
-by a Boolean vector, an update function, and a sequence defining which
+by a Boolean vector, an update function, and a sequence defining the
 component to update at each iteration. It has been previously
 established that such dynamical systems are chaotically iterated (as
 it is defined by Devaney) when the chosen function has a strongly
@@ -234,14 +234,15 @@ point dependence on initial conditions.
 neighborhood of its future evolution eventually overlap with any other
 given region. This property implies that a dynamical system cannot be
 broken into simpler subsystems. Intuitively, its complexity does not
-allow any simplification. On the contrary, a dense set of periodic
-points is an element of regularity that a chaotic dynamical system has
-to exhibit.
+allow any simplification.
 However, chaos needs some regularity to ``counteract'' the effects of
-transitivity.
+transitivity. %
 Thus, two points close to each other can behave in a completely different manner, the first one visiting the whole space whereas the second one has a regular orbit.
+In Devaney's formulation, a dense set of periodic
+points is the element of regularity that a chaotic dynamical system has
+to exhibit.
 %\begin{definition} \label{def3}
-We recall that a point $x$ is {\bf periodic point} for $f$ of
+We recall that a point $x$ is a {\bf periodic point} for $f$ of
 period~$n \in \mathds{N}^{\ast}$ if $f^{n}(x)=x$.
 %\end{definition}
 Then, the map
@@ -261,12 +262,11 @@ Let us recall that $f$ has {\bf sensitive dependence on initial
 $n > 0$ such that $d\left(f^{n}(x), f^{n}(y)\right) >\delta $. The
 value $\delta$ is called the {\bf constant of sensitivity} of $f$.
-Finally, The dynamical system that iterates $f$ is {\bf chaotic
+Finally, the dynamical system that iterates $f$ is {\bf chaotic
 according to Devaney} on $(\mathcal{X},\tau)$ if $f$ is regular,
 topologically transitive, and has sensitive dependence on its initial
 conditions. In what follows, iterations are said to be chaotic
-according Devaney when corresponding dynamical system is chaotic
-according Devaney.
+(according to Devaney) when the corresponding dynamical system is chaotic in the sense of Devaney's formulation.
 %Let us notice that for a metric space the last condition follows from
 %the two first ones~\cite{Banks92}.
@@ -311,15 +311,15 @@ $i$ from $x$ to $N(i,x)$ if and only if $f_i(x)$ is $N(i,x)$.
 In the sequel, the {\it strategy} $S=(S^{t})^{t \in \Nats}$ is the
 sequence defining which component to update at time $t$ and $S^{t}$
 denotes its $t$-th term. This iteration scheme that only modifies one
-element at each iteration is classically referred as {\it asynchronous
+element at each iteration is usually referred to as {\it asynchronous
 iterations}.
 More precisely, we have for any $i$, $1\le i \le n$,
 \begin{equation}
 \left\{ \begin{array}{l}
 x^{0} \in \Bool^n \\
 x^{t+1}_i = \left\{
 \begin{array}{l}
-  f_i(x^t) \textrm{ if $S^t = i$} \\
-  x_i^t \textrm{ otherwise}
+  f_i(x^t) \textrm{ if $S^t = i$}\enspace , \\
+  x_i^t \textrm{ otherwise}\enspace .
 \end{array}
 \right.
 \end{array}
 \right.
@@ -373,15 +373,15 @@ X^{k+1}& = & G_{f}(X^{k})\\
 \right.
 \label{eq:Gf}
 \end{equation}
-where $\sigma$ is the function that removes the first term of the
-strategy ({\it i.e.},~$S^0$). This definition allows to links
+where $\sigma$ is the so-called shift function that removes the first term of the
+strategy ({\it i.e.},~$S^0$). This definition allows us to link
 asynchronous iterations with classical iterations of a dynamical
 system. Note that it can be extended by considering subsets for $S^t$.
 To study topological properties of these iterations, we are then left
 to introduce a {\bf distance} $d$ between two points $(S,x)$ and
-$(\check{S},\check{x})\in \mathcal{X} = \llbracket1;n\rrbracket^\Nats
-\times \Bool^{n}$. It is defined by
+$(\check{S},\check{x})$ in $\mathcal{X} = \llbracket1;n\rrbracket^\Nats
+\times \Bool^{n}$. Let $\Delta(x,y) = 0$ if $x=y$, and $\Delta(x,y) = 1$ otherwise, be a distance on $\mathds{B}$. The distance $d$ is defined by
 \begin{equation}
 d((S,x);(\check{S},\check{x}))=d_{e}(x,\check{x})+d_{s}(S,\check{S})
 \enspace ,
@@ -406,10 +406,10 @@ these requirements: on the one hand its floor value reflects the
 difference between the cells, on the other hand its fractional part
 measures the difference between the strategies.
-The relation between $\Gamma(f)$ and $G_f$ is clear: there exists a
+The relation between $\Gamma(f)$ and $G_f$ is obvious: there exists a
 path from $x$ to $x'$ in $\Gamma(f)$ if and only if there exists a
 strategy $s$ such that iterations of $G_f$ from the initial point
-$(s,x)$ reaches the configuration $x'$. Using this link,
+$(s,x)$ reach the configuration $x'$. Using this link,
 Guyeux~\cite{GuyeuxThese10} has proven that,
 \begin{theorem}%[Characterization of $\mathcal{C}$]
 \label{Th:Caracterisation des IC chaotiques}
@@ -424,7 +424,7 @@ case, iterations of the function $G_f$ as defined in
 Eq.~(\ref{eq:Gf}) are chaotic according to Devaney.
-Let us then define two function $f_0$ and $f_1$ both in
+Let us then define two functions $f_0$ and $f_1$ both in
 $\Bool^n\to\Bool^n$ that are used throughout this paper. The former is
 the vectorial negation, \textit{i.e.},
 $f_{0}(x_{1},\dots,x_{n}) =(\overline{x_{1}},\dots,\overline{x_{n}})$. The latter is
@@ -460,7 +460,7 @@ we obtain a global recurrent neural network that behaves as follows
 \item When the network is activated at the $t^{th}$ iteration, the
   state of the system $x^t \in \mathds{B}^n$ received from the output
   layer and the initial term of the sequence $(S^t)^{t \in \Nats}$
-  ($S^0 \in \llbracket 1;n\rrbracket$) are used to compute the new
+  (\textit{i.e.}, $S^0 \in \llbracket 1;n\rrbracket$) are used to compute the new
   output vector.
 This new vector, which represents the new state of the dynamical
 system, satisfies:
 \begin{equation}
@@ -478,7 +478,6 @@ we obtain a global recurrent neural network that behaves as follows
 The behavior of the neural network is such that when the initial
 state is $x^0~\in~\mathds{B}^n$ and a sequence $(S^t)^{t \in \Nats}$
 is given as outside input,
-\JFC{en dire davantage sur l'outside world} %% TO BE UPDATED
 then the sequence of successive published
 output vectors $\left(x^t\right)^{t \in \mathds{N}^{\ast}}$ is exactly
 the one produced by the chaotic iterations formally described in
@@ -488,7 +487,7 @@ $\left(x^t\right)^{t \in \mathds{N}^{\ast}}$,
 and therefore that they are equivalent
 reformulations of the iterations of $G_{f_0}$ in $\mathcal{X}$.
 Finally, since the proposed neural network is built to model the
 behavior of $G_{f_0}$, whose iterations are
-  chaotic according to
+  chaotic according to Devaney's definition of
 chaos, we can conclude that the network is also chaotic in this
 sense.
@@ -947,7 +946,7 @@ scheme, the strategies cannot be predicted.
 Let us now compare the two coding schemes. Firstly, the second scheme
 disturbs the learning process. In fact, in this scheme the
 configuration is always expressed as a natural number, whereas in the
-first one the number of inputs follows the increase of the boolean
+first one the number of inputs follows the increase of the Boolean
 vectors coding configurations. In this latter case, the coding gives
 finer information on configuration evolution.
 \JFC{I did not understand the previous paragraph. It should be reworked.}
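
As a complement to the hunks above, the asynchronous update rule and the map $G_{f_0}$ can be illustrated with a short Python sketch: component $S^t$ of the current configuration is replaced by the corresponding component of $f(x^t)$, every other component is kept unchanged, and the shift $\sigma$ drops the consumed strategy term. The names (f0, async_step, G) are illustrative only and do not appear in main.tex.

def f0(x):
    """Vectorial negation f_0: flips every component of the Boolean vector x."""
    return tuple(not xi for xi in x)

def async_step(f, x, i):
    """Asynchronous update: only component i (1-indexed, as in the paper)
    is replaced by f_i(x); every other component keeps its value."""
    fx = f(x)
    return tuple(fx[j] if j == i - 1 else x[j] for j in range(len(x)))

def G(f, S, x):
    """One iteration of G_f on a point (S, x): the strategy is shifted
    (sigma removes S^0) and component S^0 of x is updated."""
    return S[1:], async_step(f, x, S[0])

# Example with n = 3, x^0 = (1, 0, 1) and the strategy prefix 1, 3, 2, 1
# (the formal strategy is an infinite sequence; a finite prefix suffices here).
x = (True, False, True)
S = [1, 3, 2, 1]
for t in range(4):
    S, x = G(f0, S, x)
    print(f"x^{t + 1} =", tuple(int(b) for b in x))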
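The distance $d = d_{e} + d_{s}$ is only partially visible in this diff: $\Delta$ is defined, and the text states that the floor of $d$ reflects the difference between the cells while its fractional part measures the difference between the strategies, but the exact formulas of $d_{e}$ and $d_{s}$ lie outside the shown hunks. The sketch below therefore assumes one common choice meeting these requirements, namely $d_{e}$ as the sum of $\Delta$ over the $n$ cells and $d_{s}$ as a weighted sum bounded by $1$; the weights $9/(n\,10^{t+1})$ are an assumption, not a quotation from main.tex.

def delta(a, b):
    """Discrete distance on B: 0 if the two bits agree, 1 otherwise."""
    return 0 if a == b else 1

def d_e(x, y):
    """Assumed distance between configurations: number of differing cells
    (an integer, so it ends up in the floor part of d)."""
    return sum(delta(xi, yi) for xi, yi in zip(x, y))

def d_s(S, T, n, terms=50):
    """Assumed distance between strategies, truncated to `terms` terms.
    The weights keep the value below 1, so it lands in the fractional part of d."""
    return (9 / n) * sum(abs(S[t] - T[t]) / 10 ** (t + 1)
                         for t in range(min(terms, len(S), len(T))))

def d(point1, point2, n):
    """Distance on X between (S, x) and (S_check, x_check)."""
    S, x = point1
    T, y = point2
    return d_e(x, y) + d_s(S, T, n)

n = 3
p = ([1, 2, 3, 1], (1, 0, 1))
q = ([1, 3, 3, 1], (1, 1, 1))
print(d(p, q, n))   # ~1.03: floor 1 = one differing cell, fraction = strategy gap at t = 1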
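Finally, the last hunk contrasts two ways of presenting a configuration to the multilayer perceptron: one input per Boolean component, or a single natural number. The toy encodings below only illustrate that contrast; the actual input layout used in the experiments is not part of this diff.

def code_as_boolean_vector(x):
    """First scheme: one network input per component, so the number of
    inputs grows with n and every cell is seen separately."""
    return [int(b) for b in x]

def code_as_natural_number(x):
    """Second scheme: the whole configuration is collapsed into a single
    natural number presented on one input."""
    return sum(int(b) << i for i, b in enumerate(x))

x = (True, False, True, True)          # a configuration in B^4
print(code_as_boolean_vector(x))       # [1, 0, 1, 1]  -> 4 inputs
print(code_as_natural_number(x))       # 13            -> 1 input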