-When $n$ is 1, $\textit{Ky}_{xy}''$ is equal to the kernel
-$\textit{Kc}_{xy}''$:
-the average of the vertical variations of the horizontal variations is
-\[
-\begin{array}{l}
-\dfrac{1}{4}
-\left[
-\left((P(0,1)-P(0,0))-(P(1,1)-P(1,0))\right)+ \right.\\
-\quad \left((P(-1,1)-P(-1,0))-(P(0,1)-P(0,0))\right)+\\
-\quad \left((P(0,0)-P(0,-1))-(P(1,0)-P(1,-1))\right)+\\
-\quad \left. \left((P(-1,0)-P(-1,-1))-(P(0,0)-P(0,-1))\right)
-\right] \\
-=\dfrac{1}{4}
-\left[ P(1,-1) -P(1,1) - P(-1,-1) + P(-1,1)\right].
-\end{array}
-\]
-which is $\textit{Ky}_{xy}''$.
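As a quick numerical sanity check of the simplification above (a sketch only; the values of $P$ are arbitrary random samples, since the identity is purely algebraic), the average of the four bracketed differences collapses to the four corner terms:

```python
import random

# Hypothetical sample values P(x, y) on the 3x3 neighbourhood; the identity
# holds for any values, so random ones suffice for the check.
P = {(x, y): random.random() for x in (-1, 0, 1) for y in (-1, 0, 1)}

# Average of the four "vertical variations of horizontal variations"
# (the four bracketed terms of the displayed equation).
avg = 0.25 * (
    ((P[0, 1] - P[0, 0]) - (P[1, 1] - P[1, 0]))
    + ((P[-1, 1] - P[-1, 0]) - (P[0, 1] - P[0, 0]))
    + ((P[0, 0] - P[0, -1]) - (P[1, 0] - P[1, -1]))
    + ((P[-1, 0] - P[-1, -1]) - (P[0, 0] - P[0, -1]))
)

# Simplified closed form: every interior term cancels, only corners remain.
simplified = 0.25 * (P[1, -1] - P[1, 1] - P[-1, -1] + P[-1, 1])

assert abs(avg - simplified) < 1e-12
```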
-
-Let us now consider any number $n$, $1 \le n \le N$.
-Let us first investigate the vertical variations related to
-the horizontal vector $\vec{P_{0,0}P_{0,1}}$
-(respectively $\vec{P_{0,-1}P_{0,0}}$)
-of length 1 that starts from (resp. that points to) $(0,0)$.
-As with the case $n=1$, there are 2 new vectors of
-length 1, namely
-$\vec{P_{n,0}P_{n,1}}$ and $\vec{P_{-n,0}P_{-n,1}}$
-(resp.
-$\vec{P_{n,-1}P_{n,0}}$, and $\vec{P_{-n,-1}P_{-n,0}}$)
-that are vertically aligned with $\vec{P_{0,0}P_{0,1}}$
-(resp. with $\vec{P_{0,-1}P_{0,0}}$).
-
-The vertical distance between the aligned vectors is now equal to $n$.
-Following the case $n=1$ to compute the average variation,
-the coefficients of the first and last lines around the central
-vertical line are thus, from left to right:
-$\dfrac{1}{4n}$,
-$\dfrac{-1}{4n}$,
-$\dfrac{-1}{4n}$, and
-$\dfrac{1}{4n}$.
-
-The cases of the vectors $\vec{P_{0,0}P_{0,2}}$, \ldots,
-$\vec{P_{0,0}P_{0,n}}$ are similar; they respectively lead to the coefficients
-$-\dfrac{1}{4 \times 2n}$, \ldots,
-$-\dfrac{1}{4 \times n \cdot n}$, and the proof is omitted.
-Finally, let us consider the vector $\vec{P_{0,0}P_{0,n}}$
-and its vertical variations when $\delta y$ is $n-1$.
-As in the case where $n=1$, we thus obtain the coefficients
-$\dfrac{1}{4 \times (n-1)n}$ and
-$-\dfrac{1}{4 \times (n-1)n}$
-(resp. $-\dfrac{1}{4 \times (n-1)n}$ and
-$\dfrac{1}{4 \times (n-1)n}$)
-in the second line (resp. in the
-penultimate line), since the vector has length $n$
-and $\delta y$ is $n-1$.
-Coefficients in the other lines are obtained similarly, and the proof is thus omitted.
-
-We are then left to compute an approximation of the partial second order derivatives
-$\dfrac{\partial^2 P}{\partial x^2}$, $\dfrac{\partial^2 P}{\partial y^2}$, and $\dfrac{\partial^2 P}{\partial x \partial y}$
-with the kernels,
-$\textit{Ky}_{x^2}''$, $\textit{Ky}_{y^2}''$, and $\textit{Ky}_{xy}''$ respectively.
-However, the size of these kernels varies from $3\times3$ to $(2N+1)\times (2N+1)$.
-Let us explain the approach for the first partial derivative;
-the others can be deduced immediately.
-
-Since the objective is to detect large variations, the second-order derivative is approximated as
-the maximum of these approximations. More formally,
-let $n$, $1 \le n \le N$, be an integer and let
-$\dfrac{\partial^2 P}{\partial x^2}_n$ be the result of applying the kernel
-$\textit{Ky}_{x^2}''$ of size $(2n+1)\times (2n+1)$. The derivative
-$\dfrac{\partial^2 P}{\partial x^2}$ is defined by
+
+
+Indeed, when $n$ is 1, $\textit{Ky}_{xy}''$ is recovered by computing the average
+of the horizontal variations of the vertical component of the gradient computed with
+$\textit{Ky}_{y}'$. For this value of $n$, we have
+$\textit{Ky}_{xy}'' = \textit{Kc}_{xy}''$.
+For each $n$, $1 < n \le N$, $\textit{Ky}_{xy}''$ is recovered in the same
+way, that is, by averaging variations.
+A proof of the construction can be found in the article~\cite{ccfg16:ip}.
+
+
+
+The objective is to detect large variations of the first-order derivatives.
+The second-order derivatives are therefore approximated as the maxima of the
+Hessian matrices obtained as $n$ ranges from $1$ to $N$.
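The multi-scale scheme can be sketched as follows. This is a minimal illustration, assuming a family `kernels` holding one kernel per size $n$; the $3\times3$ central-difference kernel used in the usage example is a standard second-derivative stencil chosen for illustration, not necessarily the paper's $\textit{Ky}_{x^2}''$ coefficients.

```python
def correlate(P, K, x, y):
    """Apply the (2n+1)x(2n+1) kernel K centred at pixel (x, y) of image P."""
    n = len(K) // 2
    return sum(K[i + n][j + n] * P[x + i][y + j]
               for i in range(-n, n + 1) for j in range(-n, n + 1))

def second_derivative(P, kernels, x, y):
    """Approximate d2P/dx2 at (x, y) as the maximum over all kernel sizes."""
    return max(correlate(P, K, x, y) for K in kernels)

# Illustrative usage: P(x, y) = x^2 sampled on a 5x5 grid; the standard
# 3x3 central-difference stencil (an assumption, not the paper's kernel)
# recovers d2P/dx2 = 2 at the centre pixel.
P = [[x * x for _y in range(5)] for x in range(5)]
K1 = [[0, 1, 0], [0, -2, 0], [0, 1, 0]]
print(second_derivative(P, [K1], 2, 2))  # → 2
```

With several kernel sizes in `kernels`, the pointwise maximum keeps the strongest response across scales, which matches the stated goal of detecting large variations.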
+
+The partial derivative $\dfrac{\partial^2 P}{\partial x^2}$ is defined by