Let
$x=(x_1,\ldots,x_n)$ be the $n$-bit cover vector of the image $X$,
$m$ be the message to embed, and
$y=(y_1,\ldots,y_n)$ be the $n$-bit stego vector such that
$Hy$ equals $m$ for a given binary matrix $H$.
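To make these notations concrete, the following brute-force sketch (ours, not the STC algorithm itself; the function name \texttt{embed\_bruteforce} and the NumPy representation of $H$, $x$, $m$, and $\rho$ are our choices) enumerates every candidate $y$ satisfying $m=Hy$ (modulo 2) and keeps the one of minimal total distortion. It is exponential in $n$ and only serves to state the problem that syndrome-trellis codes solve efficiently.
\begin{verbatim}
import itertools
import numpy as np

def embed_bruteforce(H, x, m, rho):
    """Exhaustively search for y with H @ y = m (mod 2) minimizing the
    distortion sum(rho[i] for i where y[i] != x[i]).  Exponential in n:
    this only illustrates the definitions, not the STC algorithm."""
    n = len(x)
    best_y, best_cost = None, float("inf")
    for bits in itertools.product((0, 1), repeat=n):
        y = np.array(bits)
        if np.array_equal(H @ y % 2, m):          # syndrome constraint m = Hy
            cost = sum(rho[i] for i in range(n) if y[i] != x[i])
            if cost < best_cost:
                best_y, best_cost = y, cost
    return best_y, best_cost
\end{verbatim}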
Let us explain this embedding on a small illustrative example where
$\rho_X(i,x,y)$ is identically equal to 1,
whereas $m$ and $x$ are respectively a 3-bit column
vector and a 7-bit column vector.
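In this setting, $\rho_X \equiv 1$ means that $D$ is simply the number of flipped bits, and a short numerical sketch shows why a $3 \times 7$ parity-check matrix is enough: assuming the standard Hamming parity-check matrix, whose columns are the binary writings of $1$ to $7$ (the matrix $H$ introduced just below may order its columns differently), and hypothetical values for $x$ and $m$, embedding never requires more than one bit flip, since every non-zero syndrome is a column of $H$.
\begin{verbatim}
import numpy as np

# One possible ordering of the Hamming parity-check matrix: column j is
# the binary writing of j+1 (an assumed ordering, for illustration only).
H = np.array([[0, 0, 0, 1, 1, 1, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [1, 0, 1, 0, 1, 0, 1]])

x = np.array([1, 0, 1, 1, 0, 0, 1])   # hypothetical 7-bit cover vector
m = np.array([1, 1, 0])               # hypothetical 3-bit message

s = (m - H @ x) % 2                   # syndrome mismatch to be corrected
y = x.copy()
if s.any():                           # flip the position whose column equals s
    j = next(j for j in range(7) if np.array_equal(H[:, j], s))
    y[j] ^= 1

assert np.array_equal(H @ y % 2, m)   # y now carries the message
print("flipped positions:", np.flatnonzero(x != y))   # at most one
\end{verbatim}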
Then let $H$ be the binary Hamming matrix
First of all, Filler \emph{et al.} compute the matrix $H$
by placing a small sub-matrix $\hat{H}$ of size $h \times w$ next
Thanks to this special form of $H$, one can represent
every solution of $m=Hy$ as a path through a trellis.
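The special, banded form of $H$ mentioned here follows from the placement of the copies of $\hat{H}$ described above. The sketch below builds such a matrix under the assumption, following Filler \emph{et al.}'s construction, that each copy is shifted $w$ columns to the right and one row down, the rows exceeding the message length being cropped; the sub-matrix $\hat{H}$ used in the demo is only an example.
\begin{verbatim}
import numpy as np

def build_stc_matrix(H_hat, num_blocks):
    """Assemble a banded parity-check matrix H from num_blocks copies of
    the h x w sub-matrix H_hat, each copy shifted one row down and w
    columns right, cropped to num_blocks rows (assumed construction,
    following Filler et al.)."""
    h, w = H_hat.shape
    H = np.zeros((num_blocks, num_blocks * w), dtype=int)
    for k in range(num_blocks):
        rows = min(h, num_blocks - k)      # crop the last copies
        H[k:k + rows, k * w:(k + 1) * w] = H_hat[:rows, :]
    return H

# Example sub-matrix (h = 2, w = 2) and the 4 x 8 banded matrix it yields.
H_hat = np.array([[1, 0],
                  [1, 1]])
print(build_stc_matrix(H_hat, num_blocks=4))
\end{verbatim}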
\begin{enumerate}
\item Forward construction of the trellis that depends on $\hat{H}$, on $x$, on $m$, and on $\rho$.
\item Backward determination of $y$ that minimizes $D$, starting with