+For the whole experiment, a set of 500 images is randomly extracted
+from the BOSS contest database~\cite{Boss10}.
+In this set, each cover is a $512\times 512$
+grayscale digital image.
+
+
\subsection{Adaptive Embedding Rate}
Two strategies have been developed in our scheme with respect to the rate of
-embedding which is either \emph{ adaptive} or \emph{fixed}.
+embedding, which is either \emph{adaptive} or \emph{fixed}.
In the former, the embedding rate depends on the number of edge pixels:
the larger this number, the longer the message that can be embedded.
Practically, a set of edge pixels is computed by applying the
Canny algorithm with a high threshold.
The message length is thus defined as half the cardinality of this set.
-The rate between available bits and bit message length is then more than two.This constraint is indeed induced by the fact that the efficiency
+In this strategy, two methods are applied to select the bits that
+are modified. The first one is a direct application of the STC algorithm.
+This method is further referred to as \emph{adaptive+STC}.
+The second one randomly chooses the subset of pixels to modify by
+applying the BBS PRNG again. This method is denoted \emph{adaptive+sample}.
+Notice that the ratio between the number of
+available bits and the message length is always equal to two.
+This constraint is induced by the fact that the efficiency
of the STC algorithm is unsatisfactory below this threshold.
+In our experiments with the adaptive scheme, the average length of the
+message that can be embedded is 16445 bits.
+This corresponds to an average payload of 6.35\%.
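+
+As a minimal illustration of the adaptive strategy, the following sketch
+computes the message length from the Canny edge set. It assumes OpenCV's
+implementation of the Canny algorithm; the threshold value is illustrative
+only and is not the one used in our experiments.
+\begin{verbatim}
+import cv2
+
+def adaptive_message_length(cover_path, t_high=200):
+    # Edge set computed by Canny with a high threshold.
+    img = cv2.imread(cover_path, cv2.IMREAD_GRAYSCALE)
+    edges = cv2.Canny(img, t_high // 2, t_high)
+    # One available LSB per edge pixel; the message length
+    # is half the set cardinality, so the ratio between
+    # available bits and message bits equals two.
+    return int((edges > 0).sum()) // 2
+\end{verbatim}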
+
+
+
In the latter, the embedding rate is defined as a percentage between the
number of modified pixels and the length of the bit message.
The Canny algorithm is applied with a threshold that is decreased until the
cardinality of the edge pixel set is sufficient. If this cardinality is more
than twice the bit message length, an STC step is again applied.
-Otherwise, pixels are randomly chosen from the set of pixels to build the
-subset with a given size. The BBS PRNG is again applied there.
-
+Otherwise, pixels are again randomly chosen with the BBS PRNG.
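+
+A sketch of this fixed-rate selection is given below, under the same
+assumptions as the previous listing; the BBS modulus is built from two
+toy Blum primes and is illustrative only.
+\begin{verbatim}
+def bbs_indices(count, upper, seed=42, p=499, q=547):
+    # Blum-Blum-Shub: x_{i+1} = x_i^2 mod pq, where p and q
+    # are Blum primes (both congruent to 3 modulo 4).
+    # Simplified sampler; assumes count <= upper.
+    m, x, out = p * q, seed, []
+    while len(out) < count:
+        x = (x * x) % m
+        if x % upper not in out:   # simplified rejection step
+            out.append(x % upper)
+    return out
+
+def select_pixels(img, msg_len, t_high=200, step=20):
+    # Decrease the Canny threshold until the edge set is
+    # large enough for the message.
+    while True:
+        edges = cv2.Canny(img, t_high // 2, t_high)
+        pixels = list(zip(*edges.nonzero()))
+        if len(pixels) >= msg_len or t_high <= step:
+            break
+        t_high -= step
+    if len(pixels) > 2 * msg_len:
+        return pixels, "stc"       # large enough for an STC step
+    chosen = bbs_indices(msg_len, len(pixels))
+    return [pixels[i] for i in chosen], "sample"
+\end{verbatim}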
\begin{table}
\begin{center}
-\begin{tabular}{|c|c|c|}
+\begin{tabular}{|c|c|c|c||c|}
\hline
-Embedding rate & Adaptive & 10 \% \\
+Schemes & \multicolumn{3}{|c||}{STABYLO} & HUGO\\
\hline
-PSNR & 66.55 & 61.86 \\
+Embedding & \multicolumn{2}{|c|}{Adaptive} & Fixed & Fixed \\
\hline
-PSNR-HVS-M & 78.6 & 72.9 \\
+Rate & + STC & + sample & 10\% & 10\%\\
\hline
-BIQI & 28.3 & 28.4 \\
+PSNR & 66.55 & 63.48 & 61.86 & 64.65 \\
\hline
-wPSNR & 86.43& 77.47 \\
+PSNR-HVS-M & 78.6 & 75.39 & 72.9 & 76.67\\
+\hline
+BIQI & 28.3 & 28.28 & 28.4 & 28.28\\
+\hline
+wPSNR & 86.43& 80.59 & 77.47& 83.03\\
\hline
\end{tabular}
\end{center}
-\caption{Quality measures of our steganography approach\label{table:quality}}
+\caption{Quality Measures of Steganography Approaches\label{table:quality}}
\end{table}
-
-Let us compare the STABYLO approach with other edge based steganography
+Let us give an interpretation of these experiments.
+First of all, the adaptive strategy produces images with lower distortion
+than those resulting from the fixed 10\% strategy.
+Numerical results are indeed always greater for the former strategy than
+for the latter, except for the BIQI metric, where the differences are not
+significant.
+These results are not surprising, since the adaptive strategy embeds
+messages whose length is determined by a higher edge detection threshold,
+and thus modifies fewer pixels.
+Let us focus on the quality of HUGO images: for a given fixed
+embedding rate (10\%),
+HUGO always produces images whose quality is higher than STABYLO's.
+However, our approach provides better results with the adaptive+STC
+strategy, while remaining lightweight, as motivated in the introduction.
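+
+Recall that for a cover $X$ and its stego version $Y$, both of size
+$512\times512$,
+\[
+\textit{PSNR}(X,Y) = 10 \log_{10}\left(\frac{255^2}{\textit{MSE}(X,Y)}\right),
+\]
+where \textit{MSE} is the mean squared error between $X$ and $Y$; wPSNR and
+PSNR-HVS-M weight this error with perceptual models. For these three
+metrics, a higher value thus means a lower distortion.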
+
+
+Let us now compare the STABYLO approach with other edge-based steganography
schemes with respect to image quality.
-Fist off all, wPSNR and PSNR of the Edge Adaptive
-scheme detailed in~\cite{Luo:2010:EAI:1824719.1824720} are lower than ours.
+First of all, the Edge Adaptive
+scheme detailed in~\cite{Luo:2010:EAI:1824719.1824720},
+executed with a 10\% embedding rate,
+has the same PSNR but a lower wPSNR than ours:
+these two metrics are respectively equal to 61.9 and 68.9.
Next, both approaches~\cite{DBLP:journals/eswa/ChenCL10,Chang20101286}
focus on increasing the payload while keeping the PSNR acceptable, but do
not give quality metrics for a fixed embedding rate over a large base of
images.
+
+
\subsection{Steganalysis}
In the latter, the authors show that the
machine learning step (which is often
implemented as a support vector machine)
-can be a favourably executed thanks to an Ensemble Classifiers.
+can be favorably executed thanks to an Ensemble Classifier.
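+
+Detectability scores such as those reported below can be obtained as the
+average testing error of a classifier trained to separate cover images from
+stego ones; an error close to 0.5 means the scheme is nearly undetectable.
+The following sketch assumes scikit-learn and precomputed feature vectors;
+the random forest is a mere stand-in for the actual Ensemble Classifier.
+\begin{verbatim}
+import numpy as np
+from sklearn.ensemble import RandomForestClassifier
+from sklearn.model_selection import cross_val_score
+
+def detection_error(cover_feats, stego_feats):
+    # Label covers 0 and stego images 1, then estimate the
+    # detector's accuracy by cross-validation; the testing
+    # error is 0.5 when detection is no better than chance.
+    X = np.vstack([cover_feats, stego_feats])
+    y = np.array([0] * len(cover_feats) + [1] * len(stego_feats))
+    acc = cross_val_score(RandomForestClassifier(), X, y, cv=5)
+    return 1.0 - acc.mean()
+\end{verbatim}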
\begin{table}
\begin{center}
-\begin{tabular}{|c|c|c|c|}
+\begin{tabular}{|c|c|c|c||c|}
+\hline
+Schemes & \multicolumn{3}{|c||}{STABYLO} & HUGO\\
\hline
-Schemes & \multicolumn{2}{|c|}{STABYLO} & HUGO\\
+Embedding & \multicolumn{2}{|c|}{Adaptive} & Fixed & Fixed \\
\hline
-Embedding rate & Adaptive & 10 \% & 10 \%\\
+Rate & + STC & + sample & 10\% & 10\%\\
\hline
-AUMP & 0.39 & 0.22 & 0.50 \\
+AUMP & 0.39 & 0.33 & 0.22 & 0.50 \\
\hline
-Ensemble Classifier & 0.47 & 0.35 & 0.48 \\
+Ensemble Classifier & 0.47 & 0.44 & 0.35 & 0.48 \\
\hline
\end{tabular}
\end{center}
\caption{Steganalysis Results of Steganography Approaches\label{table:steganalysis}}
\end{table}
-Results show that our approach is more easily detectable than HUGO which is
-is the more secure steganography tool, as far we know. However due to its
+Results show that our approach is more easily detectable than HUGO, which
+is the most secure steganographic tool, as far as we know. However, due to the
huge number of features it integrates, HUGO is not lightweight.