the methodology than benchmarking.
Our approach is always compared to HUGO~\cite{DBLP:conf/ih/PevnyFB10}
and to EAISLSBMR~\cite{Luo:2010:EAI:1824719.1824720}.
The former is the least detectable information hiding tool in the spatial domain,
and the latter is, as far as we know, the closest work to ours.
\subsection{Image quality}\label{sub:quality}
The visual quality of the STABYLO scheme is evaluated in this section.
For the sake of completeness, three metrics are computed in these experiments:
the Peak Signal to Noise Ratio (PSNR), the PSNR-HVS-M, and the weighted PSNR (wPSNR).
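As a reminder, for an 8-bit cover image $X$ and its stego version $Y$, both of size $m \times n$, the PSNR is classically defined as
\[
\mathit{PSNR}(X,Y) = 10 \log_{10} \left( \frac{255^2}{\frac{1}{mn} \sum_{i=1}^{m} \sum_{j=1}^{n} \left( X_{ij} - Y_{ij} \right)^2} \right),
\]
so that higher values correspond to lower embedding distortion; PSNR-HVS-M and wPSNR follow the same principle but weight the pixel errors according to models of human visual sensitivity.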
\begin{table*}
\begin{center}
\begin{footnotesize}
\begin{tabular}{|c|c|c||c|c|c|c|c|c|}
\hline
Schemes & \multicolumn{4}{|c|}{STABYLO} & \multicolumn{2}{|c|}{HUGO}& \multicolumn{2}{|c|}{EAISLSBMR} \\
\hline
Embedding & Fixed & \multicolumn{3}{|c|}{Adaptive} & \multicolumn{2}{|c|}{Fixed}& \multicolumn{2}{|c|}{Fixed} \\
\hline
Rate & 10\% & + sample & +STC(7) & +STC(6) & 10\%&6.35\%& 10\%&6.35\%\\
\hline
PSNR & 61.86 & 63.48 & 66.55 (\textbf{-0.8\%}) & 63.7 & 64.65 & {67.08} & 60.8 & 62.9\\
\hline
PSNR-HVS-M & 72.9 & 75.39 & 78.6 (\textbf{-0.8\%}) & 75.5 & 76.67 & {79.23} & 71.8 & 74.3\\
%\hline
%BIQI & 28.3 & 28.28 & 28.4 & 28.28 & 28.28 & 28.2 & 28.2\\
\hline
wPSNR & 77.47 & 80.59 & 86.43 (\textbf{-1.6\%}) & 86.28 & 83.03 & {87.8} & 76.7 & 80.6\\
\hline
\end{tabular}
\end{footnotesize}
\end{center}
\caption{Quality measures of steganography approaches\label{table:quality}}
\end{table*}
Results are summarized in Table~\ref{table:quality}.
Let us give an interpretation of these experiments.
First of all, the adaptive strategy produces images with lower distortion
than those resulting from the 10\% fixed strategy.
Numerical results are indeed always greater for the former strategy than
for the latter.
These results are not surprising since the adaptive strategy aims at
embedding messages whose length is determined by a higher threshold
in the edge detection step.
If we combine the \emph{adaptive} and \emph{STC} strategies
(which leads to an average embedding rate equal to 6.35\%),
our approach provides metrics equivalent to those of HUGO.
In these columns, STC(7) stands for embedding data in the LSB only, whereas with
STC(6), data are hidden in the two least significant bits.
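For illustration purposes, the following Python sketch (a hypothetical helper, not the STC embedding itself, which additionally minimizes a distortion cost when selecting the modified pixels) shows which bit planes are rewritten in each case:

\begin{verbatim}
# Hypothetical illustration: place a k-bit payload chunk into the k least
# significant bits of a pixel (k = 1 for STC(7), k = 2 for STC(6)).
def replace_k_lsb(pixel, payload, k):
    mask = (1 << k) - 1
    return (pixel & ~mask) | (payload & mask)

assert replace_k_lsb(0b10110101, 0b0,  1) == 0b10110100  # LSB only
assert replace_k_lsb(0b10110101, 0b10, 2) == 0b10110110  # two LSBs
\end{verbatim}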
The quality difference between HUGO and STABYLO for these parameters
is given in bold font. It is always close to 1\%, which confirms
the objective presented in the motivations: the image quality of our approach
remains very close to the one of HUGO.
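The bold percentages can be read as the relative deviation with respect to HUGO at the same average embedding rate of 6.35\%; for instance, for the wPSNR,
\[
\frac{86.43 - 87.8}{87.8} \approx -1.6\%.
\]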
\subsection{Steganalysis}
The resistance of our approach against steganalysis has been evaluated with two
steganalysers, based respectively on AUMP~\cite{Fillatre:2012:ASL:2333143.2333587}
and on an Ensemble Classifier~\cite{DBLP:journals/tifs/KodovskyFH12}.
Both aim at detecting hidden bits in grayscale natural images and are
considered to be the state of the art of spatial-domain steganalysers~\cite{FK12}.
The former approach is based on a simplified parametric model of natural images.
Parameters are first estimated, and an adaptive Asymptotically Uniformly Most Powerful
(AUMP) test is then designed.
This approach is dedicated to verifying whether the LSBs have been modified or not.
In the latter, the authors show that the
machine learning step, which is often
implemented as a support vector machine,
can be favorably executed thanks to an ensemble classifier.
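To fix ideas, here is a minimal Python sketch of such an ensemble (an illustration under simplifying assumptions, not the implementation of~\cite{DBLP:journals/tifs/KodovskyFH12}): cheap linear base learners are trained on random subsets of the features, and their binary decisions are merged by majority vote; the feature extraction step is assumed to be performed beforehand.

\begin{verbatim}
# Illustrative ensemble classifier: linear (FLD-like) base learners trained
# on random feature subspaces, merged by majority vote (0 = cover, 1 = stego).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def train_ensemble(X, y, n_learners=51, d_sub=200, seed=0):
    rng = np.random.default_rng(seed)
    learners = []
    for _ in range(n_learners):
        idx = rng.choice(X.shape[1], size=min(d_sub, X.shape[1]),
                         replace=False)
        learners.append((idx, LinearDiscriminantAnalysis().fit(X[:, idx], y)))
    return learners

def predict_ensemble(learners, X):
    votes = np.array([clf.predict(X[:, idx]) for idx, clf in learners])
    return (votes.mean(axis=0) > 0.5).astype(int)  # majority vote
\end{verbatim}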
\begin{table*}
\begin{center}
%\begin{small}
\begin{tabular}{|c|c|c|c|c|c|c|c|c|}
\hline
Schemes & \multicolumn{4}{|c|}{STABYLO} & \multicolumn{2}{|c|}{HUGO}& \multicolumn{2}{|c|}{EAISLSBMR}\\
\hline
Embedding & Fixed & \multicolumn{3}{|c|}{Adaptive} & \multicolumn{2}{|c|}{Fixed}& \multicolumn{2}{|c|}{Fixed} \\
\hline
Rate & 10\% & + sample & +STC(7) & +STC(6) & 10\%& 6.35\%& 10\%& 6.35\%\\
\hline
AUMP & 0.22 & 0.33 & 0.39 & 0.45 & 0.50 & 0.50 & 0.49 & 0.50 \\
\hline
Ensemble Classifier & 0.35 & 0.44 & 0.47 & 0.47 & 0.48 & 0.49 & 0.43 & 0.46 \\
\hline
\end{tabular}
\end{center}
\caption{Steganalysis results of the steganography approaches\label{table:steganalysis}}
\end{table*}
These results show that HUGO is the most secure steganographic tool, as far as we know:
its scores are the closest to $0.50$, which corresponds to a steganalyser doing no better
than random guessing.
However, by combining the \emph{adaptive} and \emph{STC} strategies,
our approach obtains results similar to those of HUGO.

Nevertheless, due to the huge number of features it integrates, HUGO is not lightweight,
which, in the authors' opinion, justifies the consideration of the proposed method.