$\lambda_h$ is not null. This asymptotic configuration may arise from
the definition of $\lambda_h$. Worse, in this case the function is
strictly decreasing, and its minimum is reached only as $p$ tends to infinity.
Thus, the method continues its iterative computation
with an arbitrarily large value of $P_{sh}^{(k)}$,
which dramatically slows down convergence.

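The runaway behaviour described above can be illustrated with a small numerical sketch. The objective $f(p) = \lambda/p$ used here is a hypothetical stand-in for a strictly decreasing objective, not the paper's actual function:

```python
# Hypothetical sketch: when the objective is strictly decreasing in p,
# every step toward larger p improves it, so the iterate diverges.
def objective(p, lam=1.0):
    # stand-in for a strictly decreasing objective in p
    return lam / p

def iterate(p0, step=2.0, n_iters=20):
    """Greedy descent: since increasing p always lowers the
    objective, p grows without bound instead of converging."""
    p = p0
    for _ in range(n_iters):
        if objective(p * step) < objective(p):
            p *= step  # larger p always looks better
    return p

p_final = iterate(1.0)
assert p_final > 1e5  # p has blown up after only 20 iterations
```

Since no finite $p$ minimises such a function, any stopping criterion based on the iterate stabilising is never met, which is the slowdown the text refers to.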
To prevent this configuration, we replace the objective function given
in equation~(\ref{eq:obj2}) by
Figure~\ref{Fig:convex} summarizes the average number of iterations
until convergence for each threshold value. As we can see, even though this
enhanced method introduces additional computations,
it speeds up the algorithm and guarantees the convexity,
and thus the convergence.
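The effect of replacing a strictly decreasing objective by a convex one can be sketched numerically. The penalised objective below, $f(p) = \lambda/p + \mu p$, is a hypothetical illustration of a convex replacement with a finite minimiser; it is not the actual objective of equation~(\ref{eq:obj2}):

```python
import math

# Hypothetical illustration: lam/p stands in for the original strictly
# decreasing objective; adding the convex term mu*p yields a function
# that is convex on p > 0 with a finite, unique minimiser.
def penalised(p, lam=4.0, mu=1.0):
    return lam / p + mu * p

# Analytically, f'(p) = -lam/p**2 + mu = 0 gives p* = sqrt(lam/mu),
# with minimum value f(p*) = 2*sqrt(lam*mu).
p_star = math.sqrt(4.0 / 1.0)  # p* = 2 for lam=4, mu=1
assert abs(penalised(p_star) - 2 * math.sqrt(4.0 * 1.0)) < 1e-12
```

With a finite minimiser, a descent iteration on $p$ stabilises instead of diverging, which is why the modified objective restores convergence.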
\begin{figure*}
\begin{center}
\includegraphics[scale=0.5]{convex.png}