X-Git-Url: https://bilbo.iut-bm.univ-fcomte.fr/and/gitweb/desynchronisation-controle.git/blobdiff_plain/e02082ad2d87032e8cff4fa8cc48682c85efc624..HEAD:/IWCMC14/convexity.tex

diff --git a/IWCMC14/convexity.tex b/IWCMC14/convexity.tex
index de0ddbc..d88481c 100644
--- a/IWCMC14/convexity.tex
+++ b/IWCMC14/convexity.tex
@@ -12,6 +12,10 @@ The function inside the $\arg \min$
 is strictly convex if and only if $\lambda_h$ is not null.
 This asymptotic configuration may arise due to the definition of
 $\lambda_h$. Worth, in this case, the function is strictly decreasing
 and the minimal value is obtained when $p$ is the infinity.
+Thus, the method continues its iterative computation
+with an arbitrarily large value for $P_{sh}^{(k)}$, which
+dramatically slows down the convergence.
+
 To prevent this configuration, we replace the objective function given
 in equation~(\ref{eq:obj2}) by
@@ -59,9 +63,8 @@ one which is convex guarantee.
 The Figure~\ref{Fig:convex} summarizes the average number of convergence
 iterations for each treshold value.
 As we can see, even if this new enhanced method introduces new calculus,
-it only slows few down the algorithm and guarantee the convexity,
+it speeds up the algorithm and guarantees the convexity,
 and thus the convergence.
-Notice that the encoding power has been arbitrarily limited to 10 W.
 \begin{figure*}
 \begin{center}
 \includegraphics[scale=0.5]{convex.png}
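
Note on the first hunk: the replacement objective from equation~(\ref{eq:obj2}) is not visible in this excerpt, so the one-dimensional model below is only an illustrative sketch of the convexity issue being fixed; the function $f_{\lambda}$, the variable $p$, and the threshold $\delta$ are assumptions standing in for the paper's actual definitions, with the quadratic coefficient playing the role of $\lambda_h$.

% Assumed one-dimensional model (not the paper's objective):
\[
  f_{\lambda}(p) = \lambda\, p^{2} - p , \qquad p \ge 0 .
\]
% If \lambda > 0, f_\lambda is strictly convex and the minimizer is finite:
\[
  f_{\lambda}'(p) = 2\lambda p - 1 = 0
  \quad\Longrightarrow\quad
  p^{\ast} = \frac{1}{2\lambda} .
\]
% If \lambda = 0, f_0(p) = -p is strictly decreasing, so the infimum is
% approached only as p \to \infty and the iterates grow without bound.
% Flooring the coefficient with a small threshold \delta > 0 (an assumed
% fix, analogous in spirit to the modified objective) restores strict
% convexity and keeps the minimizer bounded:
\[
  f_{\delta}(p) = \max(\lambda, \delta)\, p^{2} - p
  \quad\Longrightarrow\quad
  p^{\ast} = \frac{1}{2\max(\lambda,\delta)} \le \frac{1}{2\delta} .
\]

This also matches the discussion around Figure~\ref{Fig:convex}: the choice of the threshold value trades off how much the objective is perturbed against how quickly the iteration converges.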