X-Git-Url: https://bilbo.iut-bm.univ-fcomte.fr/and/gitweb/desynchronisation-controle.git/blobdiff_plain/fe438aa764b5d1ad7ea61512cffbb3761eff99b7..refs/heads/master:/IWCMC14/convexity.tex?ds=inline

diff --git a/IWCMC14/convexity.tex b/IWCMC14/convexity.tex
index ff4656d..d88481c 100644
--- a/IWCMC14/convexity.tex
+++ b/IWCMC14/convexity.tex
@@ -12,6 +12,10 @@ The function inside the $\arg \min$ is strictly convex if and only if
 $\lambda_h$ is not null. This asymptotic configuration may arise due to
 the definition of $\lambda_h$. Worse, in this case, the function is
 strictly decreasing and its minimum is reached as $p$ tends to infinity.
+Thus, the method continues its iterative computation
+with an arbitrarily large value of $P_{sh}^{(k)}$, which
+slows down convergence dramatically.
+
 To prevent this configuration, we replace the objective function
 given in equation~(\ref{eq:obj2}) by
 
@@ -22,7 +26,7 @@ in equation~(\ref{eq:obj2}) by
 + \delta_p\sum_{h \in V} P_{sh}^{\frac{8}{3}}.
 \label{eq:obj2p}
 \end{equation}
-In this equation we have first introduced new regularisation factors
+In this equation, we first introduce new regularization factors
 (namely $\delta_x$, $\delta_r$, and $\delta_p$) instead of the sole
 $\delta$. This allows us to study the influence of
 each factor separately.
@@ -46,27 +50,26 @@ Provided $p^{5/3}$ is replaced by $P$, we have a quadratic function which is
 strictly convex for any value of $\lambda_h$, since the discriminant
 is positive.
 
-This proposed enhacement has been evaluated as follows:
-10 tresholds $t$, such that $1E-5 \le t \le 1E-3$, have
+This proposed enhancement has been evaluated as follows:
+10 thresholds $t$, such that $10^{-5} \le t \le 10^{-3}$, have
 been selected, and for each of them, 10 random configurations have been generated.
 For each one, we store the number of iterations required to make the dual
-function variation smaller than this given treshold with
+function variation smaller than the given threshold with
 the two approaches: either the original one or the
-one which is convex garantee.
+one with the convexity guarantee.
 Figure~\ref{Fig:convex} summarizes the average number of convergence
-iterations for each tresholdvalue. As we can see, even if this new
+iterations for each threshold value. As we can see, even though this new
 enhanced method introduces additional computations,
-it only slows few down the algorithm and garantee the convexity,
+it speeds up the algorithm and guarantees convexity,
 and thus convergence.
-
 \begin{figure*}
 \begin{center}
 \includegraphics[scale=0.5]{convex.png}
 \end{center}
-\caption{Original Vs Convex Garantee Approaches}\label{Fig:convex}
+\caption{Original vs.\ Convex Guarantee Approaches}\label{Fig:convex}
 \end{figure*}
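
A minimal Python sketch of the evaluation protocol described in the last hunk (10 thresholds between $10^{-5}$ and $10^{-3}$, 10 random configurations each, iterations counted until the dual-function variation falls below the threshold). The names `iterations_until_convergence`, `make_toy_solver`, and `dual_step` are hypothetical and not part of the paper's code; the toy solver only stands in for one iteration of the actual dual-decomposition update, which is not contained in this diff.

```python
# Sketch only: exercises the stopping criterion and averaging loop
# described in the text, with a stand-in solver.
import random

def iterations_until_convergence(dual_step, state, threshold, max_iter=100_000):
    """Count iterations until the dual-function variation drops below `threshold`."""
    previous = None
    for k in range(1, max_iter + 1):
        state, dual_value = dual_step(state)
        if previous is not None and abs(dual_value - previous) < threshold:
            return k
        previous = dual_value
    return max_iter

def make_toy_solver(rate):
    """Return a stand-in `dual_step` whose dual values decay geometrically toward a limit."""
    def dual_step(state):
        value, gap = state
        gap *= rate                        # geometric shrinkage of the duality gap
        return (value, gap), value + gap   # new state, current dual value
    return dual_step

if __name__ == "__main__":
    thresholds = [1e-5 * 10 ** (2 * i / 9) for i in range(10)]  # 10 values in [1e-5, 1e-3]
    for t in thresholds:
        counts = []
        for _ in range(10):                # 10 random configurations per threshold
            start = (random.uniform(0.0, 1.0), random.uniform(0.5, 2.0))
            counts.append(iterations_until_convergence(make_toy_solver(0.9), start, t))
        print(f"threshold={t:.1e}  mean iterations={sum(counts) / len(counts):.1f}")
```

In the paper's setting, the same loop would be run once with the original formulation and once with the convexity-guaranteed one, and the two mean iteration counts compared per threshold, as reported in Figure~\ref{Fig:convex}.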