In the algorithm presented in the previous section,
the encoding power consumption is iteratively updated with
\begin{equation}
P_{sh}^{(k+1)} = \arg \min_{p > 0} \left( v_h^{(k)}\,\dfrac{\ln(\sigma^2/D_h)}{\gamma p^{2/3}} + \lambda_h^{(k)} p \right).
\end{equation}
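Note that, when $\lambda_h^{(k)} > 0$ (and $\ln(\sigma^2/D_h) > 0$), setting the derivative to zero gives the closed-form minimiser
\begin{equation*}
P_{sh}^{(k+1)} = \left( \dfrac{2\, v_h^{(k)} \ln(\sigma^2/D_h)}{3\, \gamma\, \lambda_h^{(k)}} \right)^{3/5},
\end{equation*}
which diverges as $\lambda_h^{(k)}$ tends to zero.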
The function inside the $\arg \min$ admits a minimiser if and only if
$\lambda_h$ is nonzero. The degenerate case $\lambda_h = 0$ may arise due to
the definition of $\lambda_h$. Worse, in this case the function is
strictly decreasing and its infimum is only approached as $p$ tends to infinity.
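Indeed, when $\lambda_h = 0$ the derivative of the objective reduces to
\begin{equation*}
-\dfrac{2}{3}\, v_h\,\dfrac{\ln(\sigma^2/D_h)}{\gamma p^{5/3}},
\end{equation*}
which is negative on $\R^{+*}$ whenever $\ln(\sigma^2/D_h) > 0$, so no finite minimiser exists.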
To prevent this configuration, we replace the objective function given
in equation~(\ref{eq:obj2}) by
\begin{equation}
\sum_{i \in N } q_i^2
+ \delta_x \sum_{h \in V,\, l \in L } x_{hl}^2
+ \delta_r \sum_{h \in V } R_{h}^2
+ \delta_p \sum_{h \in V } P_{sh}^{\frac{8}{3}}.
\end{equation}
In this equation we have first introduced separate regularisation factors
(namely $\delta_x$, $\delta_r$, and $\delta_p$)
instead of the single $\delta$.
This makes it possible to study the influence of each modification separately.
Second, the rational exponent on the power term is introduced to obtain
a strictly convex function.
Let us now verify that the induced function is convex.
Let $f: \R^{+*} \rightarrow \R$ be such that
$f(p) = v_h\,\dfrac{\ln(\sigma^2/D_h)}{\gamma p^{2/3}} + \lambda_h p + \delta_p p^{8/3}$.
This function is differentiable and, for any $p \in \R^{+*}$, we have
\begin{eqnarray*}
f'(p) &=& -\dfrac{2}{3}\, v_h\,\dfrac{\ln(\sigma^2/D_h)}{\gamma p^{5/3}} + \lambda_h +
\dfrac{8}{3}\,\delta_p\, p^{5/3} \\
&=& \dfrac{\frac{8}{3}\,\gamma\,\delta_p\, p^{10/3} + \gamma\,\lambda_h\, p^{5/3} - \frac{2}{3}\, v_h \ln(\sigma^2/D_h)}{\gamma\, p^{5/3}},
\end{eqnarray*}
which is positive if and only if its numerator is.
Substituting $P = p^{5/3}$, the numerator becomes a quadratic function of $P$
with positive leading coefficient, hence strictly convex in $P$ for any value of
$\lambda_h$. Since its constant term is negative (as $\ln(\sigma^2/D_h) > 0$),
it has a single positive root, so $f'$ changes sign exactly once and $f$ admits
a unique minimiser, even when $\lambda_h = 0$.
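As a quick numerical sanity check, the following sketch (with arbitrary placeholder
values for $v_h$, $\sigma^2$, $D_h$, $\gamma$, $\lambda_h$, and $\delta_p$) solves the
quadratic in $P = p^{5/3}$ for its positive root and compares the recovered $p$ with a
grid search over $f$:
\begin{verbatim}
import numpy as np

# Placeholder constants (arbitrary illustrative values).
v_h, sigma2, D_h, gamma_ = 1.0, 4.0, 1.0, 0.5
lambda_h, delta_p = 0.2, 0.1
A = v_h * np.log(sigma2 / D_h) / gamma_   # v_h ln(sigma^2/D_h) / gamma

def f(p):
    # Regularised objective f(p) = A/p^(2/3) + lambda_h p + delta_p p^(8/3).
    return A / p**(2/3) + lambda_h * p + delta_p * p**(8/3)

# f'(p) = 0  <=>  (8/3) delta_p P^2 + lambda_h P - (2/3) A = 0,  with P = p^(5/3).
a, b, c = (8/3) * delta_p, lambda_h, -(2/3) * A
P_star = (-b + np.sqrt(b**2 - 4*a*c)) / (2*a)   # unique positive root
p_star = P_star**(3/5)

# Brute-force check on a grid.
grid = np.linspace(1e-3, 10.0, 200000)
print(p_star, grid[np.argmin(f(grid))])
\end{verbatim}
Both values agree up to the grid resolution, illustrating that the regularised
update remains well defined even when $\lambda_h$ vanishes.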