The major approach is to divide (organize) the sensors into a suitable number of
set covers, where each set completely covers the region of interest, and to
activate these set covers successively. Centralized algorithms generally provide
nearly optimal solutions since the algorithm has a global view of the whole
network. Note that centralized algorithms have the advantage of requiring very
low processing power from the sensor nodes, which usually have limited
processing capabilities. The main drawback of this kind of approach is its
higher communication cost, since the node that takes the decision needs
information from all the sensor nodes. Moreover, centralized approaches usually
suffer from scalability problems, making them less competitive as the network
size increases.

The first algorithms proposed in the literature consider that the cover sets are
disjoint: a sensor node appears in exactly one of the generated cover sets. For
instance, Slijepcevic and Potkonjak~\cite{Slijepcevic01powerefficient} proposed
an algorithm which allocates sensor nodes to mutually independent sets that
monitor an area divided into several fields. Their algorithm builds a cover set
by first including the sensor nodes that cover critical fields, that is to say
fields covered by the smallest number of sensors. The time complexity of their
heuristic is $O(n^2)$, where $n$ is the number of sensors. Abrams et
al.~\cite{abrams2004set} designed three approximation algorithms for a variation
of the set $k$-cover problem, where the objective is to partition the sensors
into covers such that the number of covers that include an area, summed over all
areas, is maximized. Their work builds upon previous work
in~\cite{Slijepcevic01powerefficient}, and the generated cover sets do not
provide complete coverage of the monitoring zone.

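The objective of this set $k$-cover variant can be written, in a notation of our
own chosen for illustration, as follows: given a partition of the sensors into
covers $S_1,\dots,S_k$ and the areas $A_1,\dots,A_m$ to be monitored, let
$c_{ij}=1$ if at least one sensor of $S_i$ covers $A_j$ and $c_{ij}=0$
otherwise; the goal is then to find the partition maximizing
\[
\sum_{j=1}^{m}\sum_{i=1}^{k} c_{ij}.
\]
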
The authors of~\cite{cardei2005improving} proposed a method to efficiently
compute the maximum number of disjoint set covers such that each set can monitor
all targets. They first transform the problem into a maximum-flow problem, which
is formulated as a mixed integer program (MIP). Their heuristic then uses the
output of the MIP to compute disjoint set covers. Results show that this
heuristic provides slightly more set covers than the approach
of~\cite{Slijepcevic01powerefficient}, but with a larger execution time due to
the complexity of solving the mixed integer program.

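To make the disjoint set cover problem more concrete, it can be sketched as the
following integer program; this is an illustrative formulation in our own
notation, not necessarily the exact MIP solved in~\cite{cardei2005improving}.
Let $R_j$ denote the set of sensors that monitor target $t_j$, let $K$ be an
upper bound on the number of covers (for instance the cardinality of the
smallest $R_j$), and let the binary variable $x_{ik}$ indicate that sensor $s_i$
is assigned to cover $k$, while $y_k$ indicates that cover $k$ is used:
\[
\begin{array}{lll}
\max & \displaystyle\sum_{k=1}^{K} y_k & \\[2mm]
\mbox{s.t.} & \displaystyle\sum_{k=1}^{K} x_{ik} \le 1 & \forall \mbox{ sensor } s_i,\\[2mm]
 & \displaystyle\sum_{s_i \in R_j} x_{ik} \ge y_k & \forall \mbox{ target } t_j,\ \forall k,\\[2mm]
 & x_{ik},\, y_k \in \{0,1\}. &
\end{array}
\]
The first constraint makes the covers disjoint, and the second one forces every
used cover to monitor all the targets.
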
Zorbas et al.~\cite{zorbas2010solving} presented a centralized greedy algorithm
for the efficient production of both node-disjoint and non-disjoint cover
sets. Compared to the results of the algorithm of Slijepcevic and
Potkonjak~\cite{Slijepcevic01powerefficient}, their heuristic produces more
disjoint cover sets with only a slight increase in execution time. When
producing non-disjoint cover sets, both their Static-CCF and Dynamic-CCF
algorithms, where CCF stands for the Critical Control Factor cost function they
rely on, provide cover sets offering a longer network lifetime than those
produced by~\cite{cardei2005energy}. They also require a smaller number of node
participations to achieve these results.

In the case of non-disjoint algorithms~\cite{pujari2011high}, sensors may
participate in more than one cover set. In some cases, this may prolong the
lifetime of the network in comparison with disjoint cover set algorithms, but
designing algorithms for non-disjoint cover sets generally induces a higher
order of complexity. Moreover, in the case of a sensor failure, non-disjoint
scheduling policies are less resilient and less reliable because a sensor may be
involved in more than one cover set. For instance, Cardei et
al.~\cite{cardei2005energy} present a linear programming (LP) solution and a
greedy approach to extend the sensor network lifetime by organizing the sensors
into a maximal number of non-disjoint cover sets. Simulation results show that
by allowing sensors to participate in multiple sets, the network lifetime
increases compared with related work~\cite{cardei2005improving}.
In~\cite{berman04}, the authors formulated the lifetime problem and suggested
another LP technique to solve it. A centralized solution based on the
Garg-K\"{o}nemann algorithm~\cite{garg98}, which is provably close to the
optimal solution, is also proposed.

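The lifetime maximization problem addressed by these non-disjoint approaches can
be sketched by the following linear program; again, this is an illustration in
our own notation rather than the exact formulation used
in~\cite{cardei2005energy} or~\cite{berman04}. Given candidate cover sets
$C_1,\dots,C_p$, which may share sensors, let $t_k \ge 0$ be the time during
which cover $C_k$ is kept active and let $T_i$ be the initial battery lifetime
of sensor $s_i$:
\[
\begin{array}{lll}
\max & \displaystyle\sum_{k=1}^{p} t_k & \\[2mm]
\mbox{s.t.} & \displaystyle\sum_{k \,:\, s_i \in C_k} t_k \le T_i & \forall \mbox{ sensor } s_i,\\[2mm]
 & t_k \ge 0 & \forall k.
\end{array}
\]
Since the number of candidate covers may grow exponentially with the network
size, approximation schemes such as the Garg-K\"{o}nemann
algorithm~\cite{garg98} are used to solve such formulations near-optimally.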