X-Git-Url: https://bilbo.iut-bm.univ-fcomte.fr/and/gitweb/JournalMultiPeriods.git/blobdiff_plain/9458c6e7496a265a0e594312416aa8dd608e4296..ce36745985ea29d243398f943d61ea7a96e16d15:/article.tex

diff --git a/article.tex b/article.tex
index 36c5dcf..06b041f 100644
--- a/article.tex
+++ b/article.tex
@@ -199,25 +199,25 @@ size increases.
 The first algorithms proposed in the literature consider that the cover sets
 are disjoint: a sensor node appears in exactly one of the generated cover sets.
 For
-instance, Slijepcevic and Potkonjak \cite{Slijepcevic01powerefficient} proposed
-an algorithm, which allocates sensor nodes in mutually independent sets to
-monitor an area divided into several fields. Their algorithm builds a cover set
-by including in priority the sensor nodes which cover critical fields, that is
-to say fields that are covered by the smallest number of sensors. The time
+instance, Slijepcevic and Potkonjak \cite{Slijepcevic01powerefficient} have
+proposed an algorithm, which allocates sensor nodes in mutually independent sets
+to monitor an area divided into several fields. Their algorithm builds a cover
+set by first including the sensor nodes which cover critical fields, that
+is to say fields that are covered by the smallest number of sensors. The time
 complexity of their heuristic is $O(n^2)$ where $n$ is the number of sensors.
-Abrams et al.~\cite{abrams2004set} designed three approximation algorithms for a
-variation of the set k-cover problem, where the objective is to partition the
-sensors into covers such that the number of covers that include an area, summed
-over all areas, is maximized. Their work builds upon previous work
+Abrams et al.~\cite{abrams2004set} have designed three approximation algorithms
+for a variation of the set k-cover problem, where the objective is to partition
+the sensors into covers such that the number of covers that include an area,
+summed over all areas, is maximized. Their work builds upon previous work
 in~\cite{Slijepcevic01powerefficient} and the generated cover sets do not
 provide complete coverage of the monitoring zone.
-\cite{cardei2005improving} proposed a method to efficiently compute the maximum
-number of disjoint set covers such that each set can monitor all targets. They
-first transform the problem into a maximum flow problem, which is formulated as
-a mixed integer programming (MIP). Then their heuristic uses the output of the
-MIP to compute disjoint set covers. Results show that this heuristic provides a
-number of set covers slightly larger compared to
+In \cite{cardei2005improving}, the authors have proposed a method to efficiently
+compute the maximum number of disjoint set covers such that each set can monitor
+all targets. They first transform the problem into a maximum flow problem, which
+is formulated as a mixed integer program (MIP). Then their heuristic uses the
+output of the MIP to compute disjoint set covers. Results show that this
+heuristic provides a number of set covers slightly larger than that obtained in
 \cite{Slijepcevic01powerefficient}, but with a larger execution time due to the
 complexity of the mixed integer programming resolution.

@@ -286,27 +286,27 @@ its status by a rule named off-duty eligible rule, which tells it to turn off
 if its sensing area is covered by its neighbors. A back-off scheme is introduced
 to let each sensor delay the decision process with a random period of time, in
 order to avoid simultaneous conflicting decisions between nodes and lack of
-coverage on any area. \cite{prasad2007distributed} defines a model for
-capturing the dependencies between different cover sets and proposes localized
+coverage on any area. In \cite{prasad2007distributed}, a model for capturing the
+dependencies between different cover sets is defined, together with a localized
 heuristic based on this dependency. The algorithm consists of two phases, an
 initial setup phase during which each sensor computes and prioritizes the covers
 and a sensing phase during which each sensor first decides its on/off status,
 and then remains on or off for the rest of the duration.

-The authors in \cite{yardibi2010distributed} developed a Distributed Adaptive
-Sleep Scheduling Algorithm (DASSA) for WSNs with partial coverage. DASSA does
-not require location information of sensors while maintaining connectivity and
-satisfying a user defined coverage target. In DASSA, nodes use the residual
-energy levels and feedback from the sink for scheduling the activity of their
-neighbors. This feedback mechanism reduces the randomness in scheduling that
-would otherwise occur due to the absence of location information. In
-\cite{ChinhVu}, the author proposed a novel distributed heuristic, called
-Distributed Energy-efficient Scheduling for k-coverage (DESK), which ensures
-that the energy consumption among the sensors is balanced and the lifetime
-maximized while the coverage requirement is maintained. This heuristic works in
-rounds, requires only one-hop neighbor information, and each sensor decides its
-status (active or sleep) based on the perimeter coverage model proposed in
-\cite{Huang:2003:CPW:941350.941367}.
+The authors in \cite{yardibi2010distributed} have developed a Distributed
+Adaptive Sleep Scheduling Algorithm (DASSA) for WSNs with partial coverage.
+DASSA does not require location information of sensors while maintaining
+connectivity and satisfying a user-defined coverage target. In DASSA, nodes use
+the residual energy levels and feedback from the sink for scheduling the
+activity of their neighbors. This feedback mechanism reduces the randomness in
+scheduling that would otherwise occur due to the absence of location
+information. In \cite{ChinhVu}, the authors have proposed a novel distributed
+heuristic, called Distributed Energy-efficient Scheduling for k-coverage (DESK),
+which ensures that the energy consumption among the sensors is balanced and the
+lifetime maximized while the coverage requirement is maintained. This heuristic
+works in rounds, requires only one-hop neighbor information, and each sensor
+decides its status (active or sleep) based on the perimeter coverage model
+proposed in \cite{Huang:2003:CPW:941350.941367}.

 %Our Work, which is presented in~\cite{idrees2014coverage} proposed a coverage optimization protocol to improve the lifetime in
 %heterogeneous energy wireless sensor networks.

@@ -315,19 +315,20 @@ status (active or sleep) based on the perimeter coverage model proposed in
 The works presented in \cite{Bang, Zhixin, Zhang} focus on coverage-aware,
 distributed energy-efficient, and distributed clustering methods respectively,
 which aim to extend the network lifetime, while the coverage is ensured. S.
-Misra et al. \cite{Misra} proposed a localized algorithm for coverage in sensor
-networks. The algorithm conserve the energy while ensuring the network coverage
-by activating the subset of sensors with the minimum overlap area. The proposed
-method preserves the network connectivity by formation of the network backbone.
-More recently, Shibo et al. \cite{Shibo} expressed the coverage problem as a
-minimum weight submodular set cover problem and proposed a Distributed Truncated
-Greedy Algorithm (DTGA) to solve it. They take advantage from both temporal and
-spatial correlations between data sensed by different sensors, and leverage
-prediction, to improve the lifetime. In \cite{xu2001geography}, Xu et
-al. proposed an algorithm, called Geographical Adaptive Fidelity (GAF), which
-uses geographic location information to divide the area of interest into fixed
-square grids. Within each grid, it keeps only one node staying awake to take the
-responsibility of sensing and communication.
+Misra et al. \cite{Misra} have proposed a localized algorithm for coverage in
+sensor networks. The algorithm conserves energy while ensuring the network
+coverage by activating the subset of sensors with the minimum overlap area. The
+proposed method preserves the network connectivity by forming a network
+backbone. More recently, Shibo et al. \cite{Shibo} have expressed the coverage
+problem as a minimum weight submodular set cover problem and proposed a
+Distributed Truncated Greedy Algorithm (DTGA) to solve it. They take advantage
+of both temporal and spatial correlations between data sensed by different
+sensors, and leverage prediction, to improve the lifetime. In
+\cite{xu2001geography}, Xu et al. have proposed an algorithm called
+Geographical Adaptive Fidelity (GAF), which uses geographic location information
+to divide the area of interest into fixed square grids. Within each grid, it
+keeps only one node awake to take responsibility for sensing and
+communication.

 Some other approaches (outside the scope of our work) do not consider a
 synchronized and predetermined period of time where the sensors are active or