The effectiveness of a Wireless Sensor Network (WSN) is determined by its performance, and any WSN can lose packets when congested. Sinks that collect information from other nodes become markedly less efficient under congestion. Congestion detection and control are therefore major research areas, and many congestion control algorithms have been proposed. Congestion control techniques in WSNs differ along many dimensions, but controlling congestion is essential for good WSN performance. The main contribution of this paper is a congestion control mechanism for reliability and management in a WSN. This paper presents a novel technique to overcome congestion and improve the performance of a WSN, while also surveying other routing and congestion algorithms.
Wireless sensor nodes act autonomously and interface with actuators, communicating through wireless transmitters. Large numbers of sensor nodes are scattered over wide geographical regions and form a logical network that routes packets toward managing nodes called sinks or base stations. They can be used effectively in many inaccessible areas and in applications such as environmental monitoring. They operate on very light loads but become active when a relevant event occurs. Though originally targeted at military applications, they are now deployed in civilian applications such as health monitoring, habitat study, and object tracking. WSNs are also used in mission-critical applications that demand performance control, making node placement essential for sensing coverage. Traffic in a WSN can flow both ways, i.e., to and from the sink. The numerous nodes in a WSN can be imagined as subnets, with one sensor node per subnet. Directed diffusion was proposed for structured data: communication between nodes forwards named data, and the nodes are equipped with memory and processing ability. A node requests data by sending a query on named data, and on finding a match the results are transferred to the querying node. Intermediate nodes aggregate data and redirect it to nearby nodes. Each probe, called an interest, consists of a type, a duration, and an interval. The sink periodically broadcasts interests with the latest time stamp (exploratory interests) to check nodes for the required data. Based on a node's confirmation, gradients are set up and data transfer occurs from a source to the sink along an optimal path.
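The interest-and-gradient exchange described above can be sketched in a few lines. This is a simplified illustration only, not the actual directed diffusion implementation; the class names, fields, and single-hop topology are assumptions made for clarity.

```python
# Hypothetical sketch of directed diffusion's interest matching and
# gradient setup, assuming simplified node and interest structures.

from dataclasses import dataclass, field

@dataclass
class Interest:
    data_type: str      # the named data, e.g. "temperature"
    interval_s: float   # requested reporting interval
    duration_s: float   # how long the interest remains active
    timestamp: int      # the sink refreshes interests with newer stamps

@dataclass
class Node:
    node_id: int
    sensed_types: set = field(default_factory=set)
    gradients: dict = field(default_factory=dict)  # data_type -> upstream node

    def receive_interest(self, interest: Interest, from_node: int) -> bool:
        """Cache the interest by setting up a gradient toward the sender.
        Returns True if this node can source the named data."""
        self.gradients[interest.data_type] = from_node
        return interest.data_type in self.sensed_types

# A sink (id 0) floods an exploratory interest; matching nodes confirm,
# and data would then flow back to the sink along the gradients.
sink_id = 0
nodes = [Node(1, {"humidity"}), Node(2, {"temperature"})]
interest = Interest("temperature", interval_s=5.0, duration_s=60.0, timestamp=1)
sources = [n.node_id for n in nodes if n.receive_interest(interest, sink_id)]
print(sources)  # only node 2 senses temperature
```

In a real deployment the interest would be flooded hop by hop, so each intermediate node's gradient would point to the neighbor it heard the interest from rather than directly to the sink.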
Congestion in WSNs
The main objective of a WSN is to achieve reliable event detection while minimizing packet losses in transmissions from sensors to the sink. The nodes sense and transport different types of information triggered by events, leading to overcrowded traffic. This overcrowding, called congestion, results in increased data loss. Nodes suddenly burst into action when events occur, flooding the sink with bursts of information and causing packet losses. Though many schemes have been proposed for event detection and data transmission in WSNs, reliable data transfer needs more attention. Mechanisms providing a general set of components that can be plugged into applications are needed. Congestion can be managed through detection and control. Congestion control is an important part of WSN research, and alleviating congestion with various techniques enhances a WSN's output. Collisions occur when all packets are sent through the same medium; a protocol that handles collisions therefore helps improve congestion control in a WSN. Applying directed diffusion in WSNs has its own drawbacks. The nodes need a certain amount of computational capability and memory for storing interests with various time stamps, which imposes major overhead on sensor networks. Directed diffusion's guaranteed data delivery also becomes less significant, since in event detection data flows from several source nodes. Moreover, the limited storage capacity of the nodes cannot be increased when congestion occurs, so congestion control techniques, such as putting nodes to sleep, and efficient congestion control protocols must be employed to overcome congestion.
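A common way to detect the congestion described above is to monitor buffer occupancy at each node. The following is a minimal sketch of that idea, assuming an illustrative queue capacity and threshold; the class and method names are hypothetical, not from any specific protocol.

```python
# Illustrative buffer-occupancy congestion detection: a node signals
# congestion when its queue crosses a threshold, before packets are
# actually dropped by overflow. Capacity and threshold are assumptions.

from collections import deque

class SensorNodeQueue:
    def __init__(self, capacity: int = 8, congestion_threshold: float = 0.75):
        self.buffer = deque()
        self.capacity = capacity
        self.threshold = congestion_threshold

    def enqueue(self, packet) -> bool:
        """Accept a packet unless the buffer is full; a False return
        models a packet lost to buffer overflow."""
        if len(self.buffer) >= self.capacity:
            return False
        self.buffer.append(packet)
        return True

    def is_congested(self) -> bool:
        """Signal congestion once occupancy reaches the threshold, so
        upstream nodes can be told to slow their sending rate."""
        return len(self.buffer) / self.capacity >= self.threshold

q = SensorNodeQueue()
for i in range(6):
    q.enqueue(f"pkt{i}")
print(q.is_congested())  # 6/8 occupancy reaches the 0.75 threshold
```

Signaling congestion before the buffer overflows gives upstream nodes time to react, which is exactly why detection is treated separately from control.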
Campbell et al. observed in their study that WSNs are generally tolerant to the loss of data packets. Despite this observation, there is some vulnerability to message loss when data flows from the sink to the source. Their proposed Pump Slowly, Fetch Quickly (PSFQ) mechanism for WSNs supports a simple, scalable, yet customizable transport scheme that meets the requirements of reliable data transfer. PSFQ propagates data from the source node by injecting it at a relatively low speed and allows nodes that experience data loss to fetch missing packets from immediate neighbors by requesting retransmission. The problem with PSFQ is that the authors assume packet loss stems from poor-quality wireless links, while traffic congestion is not considered, making this an unrealistic assumption for WSNs.
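The "fetch quickly" half of PSFQ can be illustrated with a gap-detection sketch: a receiver notices missing sequence numbers and would request them from an immediate neighbor before forwarding further. This is a hedged illustration of the idea only; the class, method names, and single-receiver setup are assumptions, not the PSFQ specification.

```python
# Sketch of PSFQ-style loss detection: when a packet arrives out of
# order, the receiver computes the gap in sequence numbers and would
# issue a NACK-style fetch request to a neighbor for the missing packets.

class PSFQReceiver:
    def __init__(self):
        self.received = {}   # seq -> payload
        self.expected = 0    # next sequence number we expect

    def on_packet(self, seq: int, payload: str) -> list:
        """Store the packet and return the sequence numbers that are
        missing and should be fetched from an immediate neighbor."""
        self.received[seq] = payload
        missing = [s for s in range(self.expected, seq)
                   if s not in self.received]
        self.expected = max(self.expected, seq + 1)
        return missing

rx = PSFQReceiver()
rx.on_packet(0, "a")
gaps = rx.on_packet(3, "d")   # packets 1 and 2 were lost in transit
print(gaps)  # [1, 2]
```

Because recovery is local (neighbor-to-neighbor) rather than end-to-end from the source, lost packets are repaired quickly, while the slow injection rate of the "pump" keeps the network from being flooded in the first place.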