Survey Paper: Amplifying the Persistency of Wireless Sensor Network Using Implicit Backbone Scheduling

Amoolya P

Abstract

Wireless Sensor Networks (WSNs) are vital in applications that demand long lifetime, low-cost maintenance, and actuation. In such applications, batteries act as the sole energy source, so regulating energy consumption is critical. In most WSN applications, redundant sensor nodes are deployed to achieve fault tolerance and Quality of Service in sensing; under light traffic and low load, however, there is no need to keep all redundant nodes active. In this paper, we survey methods of reducing power consumption in wireless sensor networks and present a software design technique for sleep scheduling called Implicit Backbone Scheduling (IBS). IBS is designed for WSNs with redundant sensor nodes. It forms multiple overlapping backbones that work alternately to increase network persistency. In IBS, non-backbone nodes remain inactive with their power switched off, and traffic is forwarded by the backbone nodes. The backbone duty rotates among the nodes, ensuring that energy consumption is balanced and fully utilized across the network.
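The backbone-rotation idea described above can be sketched in a few lines. This is not the authors' IBS algorithm; the partitioning scheme, energy costs, and parameters are all made up. It only demonstrates the core idea: overlapping backbones take turns forwarding traffic while the remaining (redundant) nodes sleep, balancing energy drain and extending lifetime over an always-on network.

```python
# Illustrative sketch of backbone rotation for sleep scheduling.
# NOTE: not the authors' IBS algorithm -- partitioning and costs are
# hypothetical. Backbone k is the set of nodes i with i % backbones == k;
# backbones alternate round-robin while the other nodes sleep.

def simulate(num_nodes=12, backbones=3, initial_energy=100.0,
             awake_cost=1.0, sleep_cost=0.05):
    """Return the number of rounds until the first node dies."""
    energy = [initial_energy] * num_nodes
    rounds = 0
    while min(energy) > 0:
        active = rounds % backbones          # which backbone is on duty
        for i in range(num_nodes):
            cost = awake_cost if i % backbones == active else sleep_cost
            energy[i] -= cost
        rounds += 1
    return rounds

if __name__ == "__main__":
    # With backbones=1 every node is awake every round (no sleeping).
    print(simulate(), simulate(backbones=1))
```

With these made-up costs, rotating three backbones lets each node sleep two rounds out of three, so the first node death is deferred well beyond the always-on case.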


Processing and Analysis of Experimental Data for the Impact Residual Strain (εresidual) of the Steel Quality Using Design Expert Software

Malush Mjaku

Abstract

The object of this study is steel of quality J55 API 5CT and the forming process of pipes Ø139.7×7.72 [mm], Ø244.5×8.94 [mm], and Ø323.9×7.10 [mm] with longitudinal seams (ERW pipes). The aim of this paper is to study the impact of the degree of cold plastic deformation on the residual strain in the cross-sectional area of J55 API 5CT steel pipes [1]. For the realization of this study, we have used the method of one-factor experiment planning. We have built a mathematical model for an experiment with one index (residual strain), one factor (degree of cold deformation), and three deformation levels. The experimentally obtained results are tabulated and processed analytically by implementing the one-factor experiments [2].
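A one-factor model of the kind described above can be fitted by ordinary least squares. The sketch below uses hypothetical numbers standing in for residual strain measured at three cold-deformation levels; the real data and model are in the paper, not here.

```python
# Minimal sketch of fitting a one-factor model y = b0 + b1*x by
# ordinary least squares. The data points are hypothetical -- they
# stand in for residual strain measured at three deformation levels.

def least_squares_line(xs, ys):
    """Return (intercept, slope) of the best-fit line."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sxx = sum((x - mean_x) ** 2 for x in xs)
    slope = sxy / sxx
    return mean_y - slope * mean_x, slope

# Hypothetical deformation degrees (%) and residual strains:
deformation = [5.0, 10.0, 15.0]
strain = [0.12, 0.21, 0.33]
b0, b1 = least_squares_line(deformation, strain)
```

Tools such as Design Expert automate this fit and add diagnostics (ANOVA, lack-of-fit tests) on top of the same least-squares core.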


A Study of Web Mining and Knowledge Discovery

Anil Kumar Sinha, Nidhi Raj, Ritesh Kumar and N. K. Singh

Abstract

Web mining refers to the overall process of discovering potentially useful and previously unknown information or knowledge from Web data. With the large amount of information available online, the Web is a fertile area for data mining and knowledge discovery. Data mining has become a key tool for detecting fraud, assessing risk, and product retailing. In Web mining, data can be collected at the server side, the client side, or proxy servers, or obtained from an organization's consolidated web data. Web mining can be categorized into three areas: Web Content Mining, Web Structure Mining, and Web Usage Mining. Web mining is the application of data mining techniques to extract knowledge from web data, where at least one of structure (hyperlink) or usage (web log) data is used in the mining process, with or without other types of web data.
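Of the three areas above, Web Usage Mining is the easiest to illustrate: it starts from server logs. The sketch below counts page requests in a few made-up log lines laid out in the common Apache access-log format; a real pipeline would go on to mine sessions and navigation patterns from such counts.

```python
# Tiny web-usage-mining sketch: count page requests in a server log.
# The log lines are made up, but follow the usual Apache access-log
# layout (client, timestamp, request line, status, size).

import re
from collections import Counter

LOG = """\
10.0.0.1 - - [01/Jan/2024:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512
10.0.0.2 - - [01/Jan/2024:10:00:05 +0000] "GET /products.html HTTP/1.1" 200 1024
10.0.0.1 - - [01/Jan/2024:10:00:09 +0000] "GET /products.html HTTP/1.1" 200 1024
"""

request_re = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

def page_counts(log_text):
    """Return a Counter of requested URLs -- a basic usage statistic."""
    return Counter(m.group(1) for line in log_text.splitlines()
                   if (m := request_re.search(line)))
```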


Face Authentication Based on SURF Features

Anindita Sinha, Mainak Nath, Nisha Sarkar, Sumit Bajaj, Prof. Samir K. Bandyopadhyay

Abstract

Face matching is an important application of image processing with wide usage in security management. It can be used to compare a given face with another face and decide whether the two belong to the same person. This can help identify people from a database of faces, or be used on a live video stream to identify a particular person. Given several images of an object, feature detection and matching algorithms try to repeatedly detect the same points of interest in every image, regardless of the scale and orientation of the object, and to match each point of interest in one image with the corresponding point in another image. This paper presents a face matching algorithm based on image intensity.
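The matching stage described above can be sketched independently of the detector. The code below pairs descriptor vectors from two images by nearest neighbour with Lowe's ratio test; the 4-D vectors are made up and stand in for the 64-D descriptors SURF would produce, so this illustrates the matching principle, not the paper's algorithm.

```python
# Sketch of the matching stage only: pair each descriptor from image 1
# with its nearest neighbour in image 2, keeping the match only if it
# passes the ratio test (nearest clearly beats second-nearest).
# The 4-D vectors below are hypothetical stand-ins for SURF descriptors.

import math

def dist(a, b):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match(desc1, desc2, ratio=0.8):
    """Return (i, j) index pairs of accepted matches."""
    pairs = []
    for i, d1 in enumerate(desc1):
        ds = sorted((dist(d1, d2), j) for j, d2 in enumerate(desc2))
        best, second = ds[0], ds[1]
        if best[0] < ratio * second[0]:   # distinctive nearest neighbour
            pairs.append((i, best[1]))
    return pairs

img1 = [(0.1, 0.9, 0.2, 0.4), (0.8, 0.1, 0.7, 0.3)]
img2 = [(0.82, 0.12, 0.69, 0.31), (0.1, 0.88, 0.22, 0.41), (0.5, 0.5, 0.5, 0.5)]
```

The ratio test discards ambiguous matches, which is what makes nearest-neighbour matching usable on repetitive image content.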


EEG Signal for Epilepsy Detection: A Review

Madhurima Banerjee, Ranjita Chowdhury, and Samir Kumar Bandyopadhyay

Abstract

EEG (electroencephalography) is used for capturing the impulses flowing through the brain. The signals are recorded to check for any abnormalities in the working of the brain; normal brain signals differ considerably from abnormal ones. The recorded impulses can be contaminated with noise, which must be filtered out to recover the actual brain signal. The cleansed signal so obtained can then be checked for various brain disorders, epilepsy being one of them. This paper gives an overview of how EEG works and of the filtering process, and reviews a few algorithms used to detect epilepsy.
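The filtering step mentioned above can be illustrated with the simplest possible filter. Real EEG pipelines use band-pass and notch filters, but the principle, smoothing out high-frequency noise before analysing the signal, is the same; the signal below is synthetic.

```python
# Minimal noise-filtering sketch: a moving-average (boxcar) low-pass
# filter applied to a synthetic noisy sine wave standing in for an
# EEG trace. Not a clinical filter -- an illustration of denoising.

import math
import random

def moving_average(signal, window=5):
    """Smooth `signal` by averaging each sample with its neighbours."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

random.seed(0)
clean = [math.sin(2 * math.pi * i / 50) for i in range(200)]
noisy = [s + random.gauss(0, 0.3) for s in clean]
smoothed = moving_average(noisy)
```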


Extraction of Wound Measurement Process for Healing the Wound Using Color Images in Digital Analysis

Dr. K Sundeep Kumar, M Mallesha, Dr. P.Hariprabakaran, R.V. Prakash

Abstract

The wide use of digital cameras and freehand wound imaging is now common in clinical settings. However, there is still great demand for precise, user-friendly techniques to carry out wound measurements and assess the dimensions, healing, and tissue classification of a wound. Accurate measurement is needed to deal with different wound images efficiently and effectively. A quantification process is implemented to assess wound healing by considering tissue zones and complete area measurements. This paper explains the basic research goals in detail: assessing metrics for wound images through the actual wound area, the measurement of wound volume, and the outline of the wound, to retrieve accurate results using colour images in digital analysis. Although the task is challenging, an innovative tool is needed to obtain accurate results of wound assessment and measurement.
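The area-measurement step described above reduces, once segmentation is done, to counting wound pixels and converting with a known pixel size. In the sketch below the binary segmentation mask and the pixel size are made up; real pipelines would derive both from the calibrated colour image.

```python
# Sketch of the area-measurement step: count pixels flagged as wound
# tissue in a (made-up) binary mask and convert to physical units via
# a known pixel size. Real pipelines segment colour images first; the
# segmentation result here is hard-coded for illustration.

mask = [
    [0, 0, 1, 1, 0],
    [0, 1, 1, 1, 0],
    [0, 1, 1, 1, 1],
    [0, 0, 1, 1, 0],
]

def wound_area(mask, mm_per_pixel=0.5):
    """Area in mm^2 = wound pixel count * pixel area."""
    pixels = sum(sum(row) for row in mask)
    return pixels * mm_per_pixel ** 2

area = wound_area(mask)
```

Tracking this number across visits is what turns a photograph into a quantitative healing measure.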


Investigation of Challenges in Existing Debugging Mechanisms for Conventional Software

Manas Kumar Yogi, Mr. G. Vijay Kumar

Abstract

Debugging is a methodical process of finding and reducing the number of bugs, or defects, in software or hardware, thus making it behave as expected. Debugging becomes harder when subsystems are tightly coupled, as changes in one may cause bugs to emerge in another. It is also difficult for reasons such as time constraints and other organizational or environmental constraints imposed on the debugger. In this paper, we explore the basic debugging techniques, covering both manual and automated tools. We examine the current hindrances faced by software debuggers and propose a few ideas to reduce the effect of the challenges posed by these constraints.
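One representative automated technique from the debugging literature is delta debugging, which shrinks a failing input while the failure persists, leaving a small reproducer. The sketch below is a simplified version; the `fails` oracle is a toy stand-in (a hypothetical bug triggered by any "b" in the input) for actually running the program under test.

```python
# Sketch of delta debugging (simplified ddmin): greedily remove chunks
# of a failing input while it still fails, halving the chunk size each
# pass. `fails` is a toy oracle standing in for the real program.

def fails(chars):
    return "b" in chars  # hypothetical bug: any "b" triggers it

def shrink(data, fails):
    """Return a smaller input that still makes `fails` true."""
    chunk = len(data) // 2
    while chunk >= 1:
        i = 0
        while i < len(data):
            candidate = data[:i] + data[i + chunk:]
            if fails(candidate):
                data = candidate       # chunk was irrelevant; drop it
            else:
                i += chunk             # chunk is needed; keep it
        chunk //= 2
    return data

minimal = shrink(list("aabxbca"), fails)
```

Shrinking failing inputs this way narrows the search space before any manual debugging starts.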