i-manager's Journal on Software Engineering (JSE)


Volume 1 Issue 2 October - December 2006

Article

Performance Evaluation of Exponential Congestion Control Algorithm

Raja Ram M*, Ramyalakshmi D**, Venkateshkumar D***
*Asst Prof, EEE, Thanthai Periyar Government Institute, Vellore.
**,*** Dept of CSE, Dhanalakshmi Srinivasan Engg College, Perambalur.
Raja Ram M, Ramyalakshmi D and Venkateshkumar D (2006). Performance Evaluation of Exponential Congestion Control Algorithm. i-manager’s Journal on Software Engineering, 1(2), 14-21. https://doi.org/10.26634/jse.1.2.762

Abstract

The TCP protocol is used by the majority of network applications on the Internet. TCP performance is strongly influenced by its congestion control algorithms, which limit the amount of transmitted traffic based on the estimated network capacity and utilization. Because the freely available Linux operating system has gained popularity, especially in network servers, its TCP implementation affects many of the network interactions carried out today. This study introduces and analyses a class of non-linear congestion control algorithms called Exponential congestion control algorithms. These algorithms provide additive increase using an exponential of the inverse of the current window size and multiplicative decrease using an exponential of the current window size. They are further parameterized by α and β. The simulation results are compared with those of TCP variants such as TCP, TCP/Reno, TCP/Sack1, TCP/Fack, TCP/Vegas and TCP-EXPO. The comparison shows that the Exponential congestion control algorithm performs better in terms of throughput.
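
The abstract describes the window dynamics only qualitatively, so the following is a minimal sketch of one plausible reading: per acknowledged window the increase step is an exponential of the inverse window, and on loss the decrease factor involves an exponential of the current window. The exact α/β rules are defined in the paper, not here; the formulas below are labelled assumptions.

```python
import math

def on_window_acked(cwnd: float, alpha: float = 1.0) -> float:
    # Additive increase: step size is an exponential of the inverse of the
    # current window (assumed form: alpha * exp(1/cwnd)).
    return cwnd + alpha * math.exp(1.0 / cwnd)

def on_loss(cwnd: float, beta: float = 0.125) -> float:
    # Multiplicative decrease: the decrease factor involves an exponential of
    # the current window (assumed form: cwnd * (1 - beta * (1 - exp(-cwnd)))).
    # For large windows this behaves like a conventional multiplicative
    # decrease by beta; for small windows the cut is gentler.
    new_cwnd = cwnd * (1.0 - beta * (1.0 - math.exp(-cwnd)))
    return max(1.0, new_cwnd)
```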

Article

A Measure of Software Consistency, Growth and Total Quality Management: A Systems Dynamics Approach

Kumar Saurabh*
*Sr. Executive, Sathyam Computer Services Ltd, Hyderabad.
Kumar Saurabh (2006). A Measure of Software Consistency, Growth and Total Quality Management: A Systems Dynamics Approach. i-manager’s Journal on Software Engineering, 1(2), 22-26. https://doi.org/10.26634/jse.1.2.766

Abstract

In recent years, many computer system failures have been caused by software faults introduced during the software development process. This is an inevitable problem, since a software system installed in a computer is an intellectual product consisting of documents and source programs developed by human activities. Total Quality Management (TQM) is therefore considered one of the key technologies for producing higher quality software products. Generally, a software failure caused by faults latent in the system cannot occur except on certain special occasions, when a particular set of data is put into the system under a particular condition, i.e. when the program path including the software faults is executed. The quality characteristic of software consistency is that computer systems can continue to operate regularly without the occurrence of software failures. In this paper, we discuss a quantitative technique for software quality/consistency measurement and assessment, as one of the key software consistency technologies: a so-called software consistency model and its applications.

Article

An Intelligent Model of Syntactic Parser for a Natural Language Understanding Interface

T. Jenitha Jeba Sheeli*, Joseph Raj V**
*Research Scholar, Mother Teresa Women's University, Kodaikanal
**Head, Dept of Computer Science, Kamaraj College, Thoothukudi
T. Jenitha Jeba Sheeli and Joseph Raj V (2006). An Intelligent Model of Syntactic Parser for a Natural Language Understanding Interface. i-manager’s Journal on Software Engineering, 1(2), 27-33. https://doi.org/10.26634/jse.1.2.769

Abstract

A Natural Language Understanding Interface allows the user to interact with the computer in a flexible and friendly manner. This paper describes a neural network model of a syntactic parser for a Natural Language Understanding interface. This intelligent parser performs the parsing in two steps: parts-of-speech tagging is carried out by a Back Propagation Network that takes an English word as input and generates as output the associated parts-of-speech tags; parse structure generation is done by a similar intelligent network that uses the output of the parts-of-speech tagging network and grammar rules. The proposed Back Propagation Network uses a genetic algorithm for weight determination. Being a robust optimization technique, the Genetic Algorithm (GA) outperforms the conventional gradient-based training algorithm, giving fairly accurate solutions quickly and in fewer iterations.
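
As a rough illustration of the weight-determination idea, the sketch below evolves the flat weight vector of a tiny one-hidden-layer network with a plain genetic algorithm (truncation selection, one-point crossover, Gaussian mutation). The network size, fitness function and GA settings are illustrative assumptions; the paper's actual tagging network and GA configuration are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(weights, X, n_in, n_hid, n_out):
    """Tiny one-hidden-layer network; weights arrive as one flat vector."""
    w1 = weights[: n_in * n_hid].reshape(n_in, n_hid)
    w2 = weights[n_in * n_hid:].reshape(n_hid, n_out)
    return np.tanh(X @ w1) @ w2

def fitness(weights, X, y, n_in, n_hid, n_out):
    """Negative mean squared error: higher is better for the GA."""
    return -np.mean((forward(weights, X, n_in, n_hid, n_out) - y) ** 2)

def ga_train(X, y, n_in, n_hid, n_out, pop=40, gens=200, p_mut=0.1):
    dim = n_in * n_hid + n_hid * n_out
    popn = rng.normal(0, 1, (pop, dim))            # random initial population
    for _ in range(gens):
        scores = np.array([fitness(w, X, y, n_in, n_hid, n_out) for w in popn])
        parents = popn[np.argsort(scores)[::-1][: pop // 2]]   # keep top half
        children = []
        while len(children) < pop - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, dim)              # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            mask = rng.random(dim) < p_mut          # Gaussian mutation
            child[mask] += rng.normal(0, 0.3, mask.sum())
            children.append(child)
        popn = np.vstack([parents, children])
    scores = np.array([fitness(w, X, y, n_in, n_hid, n_out) for w in popn])
    return popn[np.argmax(scores)]                  # best weight vector found
```

In a tagging setting, X would hold encoded word features and y the corresponding part-of-speech tag vectors; those encodings are, again, an assumption for illustration.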

Article

Radio Frequency Identification Device (RFID) Applications

A. Rathika*, Hema P**
*,** Lecturer, Vivekanandha College of Engg for Women, Tiruchengoda, Tamilnadu, India
A. Rathika and Hema P (2006). Radio Frequency Identification Device (RFID) Applications. i-manager’s Journal on Software Engineering, 1(2), 34-39. https://doi.org/10.26634/jse.1.2.772

Abstract

Deploying new applications and technologies in an enterprise is always a huge task, from technology officers down to administrators: they have to balance productivity with costs and integrate the new with the old. RFID allows businesses to operate more efficiently, helps supply chain functions, enables in-time inventory control, allows valuable information to be used to boost revenue and cut costs, and makes way for better customer service. RFID also provides real-time status and visibility, resulting in reduced inventories, improved service levels, lessened loss and waste, and better safety and security. It is a non-line-of-sight identification technology which can be made to work in a non-intervention mode for faster data capture over long distances. This technology is revolutionizing the process of automatic identification of objects, and enterprises enjoy real-time supply chain visibility.

Research Paper

Reliability Analysis of Component based Software System using Usage Ratio and Severity – A case study

K. Vijayalakshmi*, Ramaraj N**
*Lecturer, Dept of Computer Science and Engg, Thiagarajar College of Engg, Madurai, India
**Principal, G.K.M. College of Engg & Tech, Chennai, India
K. Vijayalakshmi and Ramaraj N (2006). Reliability Analysis of Component based Software System using Usage Ratio and Severity – A case study. i-manager’s Journal on Software Engineering, 1(2), 40-45. https://doi.org/10.26634/jse.1.2.776

Abstract

Today, complex, high-quality computer-based software is required to be built in very short periods. This has motivated the use of off-the-shelf software components for rapid development. Component-based software engineering is a process that emphasizes the design and construction of computer-based systems using software components. The goal of component-based engineering is to improve productivity, quality and time to market in software development. Component-based software applications are expected to have high reliability as a result of deploying trusted components. In this paper, an approach is presented for assessing system reliability when the component reliabilities are known. It uses the probability of failure of each component and its usage ratio to find the system reliability. The reliability of a software component is a probability prediction for failure-free execution of the component based on its usage requirements. A component severity analysis is also carried out to help a software development organization obtain the best reliability improvements.
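
As a small worked example of the usage-ratio idea, the sketch below combines per-component failure probabilities, weighted by their usage ratios, into a single system reliability figure. This is one common way of combining the two quantities; the paper's exact model, and its severity weighting, are defined in the full text.

```python
def system_reliability(components):
    """
    components: list of (usage_ratio, failure_probability) pairs,
    where the usage ratios sum to 1.
    Assumed model: the system fails on a given execution with the
    usage-weighted failure probability of its components.
    """
    assert abs(sum(u for u, _ in components) - 1.0) < 1e-9
    p_fail = sum(u * p for u, p in components)
    return 1.0 - p_fail

# Example: three components with usage ratios 0.5, 0.3 and 0.2 and
# failure probabilities 0.01, 0.05 and 0.001 give reliability 0.9798.
print(system_reliability([(0.5, 0.01), (0.3, 0.05), (0.2, 0.001)]))
```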

Research Paper

Developing of a Protocol for Management of Misbehavior Nodes in the ADHOC Wireless Networks

Subramanyam M.V*, K. Satya Prasad**
*HOD of ECE Dept, R.G.M. College of Engg and Tech., Nandyal-518501, Kurnool Dist, A.P, India
**Principal and Prof of ECE, JNTU College of Engg, Kakinada, W. Godavari Dist, India.
Subramanyam M.V and K. Satya Prasad (2006). Developing of a Protocol for Management of Misbehavior Nodes in the ADHOC Wireless Networks. i-manager’s Journal on Software Engineering, 1(2), 46-51. https://doi.org/10.26634/jse.1.2.780

Abstract

The goal of an ad hoc wireless network is to enable communication between any two wirelessly connected nodes in the network. Using intermediate nodes as forwarding agents enables communication between nodes that are beyond direct communication range. Ad hoc wireless networks are, however, more prone to security threats, and one problem that cannot be predicted is misbehaving nodes. Ideally, every node in the network cooperates in forwarding. A node that has a strong motivation to deny packet forwarding to others, while at the same time using their services to deliver its own data, creates complications and problems in the network; such nodes are sometimes called selfish nodes. In this paper we propose a Path Management Protocol on Ad hoc Wireless Networks (PMP-ANT), designed based on the MARI topology, to cope with misbehaving nodes. In this approach the PMP-ANT protocol detects misbehaving nodes and isolates them from the network, so that misbehavior does not pay off but results in denied service and thus cannot continue. PMP-ANT detects misbehaving nodes by means of direct observation and second-hand information about several types of misbehavior, thus allowing nodes to route around misbehaving nodes and to isolate them from the network.

In the proposed approach each node has a monitor for observations, reputation records for first-hand information and trust records for second-hand information about the routing and forwarding behavior of other nodes, and a path manager. The trust records determine the trust value given to received second-hand information, and the path manager adapts the node's behavior according to reputation and takes action against misbehaving nodes.
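
The sketch below shows how such per-node state might be organised: direct observations update first-hand reputation, second-hand reports are weighted by the reporter's trust, and a path-manager step filters out next hops judged to be misbehaving. The field names, update rules and threshold are illustrative assumptions, not PMP-ANT's actual specification.

```python
from dataclasses import dataclass, field

@dataclass
class NodeState:
    """Illustrative per-node state inspired by the abstract; names and
    thresholds are assumptions, not the paper's."""
    reputation: dict = field(default_factory=dict)   # first-hand ratings per neighbour
    trust: dict = field(default_factory=dict)        # weight given to each reporter
    misbehave_threshold: float = -5.0

    def observe(self, neighbour, forwarded: bool):
        """Monitor: update first-hand reputation from direct observation."""
        delta = 1.0 if forwarded else -2.0
        self.reputation[neighbour] = self.reputation.get(neighbour, 0.0) + delta

    def second_hand(self, reporter, subject, rating: float):
        """Fold in a second-hand report, weighted by the reporter's trust value."""
        w = self.trust.get(reporter, 0.5)
        self.reputation[subject] = self.reputation.get(subject, 0.0) + w * rating

    def usable_next_hops(self, candidates):
        """Path manager: route around nodes whose reputation has dropped too low."""
        return [n for n in candidates
                if self.reputation.get(n, 0.0) > self.misbehave_threshold]
```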

Research Paper

Key Exchange Protocol Supporting Mobility and Multihoming

Mohammed A. Tawfiq*, Sufyan T. Faraj Al-Janabi**, Abdul-Karim A. R. Kadhim***
*,**,*** College of Nahrain University, Baghdad, Iraq
Mohammed A. Tawfiq, Sufyan T. Faraj Al-Janabi and Abdul-Karim A. R. Kadhim (2006). Key Exchange Protocol Supporting Mobility and Multihoming. i-manager’s Journal on Software Engineering, 1(2), 52-70. https://doi.org/10.26634/jse.1.2.824

Abstract

In this work, a new key exchange protocol for IP-based mobile networks is introduced. The protocol is called KEPSOM (Key Exchange Protocol Supporting Mobility and Multihoming). The goal of designing KEPSOM is to develop a key exchange protocol characterized by its secrecy, simplicity, efficiency, resistance to attacks, and its ability to support mobility and multihoming. The protocol requires only two roundtrips. The design limits the private information revealed by the initiator. An old security association (SA) can be replaced with a new one by rekeying, without restarting the protocol with a new session. Likewise, changes in IP address due to mobility or multihoming do not require restarting the protocol with a new SA session. The proposed protocol can also support key exchange in hybrid wireless networks, in which a mobile node can operate in both Ad Hoc and Base Station-oriented wireless network environments using different transmission modes. KEPSOM has been analyzed and proven secure. Several tests have been done to measure and evaluate the performance of the protocol. In these tests, the time required for rekeying is found to be about 27% of the total time required for exchanging the keys, and the time required to detect and update a change in IP address, which may occur due to mobility or multihoming, is less than 10% of the total time required to establish a new SA session.
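
Purely to illustrate why rekeying can be far cheaper than a fresh exchange, the toy sketch below runs a generic Diffie-Hellman style agreement and then derives a new SA key from the old one plus fresh nonces. This is not KEPSOM's actual message format, group, or key derivation (those are specified in the paper); the parameters are a deliberately insecure toy choice.

```python
import hashlib
import secrets

# Toy Diffie-Hellman parameters: illustrative only, not secure or standard.
P = 2**127 - 1
G = 3

def dh_keypair():
    priv = secrets.randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_secret(priv, peer_pub):
    return pow(peer_pub, priv, P)

def derive_sa_key(secret: int, nonce_i: bytes, nonce_r: bytes) -> bytes:
    """SA key = hash of the DH secret and both nonces (illustrative KDF)."""
    return hashlib.sha256(secret.to_bytes(16, "big") + nonce_i + nonce_r).digest()

def rekey(old_key: bytes, nonce_i: bytes, nonce_r: bytes) -> bytes:
    """Cheap rekey: derive a fresh SA key from the old key and fresh nonces,
    without rerunning the full exchange (mirroring the abstract's observation
    that rekeying costs only a fraction of a full key exchange)."""
    return hashlib.sha256(old_key + nonce_i + nonce_r).digest()

# Initiator and responder each generate a key pair and exchange public values
# (the full protocol also carries nonces, identities and authentication).
i_priv, i_pub = dh_keypair()
r_priv, r_pub = dh_keypair()
n_i, n_r = secrets.token_bytes(16), secrets.token_bytes(16)
k_init = derive_sa_key(shared_secret(i_priv, r_pub), n_i, n_r)
k_resp = derive_sa_key(shared_secret(r_priv, i_pub), n_i, n_r)
assert k_init == k_resp
k_new = rekey(k_init, secrets.token_bytes(16), secrets.token_bytes(16))
```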

Research Paper

Performance Enhancement to WCDMA Multimedia Network using MAC Protocol

S. Vasundara*, Venkatesh D**, Sathyanarayana***
*Assistant Professor, Dept of CSE, JNTU College of Engg, Anantapur, A.P
**Assistant Professor, Dept of CSE, Gates Institute of Technology, Gooty, Anantapur, A.P
***Associate Professor & Head, Dept of CSE, S.K. University, Anantapur, A.P
Vasundara S, Venkatesh D and Sathyanarayana A (2006). Performance Enhancement to WCDMA Multimedia Network using MAC Protocol. i-manager’s Journal on Software Engineering, 1(2), 71-78. https://doi.org/10.26634/jse.1.2.830

Abstract

A medium access control (MAC) protocol is developed for wireless multimedia networks based on frequency division duplex (FDD) and wideband code division multiple access (WCDMA). The protocol divides the communication channel into three distinct channels, namely a random access channel (RACH) for control packet transmission, a dedicated channel (DCH) for point-to-point data transmission and a broadcast control channel (BCCH) for system information transmission. In this protocol, a minimum-power allocation algorithm controls the received power levels of simultaneously transmitting users such that the heterogeneous bit error rates (BERs) of multimedia traffic are guaranteed. On top of the minimum-power allocation, a multimedia wideband CDMA generalized processor sharing (GPS) scheduling scheme is proposed. It provides fair queuing to multimedia traffic with different QoS constraints. It also takes into account the limited number of code channels for each user and the variable system capacity due to interference experienced by users in a CDMA network. The admission of real-time connections is determined by a new effective-bandwidth connection admission control (CAC) algorithm, in which the minimum-power allocation is also considered. Simulation results show that the new MAC protocol guarantees the QoS requirements of both real-time and non-real-time traffic in an FDD wideband CDMA network.
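
For orientation, the sketch below computes the classic single-cell minimum-power solution: the smallest received powers at which every user still meets its target signal-to-interference ratio (the targets being derived from the users' BER requirements). This is the textbook closed-form result; the paper's algorithm may differ in detail, for example in handling multi-code users or inter-cell interference.

```python
def min_received_powers(target_sirs, noise_power):
    """
    Minimum-power allocation for a single CDMA cell: find the smallest
    received powers S_i such that every user i satisfies
        S_i / (sum_{j != i} S_j + noise) >= gamma_i.
    Feasible only when sum_i gamma_i / (1 + gamma_i) < 1.
    """
    fractions = [g / (1.0 + g) for g in target_sirs]
    load = sum(fractions)
    if load >= 1.0:
        raise ValueError("SIR targets infeasible: total load >= 1")
    # Closed form: S_i = noise * f_i / (1 - sum_j f_j), with f_i = gamma_i/(1+gamma_i).
    return [noise_power * f / (1.0 - load) for f in fractions]

# Example: three users whose BER requirements translate to SIR targets 5, 3 and 1.
print(min_received_powers([5.0, 3.0, 1.0], noise_power=1e-12))
```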

Research Paper

Genetic Algorithm based Image Compression: An analysis of Cross over Operators

Jayanthi V.S*, Shamugam A**
*Assistant Professor, Electronics and Communication Engineering Department, Sri Ramakrishna Engineering College, Coimbatore, Tamil Nadu, India.
**Principal, Bannari Amman Institute of Technology, Sathyamangalam, Tamilnadu.
Jayanthi V.S and Shamugam A (2006). Genetic Algorithm based Image Compression: An analysis of Cross over Operators. i-manager’s Journal on Software Engineering, 1(2), 79-85. https://doi.org/10.26634/jse.1.2.835

Abstract

Vector quantization (VQ) plays an important role in data compression and has been successfully used in image compression. In VQ, minimization of the Mean Square Error (MSE) between codebook vectors and training vectors is a non-linear problem. Traditional LBG-type algorithms used for designing the codebooks of a vector quantizer converge to a local minimum, which depends on the initial codebook. Genetic algorithms (GAs) are a powerful set of global search techniques that have been shown to produce very good results on a wide class of problems. GAs are capable of exploring and exploiting promising regions of the search space. A genetic algorithm can be applied to generate a better codebook that approaches the globally optimal solution of vector quantization. In this paper we present a new approach to image compression based on a genetic algorithm for vector quantization. We also propose a composite crossover operator for generating a codebook. The effectiveness of the proposed crossover operator on the design of a codebook for genetic-algorithm-based vector quantization is analyzed. Simulations indicate that vector quantization based on a genetic algorithm performs better in designing the optimal codebook for a vector quantizer than the conventional LBG algorithm. The results also indicate that the performance of the codebook is substantially improved by using the proposed crossover operator. The Peak Signal to Noise Ratio (PSNR) is used as an objective measure of reconstructed image quality.
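
The snippet below shows the standard PSNR measure cited in the abstract, together with a plain one-point crossover over whole codewords as a stand-in for evolving codebooks. The composite crossover operator proposed in the paper is defined only in the full text, so the crossover shown here is just an assumed, simple baseline.

```python
import numpy as np

def psnr(original: np.ndarray, reconstructed: np.ndarray, peak: float = 255.0) -> float:
    """Peak Signal to Noise Ratio of a reconstructed image, in dB."""
    mse = np.mean((original.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

def crossover_codebooks(parent_a: np.ndarray, parent_b: np.ndarray,
                        rng=np.random.default_rng()):
    """One-point crossover over whole codewords (rows of a codebook matrix);
    a simple stand-in, not the paper's composite crossover operator."""
    cut = rng.integers(1, len(parent_a))
    return np.vstack([parent_a[:cut], parent_b[cut:]])
```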

Research Paper

Web Enabled Centralized Hub for Wireless Communication (Using MCF 5272 – 32 Bit Motorola Micro Controller)

Suresh D.S*, Udayashankara V**
*Channabasaveshwara Institute of Technology, Gubbi, Karnataka, India.
**S.J.C., Mysore, Karnataka, India.
Suresh D.S and Udayashankara V (2006). Web Enabled Centralized Hub for Wireless Communication (Using MCF 5272 – 32 Bit Motorola Micro Controller). i-manager’s Journal on Software Engineering, 1(2), 86-95. https://doi.org/10.26634/jse.1.2.838

Abstract

Intelligent electronic systems provide several facilities, such as capturing, storing and communicating a wide range of sensitive and personal data. Security is emerging as a critical concern that must be addressed in order to enable several current and future applications. This paper presents the design and development aspects of a wireless hub and fingerprint recognition system for authentication. Since a fingerprint pattern is unique, the information obtained from the system provides highly reliable security.