i-manager's Journal on Software Engineering (JSE)


Volume 2 Issue 2 October - December 2007

Article

Comparison between Traditional and Component Based Design Framework for Web Applications – A Qualitative Approach

R. Thirumalai Selvi*, Adham Sheriff A**, Balasubramanian N.V***, George T. Manohar****
* Sr. Lecturer, Department of Computer Applications, Velammal Engineering College, Chennai, India.
** Lecturer, Department of Computer Applications, Velammal Engineering College, Chennai, India.
*** Professor, Department of Computer Science & Engineering, RMK Engineering College, Chennai, India.
**** Retired Professor, Department of Electrical Engineering, IIT Madras, Chennai, India.
Thirumalai Selvi R, Adham Sheriff A, Balasubramanian N.V and George T. Manohar (2007). Comparison between Traditional and Component Based Design Framework for Web Applications – A Qualitative Approach. i-manager’s Journal on Software Engineering, 2(2), 1-5. https://doi.org/10.26634/jse.2.2.514

Abstract

A framework is a set of common, prefabricated software building blocks that programmers can use, extend, or customize for specific computing solutions. Frameworks are built from collections of objects, so both the design and the code of the framework may be reused. A framework does several things: it makes it easier to work with complex technologies, and it ties together a set of discrete objects/components into something more useful. A number of frameworks are available in the Open Source community. In this paper, we compare two such frameworks, ASP and TYPO3; one is traditional and the other is component-based. The paper presents the findings of a small study of projects undertaken by students in the web publishing area. Finally, we conclude which framework is the better choice for developing web applications.

Article

Network Security Using Flow Based Intrusion Detection System

Jeya S*, Ramar K**
* Associate Professor, K.S.R. College of Engineering, Tiruchengode, Namakkal, Tamil Nadu, India.
** Professor & Head, Department of CSE, National Engineering College, Kovilpatti, Tamil Nadu, India.
Jeya S and Ramar K (2007). Network Security Using Flow Based Intrusion Detection System. i-manager’s Journal on Software Engineering, 2(2), 6-13. https://doi.org/10.26634/jse.2.2.536

Abstract

A flow-based intrusion detection system is one kind of network security system. As the Internet becomes the platform for daily activities, the threat of network attacks also becomes more serious. A firewall alone is not able to protect the system from being attacked through normal service channels. Furthermore, most current intrusion detection systems focus on the border of the organization's network; if an attack comes from inside, this setup does not protect hosts in the local network or the network itself. Therefore, other mechanisms are needed to protect the critical systems as well as the network itself. We propose an inexpensive and easy-to-implement way to perform anomaly-based intrusion detection using the NetFlow data exported from routers or other network probes. Our system can detect several types of network attack from inside or outside and perform countermeasures accordingly. In addition, guidelines to properly configure and set up network devices, to minimize the possibility that network attacks come from inside, are also proposed.
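
To make the flow-level idea concrete, the following Python sketch flags source hosts whose per-window flow counts deviate sharply from a learned baseline; the record layout, the per-host counting, and the 3-sigma threshold are illustrative assumptions, not the authors' implementation.

# Hypothetical sketch: flag hosts whose per-window NetFlow record counts
# deviate from a baseline window. Field layout and threshold are assumptions.
from collections import Counter
from statistics import mean, stdev

def count_per_source(flows):
    """flows: iterable of (src_ip, dst_ip, dst_port) records."""
    return Counter(src for src, _, _ in flows)

def detect_anomalies(baseline, current, sigma=3.0):
    mu, sd = mean(baseline.values()), stdev(baseline.values())
    # A host is suspicious if its count exceeds mean + sigma * stdev.
    return [host for host, n in current.items() if n > mu + sigma * sd]

baseline = count_per_source([(f"10.0.0.{i}", "10.0.0.1", 80)
                             for i in range(2, 22) for _ in range(8 + i % 5)])
current = baseline.copy()
current["10.0.0.99"] = 400            # burst typical of a scan or flood
print(detect_anomalies(baseline, current))   # ['10.0.0.99']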

Article

Framework for Effective ANN and SVM Based Learning for Fast Multimedia Content-Based Retrieval

Ankush Mittal*
*Department of Electronics and Computer Engineering, Indian Institute of Technology, Roorkee, India.
Ankush Mittal (2007). Framework for Effective ANN and SVM Based Learning for Fast Multimedia Content-Based Retrieval. i-manager’s Journal on Software Engineering, 2(2), 14-21. https://doi.org/10.26634/jse.2.2.541

Abstract

Recently, strategies involving learning a supervised model are emerging in the field of multimedia content-based retrieval. When there are clearly identified categories, as well as large, domain-representative training data, learning can be effectively employed to construct a model of the domain.

In this paper, a largely domain-independent approach is presented in which local features characterize multimedia data using Artificial Neural Networks (ANN) and Support Vector Machines (SVM). Classification in content-based retrieval requires a non-linear mapping of the feature space, which can normally be accomplished by ANN and SVM. However, they inherently lack the capability to deal with meaningful feature evaluation and high-dimensional feature spaces, in the sense that they become inaccurate and slow. We overcome these defects by employing discrete Bayesian-error-based meaningful feature selection. Experiments on a database consisting of real video sequences show that the speed and accuracy of SVM can be improved substantially using this technique, while execution time can be substantially reduced for ANN. The comparison also shows that the improved SVM turns out to be a better choice than ANN. Finally, it is shown that generalization in learning is not affected by reducing the dimension of the feature space with the proposed method.
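
As a rough illustration of the kind of Bayes-error-driven feature ranking the abstract describes, the sketch below estimates a per-feature classification error from class-conditional histograms and keeps the k lowest-error features; the binning, the equal-prior assumption, and the interface are ours, not the paper's.

# Hypothetical sketch: rank features by an estimated Bayes error computed
# from class-conditional histograms (equal class priors assumed).
import numpy as np

def bayes_error_per_feature(X, y, bins=20):
    errors = []
    for j in range(X.shape[1]):
        edges = np.histogram_bin_edges(X[:, j], bins=bins)
        h0, _ = np.histogram(X[y == 0, j], bins=edges)
        h1, _ = np.histogram(X[y == 1, j], bins=edges)
        p0 = h0 / max(h0.sum(), 1)      # class-conditional distributions
        p1 = h1 / max(h1.sum(), 1)
        # Overlap between the two distributions approximates the Bayes error.
        errors.append(0.5 * np.minimum(p0, p1).sum())
    return np.array(errors)

def select_features(X, y, k):
    # Keep the k features with the lowest estimated error (most discriminative).
    return np.argsort(bayes_error_per_feature(X, y))[:k]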

Article

A Novel Remote Login Authentication with Smart Card

Shoba Bindu C*, Chandra Sekhar Reddy P**, Satya Narayana B***
* Department of Computer Science & Engineering, JNT University College of Engineering, Anantapur, A.P., India.
** Professor & Coordinator, JNT University College of Engineering, Hyderabad, A.P., India.
*** Department of Computer Science & Applications, S.K. University, Anantapur, A.P., India.
Shoba Bindu C, Chandra Sekhar Reddy P and Satya Narayana B (2007). A Novel Remote Login Authentication with Smart Card. i-manager’s Journal on Software Engineering, 2(2), 22-27. https://doi.org/10.26634/jse.2.2.545

Abstract

In 2000, Sun proposed an efficient remote login authentication scheme based on a one-way hash function. In 2002, Chien et al. pointed out a deficiency of Sun's scheme, which realized only unilateral authentication, and put forward an efficient and practical remote mutual authentication scheme. Recently, however, Hsu showed that this scheme was not secure enough, since it was vulnerable to the parallel session attack. In 2005, Liu et al. proposed an enhancement to Chien et al.'s scheme. However, their scheme has many security flaws. This paper points out the weaknesses of Liu et al.'s scheme and presents an enhancement that overcomes them.
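
All the schemes discussed build on one-way hash functions and challenge-response exchanges. As a generic illustration only (not Sun's, Chien et al.'s, or Liu et al.'s actual protocol), the following sketch shows a hash-based mutual authentication round; the message layout and the shared-key derivation are assumptions.

# Generic hash-based mutual challenge-response (illustrative, not the
# paper's scheme). 'shared_key' stands in for the smart card's stored secret.
import hashlib, os

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

shared_key = os.urandom(32)                  # secret shared by card and server
Ns, Nc = os.urandom(16), os.urandom(16)      # fresh nonces from each side

card_proof = h(shared_key, Ns, Nc)           # card proves knowledge of the key
assert card_proof == h(shared_key, Ns, Nc)   # server verifies the card

server_proof = h(shared_key, Nc, Ns)         # server proves itself in turn
assert server_proof == h(shared_key, Nc, Ns) # card verifies the server

Binding both fresh nonces into each proof, in a different order per direction, is what lets such schemes resist the replay and parallel-session attacks mentioned above.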

Article

Generation of Binary Random Fields for Image Segmentation and Classification Based on Neighborhood Spanning Tree

Naga Raju C*, Vijaya Kumar V**
* Professor & Head of CSE, VRS & YRN College of Engineering & Technology, Chirala.
** Professor & Head, Department of CSE, RGM College of Engineering & Technology, Nandyal.
Naga Raju C and Vijaya Kumar V (2007). Generation of Binary Random Fields for Image Segmentation and Classification Based on Neighborhood Spanning Tree. i-manager’s Journal on Software Engineering, 2(2), 28-31. https://doi.org/10.26634/jse.2.2.548

Abstract

In the image processing literature, texture is usually defined in terms of the spatial interactions between pixel values. The aim of texture analysis is to capture the visual characteristics of texture in an analytical form by mathematically modeling these spatial interactions. This allows segmentation of an image into its various textural components, with each component being classified according to how well it fits the mathematical model of a particular texture. This approach requires a number of training data sets, which are used to formalize the criteria by which the texture models become distinct from each other, but not necessarily distinct from textures not included in the training set. If a texture is to be recognized in a scene containing previously unseen textures, then a new approach is required: the texture models need to capture more than just the characteristics required to distinguish one texture from other known textures; they need to capture all the unique characteristics of that texture. This paper describes a new method for image segmentation that generates binary random values in the image based on a neighborhood spanning tree. This method has produced better results than conventional region-based segmentation methods for complex multi-resolution images.

Article

An Overview of Ontology-based Semantic Similarity Measures

Selvi P*, Gopalan N.P**
* Ph.D Research Scholar, Department of Computer Science and Engineering, NIT, Tiruchirappalli, India.
** Professor, Department of Computer Applications, National Institute of Technology, Tiruchirappalli, India.
Selvi P and Gopalan N.P (2007). An Overview of Ontology-based Semantic Similarity Measures. i-manager’s Journal on Software Engineering, 2(2), 32-37. https://doi.org/10.26634/jse.2.2.550

Abstract

Ontologies are widely used and play important roles in applications related to knowledge management, knowledge engineering, natural language processing, information retrieval, etc. Different semantic measures have been proposed in the literature to evaluate the strength of the semantic link between two concepts, or two groups of concepts, from either two different ontologies (ontology alignment) or the same ontology. This article presents an off-context study of different semantic measures based on an ontology restricted to subsumption links. We first present some common principles and then propose a comparative study based on a set of semantic and theoretical criteria.
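
As a concrete instance of this family of measures, the sketch below computes the classic Wu-Palmer similarity over a small is-a (subsumption) taxonomy; the toy taxonomy is invented, and the measure is one of several the article surveys rather than a method proposed by it.

# Wu-Palmer similarity over a subsumption hierarchy:
# sim(c1, c2) = 2 * depth(LCS) / (depth(c1) + depth(c2)).
parent = {"cat": "mammal", "dog": "mammal", "mammal": "animal",
          "trout": "fish", "fish": "animal", "animal": None}

def ancestors(c):
    path = []
    while c is not None:
        path.append(c)
        c = parent[c]
    return path                          # concept, its parents, up to the root

def wu_palmer(c1, c2):
    a1, a2 = ancestors(c1), ancestors(c2)
    lcs = next(a for a in a1 if a in a2)     # least common subsumer
    depth = lambda c: len(ancestors(c))      # root has depth 1
    return 2 * depth(lcs) / (depth(c1) + depth(c2))

print(wu_palmer("cat", "dog"))    # ~0.67: LCS is 'mammal'
print(wu_palmer("cat", "trout"))  # ~0.33: LCS is only 'animal'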

Article

A Genetic Algorithmic Approach for Generating Test Cases

Krishnakumari K*, Rajganesh N**
*,** Lecturer, Computer Science and Engineering, A.V.C College of Engineering.
Krishnakumari K and Rajganesh N (2007). A Genetic Algorithmic Approach for Generating Test Cases. i-manager’s Journal on Software Engineering, 2(2), 38-48. https://doi.org/10.26634/jse.2.2.562

Abstract

Testing in diverse software development paradigms is an ongoing problem in software engineering. Many techniques have been devised over the past decades to help software engineers create useful testing suites. Here, the focus is on test case generation for object-oriented software using genetic programming. The automatic creation of test data is still an open problem in object-oriented software testing, and many new techniques are being researched. For object-oriented software, an automatic test data generation technique is not sufficient on its own: besides the input data used for testing, it additionally has to produce the right sequences of method calls, and the right artifacts, to bring the object under test into the required state for testing. Genetic algorithms have already been used to tackle typical testing problems with success, but the use of genetic programming for automatic test case generation is relatively new and promising. This paper shows how genetic algorithms combined with different types of software analysis can create new unit tests with a high degree of program coverage. Together with static analysis, the genetic algorithm is able to generate tests for more real-world programs in a shorter amount of time. This new approach is implemented in this design.
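
To give a flavor of coverage-driven test generation with a genetic algorithm, the sketch below evolves sets of test inputs toward full branch coverage of a toy function; the unit under test, the encoding, and the GA parameters are invented for illustration and are much simpler than what the paper targets (method-call sequences and object state).

# Minimal coverage-driven GA for test input generation (illustrative only).
import random

def target(x, y):                  # toy unit under test with three branches
    if x > 100:
        return 1
    elif y % 7 == 0:
        return 2
    return 3

def covered(tests):
    return {target(x, y) for x, y in tests}

def fitness(individual):           # individual: a list of (x, y) test inputs
    return len(covered(individual))

def mutate(individual):
    child = list(individual)
    child[random.randrange(len(child))] = (random.randint(-200, 200),
                                           random.randint(-200, 200))
    return child

population = [[(0, 1)] * 3 for _ in range(30)]
for _ in range(200):               # evolve until all three branches covered
    population.sort(key=fitness, reverse=True)
    elite = population[:10]
    population = elite + [mutate(random.choice(elite)) for _ in range(20)]
    if fitness(population[0]) == 3:
        break
print(covered(population[0]))      # ideally {1, 2, 3}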

Research Paper

Fuzzy Approach for 802.11 Wireless Intrusion Detection

Raouf Ketata*, Hatem Bellaaj**
*,** Research Unit on Intelligent Control, Design and Optimization of Complex Systems (ICOS), National School of Engineers of Sfax; National Institute of Applied Science and Technologies of Tunis, Tunisia.
Raouf Ketata and Hatem Bellaaj (2007). Fuzzy Approach for 802.11 Wireless Intrusion Detection. i-manager’s Journal on Software Engineering, 2(2), 49-55. https://doi.org/10.26634/jse.2.2.567

Abstract

This paper proposes a new fuzzy logic approach to perform intrusion analysis and detection in 802.11 wireless networks. The algorithm consists of five steps. First, we construct our network and generate many cases of daily traffic and intrusions; at the same time, we capture different values of system and network parameters and associate with them a potential alarm severity degree. Second, we generate fuzzy rules from the numerical data using the Wang-Mendel method. Third, we implement the new rule base on each computer and start the system, adjusting it to capture the parameters cyclically and compute the alarm severity; if an intrusion is detected, a message is sent to every network node. Fourth, in case of a non-responding system or an error, a learning mechanism is started by injecting numerical values and generating fuzzy rules.
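
The rule-generation step names the Wang-Mendel method, which builds one rule from every numerical sample by picking the fuzzy set with maximal membership for each variable and resolves conflicting rules by keeping the highest-degree one. The sketch below shows that step; the triangular membership functions, the normalized universe, and the toy samples are our assumptions.

# Wang-Mendel rule generation from numerical data (illustrative sketch).
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Three labeled fuzzy sets per variable on a normalized [0, 1] universe.
SETS = {"low": (-0.5, 0.0, 0.5), "med": (0.0, 0.5, 1.0), "high": (0.5, 1.0, 1.5)}

def best_set(x):
    label = max(SETS, key=lambda s: tri(x, *SETS[s]))
    return label, tri(x, *SETS[label])

def wang_mendel(samples):
    """samples: list of ((x1, x2), y), all values scaled to [0, 1]."""
    rules = {}
    for (x1, x2), y in samples:
        (l1, m1), (l2, m2), (ly, my) = best_set(x1), best_set(x2), best_set(y)
        degree = m1 * m2 * my          # rule degree = product of memberships
        if degree > rules.get((l1, l2), (None, 0.0))[1]:
            rules[(l1, l2)] = (ly, degree)   # keep the strongest conflicting rule
    return rules

# e.g. (traffic rate, failed-auth rate) -> alarm severity, as toy data
print(wang_mendel([((0.9, 0.8), 0.95), ((0.1, 0.2), 0.05), ((0.9, 0.9), 0.70)]))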

Research Paper

Multi-Population Genetic Algorithms for Tuning Fuzzy Logic Controller

Munther N. Al-Tikriti*, Rokaia Sh. Al-Joubori**
* Professor of Control Engineering, Faculty of Engineering, Philadelphia University.
** Assistant Lecturer in System Programming and Advanced Digital Systems, Department of Computer and Software Engineering, College of Engineering, Al-Mustansiriyah University.
Munther N. Al-Tikriti and Rokaia Sh. Al-Joubori (2007). Multi-Population Genetic Algorithms for Tuning Fuzzy Logic Controller. i-manager’s Journal on Software Engineering, 2(2), 56-63. https://doi.org/10.26634/jse.2.2.593

Abstract

Reducing the time required to reach acceptable solutions was the main goal behind the parallel implementation of genetic algorithms (GAs). Starting from this point, this paper introduces a parallel implementation of multi-population genetic algorithms to tune the fuzzy membership functions of a fuzzy logic controller (FLC), with the goal of improving its performance. The genetically tuned controller is implemented for both linear and nonlinear control systems.
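
A minimal sketch of the idea, under our own assumptions: each chromosome encodes the peaks of the controller's triangular membership functions, several sub-populations evolve independently, and the best individuals migrate between them periodically. The fitness function below is a stand-in for the FLC performance index used in the paper.

# Multi-population GA tuning membership-function peaks (illustrative sketch).
import random

def fitness(chromosome):
    # Placeholder performance index: reward peaks near an assumed optimum.
    optimum = [0.2, 0.5, 0.8]
    return -sum((g - t) ** 2 for g, t in zip(chromosome, optimum))

def evolve(pop):
    pop.sort(key=fitness, reverse=True)
    parents = pop[: len(pop) // 2]
    children = [[g + random.gauss(0, 0.05) for g in random.choice(parents)]
                for _ in range(len(pop) - len(parents))]
    return parents + children

populations = [[[random.random() for _ in range(3)] for _ in range(20)]
               for _ in range(4)]                   # four sub-populations
for gen in range(200):
    populations = [evolve(p) for p in populations]
    if gen % 25 == 0:                               # migration: pass on the best
        best = [max(p, key=fitness) for p in populations]
        for i, p in enumerate(populations):
            p[-1] = best[(i + 1) % len(populations)]
print(max((max(p, key=fitness) for p in populations), key=fitness))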

Research Paper

Extension of SSL/TLS for Quantum Cryptography

Sufyan T. Faraj Al-Janabi*
* IEEE Member, Associate Professor, College of Computers, University of Anbar, Iraq.
Sufyan T. Faraj (2007). Extension of SSL/TLS for Quantum Cryptography. i-manager’s Journal on Software Engineering, 2(2), 64-76. https://doi.org/10.26634/jse.2.2.669

Abstract

After a good period of experimentation with quantum cryptography (QC) in labs, and to a somewhat lesser extent with experience in deploying stand-alone point-to-point commercial QC products, it is definitely prudent now to explore the great advantages of integrating QC with the already-existing Internet security infrastructure. SSL/TLS is the protocol used for the vast majority of secure transactions over the Internet. However, this protocol needs to be extended in order to create a promising platform for the integration of QC into the Internet infrastructure. This paper presents a novel extension of SSL/TLS that significantly facilitates such integration. This extended version of SSL/TLS is called QSSL (Quantum SSL). During the development of QSSL, the focus has been on creating a simple, efficient, general, and flexible architecture that enables the deployment of practical quantum-cryptography-based security applications. Indeed, QSSL efficiently supports unconditionally secure encryption (one-time pad) and/or unconditionally secure authentication (based on universal hashing). A simplified version of QSSL based on the BB84 (Bennett-Brassard 84) quantum key distribution (QKD) protocol has been implemented and experimentally tested. This has enabled us to experimentally assess our protocol design based on software simulation of the quantum channel events used for QKD.
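
The two unconditionally secure primitives named here are easy to state concretely. The sketch below shows one-time-pad encryption with key material assumed to come from BB84 key distillation, and a simple polynomial universal hash as an authentication tag; neither reflects QSSL's actual wire format or key management.

# One-time pad and a polynomial universal hash (illustrative primitives).
import os

def one_time_pad(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data)       # OTP consumes key as fast as data
    return bytes(d ^ k for d, k in zip(data, key))

def universal_hash_tag(msg: bytes, a: int, b: int, p: int = 2**61 - 1) -> int:
    """Evaluate the message polynomial at the secret point a, then shift by b."""
    acc = 0
    for byte in msg:
        acc = (acc * a + byte) % p
    return (acc + b) % p

qkd_key = os.urandom(64)               # stand-in for BB84-distilled key bits
ct = one_time_pad(b"handshake finished", qkd_key)
assert one_time_pad(ct, qkd_key) == b"handshake finished"
print(universal_hash_tag(ct, a=1234567, b=7654321))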

Research Paper

Quality and Defect Management of Software in Real Time Software Systems – A Case Study

K. Vijayalakshmi*, Ramaraj N**
* Senior Lecturer, Department of Computer Science and Engineering, Dr. Mahalingam College of Engineering and Technology, Pollachi.
** Principal, G.K.M College of Engineering, Chennai.
K. Vijayalakshmi and Ramaraj N (2007). Quality and Defect Management of Software in Real Time Software Systems – A Case Study. i-manager’s Journal on Software Engineering, 2(2), 77-88. https://doi.org/10.26634/jse.2.2.671

Abstract

Nowadays, more automation systems are coming with advanced technologies for various applications. An automation system consists of software and hardware components that work together to achieve good-quality products and processes. In automation systems, software and hardware quality has a continuously increasing impact on system reliability. Reliability is defined as “the ability of a system or component to perform its required functions under stated conditions for a specified period of time”. If the software is inaccurate and less reliable, system performance will be poor and greatly affected. If hardware reliability is low, the system will fail very often, leading to increased maintenance costs. So the reliability estimation and improvement of software and hardware is an important task in improving the quality of automation systems.

This paper's aim, however, is to identify and analyze the most important failure mode, which is crucial in a software-based automation system. The critical mode is prioritized using Failure Mode and Effect Analysis (FMEA) according to its Risk Priority Number (RPN), and then the reliability of that mode is estimated. The reliability estimation also helps to assess the performance of software-based systems. A new direction in defect management is proposed to improve the reliability of real-time systems.
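
The FMEA prioritization referred to here follows the standard computation RPN = severity x occurrence x detection, each rated on a 1-10 scale, with failure modes addressed in order of descending RPN. A minimal sketch with invented failure modes and ratings:

# Standard FMEA ranking by Risk Priority Number (example data is invented).
modes = [
    # (failure mode,             severity, occurrence, detection)
    ("sensor value stuck",       8,        4,          6),
    ("watchdog not triggered",   9,        2,          7),
    ("log buffer overflow",      3,        6,          2),
]

ranked = sorted(((s * o * d, name) for name, s, o, d in modes), reverse=True)
for rpn, name in ranked:
    print(f"RPN {rpn:4d}  {name}")
# Highest RPN ('sensor value stuck': 8*4*6 = 192) is addressed first.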

Research Paper

LDMA: Load Balancing Using Decision Making Decentralized Mobile Agents

Aramudhan M*
* Assistant Professor, Department of Information Technology, Velammal Engineering College, Chennai.
Aramudhan M (2007). LDMA: Load Balancing Using Decision Making Decentralized Mobile Agents. i-manager’s Journal on Software Engineering, 2(2), 89-95. https://doi.org/10.26634/jse.2.2.676

Abstract

This paper introduces a new load balancing algorithm called LDMA (Load balancing using decision making Decentralized Mobile Agents), which distributes load among clustered web servers organized in a mesh topology by a communications network, and compares its performance with another load balancing algorithm, MALD (Mobile Agent based LoaD balancing). An architecture is developed for the same, and all necessary attributes, such as load deviation, system throughput, and average response time incurred as a result of the work, are dealt with. In past work, centralized decision-making algorithms were used for dispatching requests among the web servers in a distributed client/server environment. In the proposed approach, a decentralized decision-making approach is used for distributing requests among the web servers. A simulator is developed in C++ and the performance is evaluated. The analysis shows that LDMA performs better than the MALD load balancing algorithm.
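
The decentralized flavor of the decision making can be sketched as follows: each server's agent decides locally, comparing only its own load with that of its direct mesh neighbors, rather than consulting a central dispatcher. The topology, load figures, and cost model below are invented for illustration and do not reproduce the LDMA simulator.

# Local, decentralized dispatch in a server mesh (illustrative sketch).
mesh = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
load = {"A": 12, "B": 3, "C": 9, "D": 5}

def dispatch(entry_node, request_cost=1):
    # Decision uses only locally visible state: this node and its neighbors.
    candidates = [entry_node] + mesh[entry_node]
    chosen = min(candidates, key=lambda n: load[n])
    load[chosen] += request_cost
    return chosen

print(dispatch("A"))   # 'B': load 3 beats A (12) and C (9)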

Research Paper

EQoS: Energy Efficient QoS Protocol for Wireless Sensor Networks

S. Anandamurugan*, Venkatesh C**
* Lecturer, Kongu Engineering College, Perundurai, Erode, India.
** Professor, Kongu Engineering College, Perundurai, Erode, India.
Anandamurugan S and Venkatesh C (2007). EQoS: Energy Efficient QoS Protocol for Wireless Sensor Networks. i-manager’s Journal on Software Engineering, 2(2), 96-101. https://doi.org/10.26634/jse.2.2.681

Abstract

We address the problem of maintaining the quality of service above a desired level throughout a wireless sensor network while conserving energy at the same time. We propose a two-phase protocol, EQoS, for this purpose. The first phase of the protocol creates a virtual communication backbone and enables a subset of nodes, termed leaves, to turn off their radios and learn their rough coordinates. Since energy efficiency should be the goal of such a constrained network in every aspect besides communication, in the second phase of the protocol we introduce a distributed algorithm, based on rough coordinates and local neighborhood information, to turn redundant sensor hardware off. EQoS, which is applicable to both homogeneous and heterogeneous networks, has shown significant improvements during simulation experiments and proved to be effective for wireless sensor networks. The main aim of this paper is to conserve energy, i.e., battery power, during sensing, transmission, and reception of information by means of a distributed algorithm.
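
One plausible reading of the second-phase rule, under our own assumptions, is a local redundancy check: a node turns its sensing hardware off when enough awake neighbors lie within its sensing radius, judged only from rough coordinates. The radius and threshold below are invented.

# Local redundancy check from rough coordinates (illustrative sketch).
import math

def redundant(node, awake_neighbors, sense_radius=10.0, k=3):
    """node, awake_neighbors: (x, y) rough coordinates."""
    close = sum(1 for n in awake_neighbors
                if math.dist(node, n) <= sense_radius)
    return close >= k      # area covered well enough -> safe to sleep

print(redundant((0, 0), [(2, 1), (4, -3), (5, 5), (30, 30)]))  # True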