Design and Evaluation of Parallel Processing Techniques for 3D Liver Segmentation and Volume Rendering
Ensuring Software Quality in Engineering Environments
New 3D Face Matching Technique for an Automatic 3D Model Based Face Recognition System
Algorithmic Cost Modeling: Statistical Software Engineering Approach
Prevention of DDoS and SQL Injection Attack By Prepared Statement and IP Blocking
A framework is a set of common, prefabricated software building blocks that programmers can use, extend, or customize for specific computing solutions. Frameworks are built from collections of objects, so both the design and the code of the framework can be reused. A framework serves several purposes: it makes it easier to work with complex technologies, and it ties together a number of discrete objects and components into something more useful. Many frameworks are available in the open-source community. In this paper we compare two such frameworks, ASP and TYPO3: one is a traditional framework and the other is component-based. The paper presents the findings of a small study of projects undertaken by students in the web publishing area, and concludes with a recommendation on which framework is preferable for developing web applications.
After a considerable period of experimentation with quantum cryptography (QC) in the laboratory, and to a lesser extent with deploying stand-alone point-to-point commercial QC products, it is now prudent to explore the advantages of integrating QC with the existing Internet security infrastructure. SSL/TLS is the protocol used for the vast majority of secure transactions over the Internet; however, it needs to be extended in order to create a suitable platform for integrating QC into the Internet infrastructure. This paper presents a novel extension of SSL/TLS, called QSSL (Quantum SSL), that significantly facilitates such integration. During the development of QSSL, the emphasis was on a simple, efficient, general, and flexible architecture that enables the deployment of practical quantum-cryptographic security applications. QSSL efficiently supports unconditionally secure encryption (one-time pad) and/or unconditionally secure authentication (based on universal hashing). A simplified version of QSSL based on the BB84 (Bennett-Brassard 84) quantum key distribution (QKD) protocol has been implemented and experimentally tested, which enabled us to assess the protocol design using a software simulation of the quantum-channel events used for QKD.
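The abstract does not give QSSL's implementation details; the sketch below only illustrates, under stated assumptions, the two primitives it mentions: a software-simulated BB84 sifting step that produces a shared raw key over an ideal (noise-free) channel, and one-time-pad encryption with that key. Function names and parameters are illustrative, not the QSSL API.

    import secrets

    def bb84_simulated_key(n_qubits: int) -> list[int]:
        """Software simulation of BB84 sifting: Alice picks random bits and bases,
        Bob picks random bases; bits measured in matching bases form the raw key."""
        alice_bits  = [secrets.randbelow(2) for _ in range(n_qubits)]
        alice_bases = [secrets.randbelow(2) for _ in range(n_qubits)]
        bob_bases   = [secrets.randbelow(2) for _ in range(n_qubits)]
        # Assumed ideal channel: Bob recovers Alice's bit whenever the bases match.
        return [b for b, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]

    def one_time_pad(data: bytes, key_bits: list[int]) -> bytes:
        """Unconditionally secure encryption: XOR each data bit with one key bit."""
        if len(key_bits) < 8 * len(data):
            raise ValueError("one-time pad needs at least as many key bits as data bits")
        out = bytearray()
        for i, byte in enumerate(data):
            k = 0
            for j in range(8):
                k = (k << 1) | key_bits[8 * i + j]
            out.append(byte ^ k)
        return bytes(out)

    # Example: sift a key, then encrypt and decrypt (XOR is its own inverse).
    key = bb84_simulated_key(4000)
    cipher = one_time_pad(b"hello", key)
    assert one_time_pad(cipher, key) == b"hello"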
Nowadays, more automation systems are being built with advanced technologies for various applications. An automation system consists of software and hardware components that together deliver high-quality products and processes, and the quality of both has a continuously increasing impact on system reliability. Reliability is defined as "the ability of a system or component to perform its required functions under stated conditions for a specified period of time". If the software is inaccurate or unreliable, system performance will be poor and strongly affected; if the hardware is unreliable, the system will fail frequently and maintenance costs will rise. Estimating and improving the reliability of software and hardware is therefore an important task in improving the quality of automation systems.
This paper aims to identify and analyze the failure mode that is most critical in a software-based automation system. The critical mode is prioritized using Failure Mode and Effect Analysis (FMEA) according to its Risk Priority Number (RPN), and the reliability of that mode is then estimated. The reliability estimate also helps to characterize the performance of the software-based system. Finally, a new direction in defect management is proposed to improve the reliability of real-time systems.
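The FMEA prioritization itself is easy to reproduce: for each failure mode, the Risk Priority Number is the product of its severity, occurrence, and detection ratings, and modes are ranked by RPN. A minimal sketch follows; the failure modes and rating values are invented for illustration, not taken from the paper.

    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        name: str
        severity: int    # 1 (negligible) .. 10 (catastrophic)
        occurrence: int  # 1 (rare)       .. 10 (frequent)
        detection: int   # 1 (certain to detect) .. 10 (undetectable)

        @property
        def rpn(self) -> int:
            # Risk Priority Number = Severity x Occurrence x Detection
            return self.severity * self.occurrence * self.detection

    # Hypothetical failure modes for a software-based automation system.
    modes = [
        FailureMode("sensor driver timeout", severity=7, occurrence=5, detection=4),
        FailureMode("PLC logic overflow",    severity=9, occurrence=2, detection=6),
        FailureMode("HMI display freeze",    severity=4, occurrence=6, detection=2),
    ]

    # Prioritize: the mode with the highest RPN is the most critical one.
    for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
        print(f"{m.name}: RPN = {m.rpn}")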
This paper introduces a new load balancing algorithm, called LDMA (Load balancing using Decision-making decentralized Mobile Agents), which distributes load among clustered web servers organized in a mesh topology over a communication network, and compares its performance with another load balancing algorithm, MALD (Mobile Agent based LoaD balancing). An architecture is developed for this purpose, and all relevant metrics such as load deviation, system throughput, and average response time are examined. Earlier work used centralized decision-making algorithms to dispatch requests among web servers in a distributed client/server environment; the proposed approach uses decentralized decision making to distribute requests among the web servers. A simulator was developed in C++ and used to evaluate performance. The analysis shows that LDMA outperforms the MALD load balancing algorithm.
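LDMA's exact decision rule is not spelled out in the abstract; the sketch below only illustrates the general idea of decentralized dispatching, in which each node compares its own load with that of its mesh neighbours and keeps or forwards an incoming request accordingly, with no central dispatcher. The class names, mesh, and migration-cost threshold are assumptions.

    import random

    class ServerNode:
        """A web-server node in a mesh; each node decides locally where a request runs."""
        def __init__(self, name: str):
            self.name = name
            self.load = 0            # requests currently being served
            self.neighbors: list["ServerNode"] = []

        def dispatch(self, migration_cost: int = 1) -> "ServerNode":
            """Decentralized decision: keep the request unless a neighbour is
            lighter by more than the (assumed) migration cost."""
            best = min(self.neighbors, key=lambda n: n.load, default=self)
            target = best if best.load + migration_cost < self.load else self
            target.load += 1
            return target

    # Build a small 2x2 mesh and send requests to random entry nodes.
    nodes = [ServerNode(f"s{i}") for i in range(4)]
    mesh = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2]}
    for i, nbrs in mesh.items():
        nodes[i].neighbors = [nodes[j] for j in nbrs]

    for _ in range(100):
        random.choice(nodes).dispatch()
    print({n.name: n.load for n in nodes})   # loads end up roughly balanced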
We consider the problem of maintaining quality of service above a desired level in a wireless sensor network while conserving energy at the same time. We propose a two-phase protocol, EQoS, for this purpose. The first phase of the protocol creates a virtual communication backbone and enables a subset of nodes, termed leaves, to turn off their radios and learn their rough coordinates. Since energy efficiency should be the goal of such a constrained network in every aspect besides communication, in the second phase we introduce a distributed algorithm, based on rough coordinates and local neighborhood information, that turns off redundant sensor hardware. EQoS is applicable to both homogeneous and heterogeneous networks, showed significant improvements in simulation experiments, and proved effective for wireless sensor networks. The main aim of this paper is to conserve energy, i.e. battery power, while sensing, transmitting, and receiving information, by means of this distributed algorithm.
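The paper's second-phase algorithm is only outlined in the abstract; as a rough illustration of the underlying idea, the sketch below marks a sensor as redundant (and hence able to power down) when points inside its sensing disc are also covered by its active one-hop neighbours, using rough coordinates. The coverage test, sampling density, and sensing range are assumptions.

    import math

    SENSING_RANGE = 10.0  # assumed common sensing radius

    def covers(p, q, r=SENSING_RANGE):
        return math.dist(p, q) <= r

    def is_redundant(node_pos, neighbor_positions, samples=20):
        """Approximate check: sample points inside the node's sensing disc and
        verify that every sample is also covered by some active neighbour."""
        for i in range(samples):
            angle = 2 * math.pi * i / samples
            for frac in (0.5, 1.0):
                sample = (node_pos[0] + frac * SENSING_RANGE * math.cos(angle),
                          node_pos[1] + frac * SENSING_RANGE * math.sin(angle))
                if not any(covers(sample, q) for q in neighbor_positions):
                    return False
        return True

    # A node at the origin with three nearby active neighbours (rough coordinates).
    print(is_redundant((0, 0), [(4, 0), (-3, 3), (0, -4)]))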
A flow-based intrusion detection system is one kind of network security system. As the Internet becomes the platform for daily activities, the threat of network attacks becomes more serious. A firewall alone cannot protect a system from being attacked through normal service channels. Furthermore, most current intrusion detection systems focus on the border of the organization's network; if an attack comes from inside, this setup protects neither the hosts in the local network nor the network itself. Other mechanisms are therefore needed to protect critical systems as well as the network. We propose an inexpensive and easy-to-implement way to perform anomaly-based intrusion detection using the NetFlow data exported from routers or other network probes. Our system can detect several types of network attack from inside or outside and perform countermeasures accordingly. In addition, guidelines for properly configuring and setting up network devices to minimize the possibility of attacks originating from inside are also proposed.
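The abstract does not state the detection rules; the sketch below shows the general shape of flow-based anomaly detection, flagging sources whose flow behaviour (here, the number of distinct destination ports contacted, a common port-scan signal) exceeds a threshold. The record layout, field names, and threshold are assumptions rather than the paper's configuration.

    from collections import defaultdict

    # A NetFlow-like record reduced to the fields used here (assumed layout):
    # (src_ip, dst_ip, dst_port, packets)
    flows = [
        ("10.0.0.5", "10.0.0.9", p, 1) for p in range(20, 90)   # port-scan-like source
    ] + [
        ("10.0.0.7", "10.0.0.9", 80, 120),                      # ordinary web traffic
    ]

    PORT_SCAN_THRESHOLD = 50   # assumed: distinct destination ports per source

    def detect_scanners(flow_records, threshold=PORT_SCAN_THRESHOLD):
        """Flag sources that touch an unusually large number of distinct ports."""
        ports_by_src = defaultdict(set)
        for src, _dst, dport, _pkts in flow_records:
            ports_by_src[src].add(dport)
        return [src for src, ports in ports_by_src.items() if len(ports) >= threshold]

    print(detect_scanners(flows))   # -> ['10.0.0.5']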
Recently, strategies involving learning a supervised model have been emerging in the field of multimedia content-based retrieval. When there are clearly identified categories as well as large, domain-representative training data, learning can be effectively employed to construct a model of the domain.
In this paper, a largely domain-independent approach is presented in which local features characterize multimedia data using Artificial Neural Networks (ANN) and Support Vector Machines (SVM). Classification in content-based retrieval requires a non-linear mapping of the feature space, which ANN and SVM can normally provide. However, they inherently lack the capability to deal with meaningful feature evaluation and large-dimensional feature spaces, in the sense that they become inaccurate and slow. We overcome these drawbacks by employing meaningful feature selection based on a discrete Bayesian error measure. Experiments on a database of real video sequences show that both the speed and the accuracy of SVM can be improved substantially using this technique, while the execution time of ANN can be substantially reduced. The comparison also shows that the improved SVM is a better choice than ANN. Finally, it is shown that generalization in learning is not affected by reducing the dimension of the feature space with the proposed method.
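As a rough illustration of the idea (not the paper's exact procedure), the sketch below estimates a per-feature Bayes error by discretizing each feature into bins and summing the minority-class mass per bin, keeps the lowest-error features, and trains an SVM only on those. The bin count, synthetic data, and use of scikit-learn are assumptions for the example.

    import numpy as np
    from sklearn.svm import SVC

    def bayes_error_estimate(feature, labels, n_bins=10):
        """Discretize one feature and estimate its Bayes error: within each bin the
        best classifier still errs on the minority class, so the estimated error is
        the total minority-class mass across bins."""
        bins = np.linspace(feature.min(), feature.max(), n_bins + 1)
        idx = np.clip(np.digitize(feature, bins) - 1, 0, n_bins - 1)
        error = 0.0
        for b in range(n_bins):
            in_bin = labels[idx == b]
            if in_bin.size:
                counts = np.bincount(in_bin)
                error += (in_bin.size - counts.max()) / labels.size
        return error

    def select_features(X, y, keep=10):
        errors = [bayes_error_estimate(X[:, j], y) for j in range(X.shape[1])]
        return np.argsort(errors)[:keep]          # lowest estimated error first

    # Synthetic data: only the first 5 of 50 dimensions carry class information.
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=500)
    X = rng.normal(size=(500, 50))
    X[:, :5] += y[:, None] * 2.0

    kept = select_features(X, y, keep=5)
    clf = SVC(kernel="rbf").fit(X[:, kept], y)    # SVM trained on the reduced space
    print(sorted(kept.tolist()), clf.score(X[:, kept], y))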
In 2000, Sun proposed an efficient remote login authentication scheme based on a one-way hash function. In 2002, Chien et al. pointed out a deficiency of Sun's scheme, namely that it realized only unilateral authentication, and put forward an efficient and practical remote mutual authentication scheme. Recently, however, Hsu showed that this scheme is not secure enough, since it is vulnerable to the parallel session attack. In 2005, Liu et al. proposed an enhancement to Chien et al.'s scheme, but their scheme still has several security flaws. This paper points out the weaknesses of Liu et al.'s scheme and presents an enhancement that overcomes them.
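The abstract does not reproduce the protocols of Sun, Chien et al., or Liu et al.; the sketch below only illustrates the general pattern of the family they belong to: hash-based remote mutual authentication with fresh nonces, where each side proves knowledge of a shared long-term secret. It is a generic illustration, not the proposed enhancement, and HMAC-SHA256 stands in for the schemes' one-way hash.

    import hmac, hashlib, secrets

    def h(key: bytes, *parts: bytes) -> bytes:
        """Keyed one-way hash (HMAC-SHA256 as a stand-in for the schemes' h())."""
        return hmac.new(key, b"|".join(parts), hashlib.sha256).digest()

    # Long-term secret shared between the user's smart card and the server.
    shared_secret = secrets.token_bytes(32)
    user_id = b"alice"

    # --- User -> Server: identity plus a hash bound to a fresh nonce ---
    n_user = secrets.token_bytes(16)
    c1 = h(shared_secret, user_id, n_user)

    # --- Server verifies c1, then proves knowledge of the secret (mutuality) ---
    assert hmac.compare_digest(c1, h(shared_secret, user_id, n_user))
    n_server = secrets.token_bytes(16)
    c2 = h(shared_secret, n_user, n_server)

    # --- User verifies the server's response; both sides are now authenticated ---
    assert hmac.compare_digest(c2, h(shared_secret, n_user, n_server))
    print("mutual authentication completed")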
In the image processing literature, texture is usually defined in terms of the spatial interactions between pixel values. The aim of texture analysis is to capture the visual characteristics of texture in analytical form by mathematically modeling these spatial interactions. This allows an image to be segmented into its textural components, with each component classified according to how well it fits the mathematical model of a particular texture. This approach requires training data sets, whose number and type formalize the criteria by which the texture models are distinguished from each other, but not necessarily from textures not included in the training set. If a texture is to be recognized in a scene containing previously unseen textures, a new approach is required: the texture models need to capture more than just the characteristics required to distinguish one texture from other known textures; they need to capture all the unique characteristics of that texture. This paper describes a new method for image segmentation that generates binary random values in the image based on a neighborhood spanning tree. This method produces better results than conventional region-based segmentation methods for complex multi-resolution images.
Ontologies are widely used and play important roles in applications related to knowledge management, knowledge engineering, natural language processing, information retrieval, etc. Different semantic measures have been proposed in the literature to evaluate the strength of the semantic link between two concepts, or two groups of concepts, drawn either from two different ontologies (ontology alignment) or from the same ontology. This article presents an off-context study of different semantic measures based on an ontology restricted to subsumption links. We first present some common principles and then propose a comparative study based on a set of semantic and theoretical criteria.
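As a concrete example of the edge-based family of measures that such comparisons typically cover, the sketch below computes a Wu-Palmer-style similarity over a subsumption (is-a) hierarchy. The tiny taxonomy and concepts are invented for illustration and are not taken from the article.

    # Subsumption (is-a) links: child -> parent. A tiny illustrative taxonomy.
    parent = {
        "cat": "mammal", "dog": "mammal", "mammal": "animal",
        "sparrow": "bird", "bird": "animal", "animal": "entity",
    }

    def ancestors(concept):
        """Path from a concept up to the root, including the concept itself."""
        path = [concept]
        while path[-1] in parent:
            path.append(parent[path[-1]])
        return path

    def depth(concept):
        return len(ancestors(concept))

    def wu_palmer(c1, c2):
        """sim(c1, c2) = 2 * depth(lcs) / (depth(c1) + depth(c2)),
        where lcs is the least common subsumer in the is-a hierarchy."""
        anc1 = ancestors(c1)
        lcs = next(a for a in ancestors(c2) if a in anc1)
        return 2 * depth(lcs) / (depth(c1) + depth(c2))

    print(wu_palmer("cat", "dog"))      # close concepts -> high similarity (0.75)
    print(wu_palmer("cat", "sparrow"))  # farther apart  -> lower similarity (0.5)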
Testing across diverse software development paradigms is an ongoing problem in software engineering, and many techniques have been devised over the past decades to help software engineers create useful test suites. Here, the focus is on test case generation for object-oriented software using genetic programming. The automatic creation of test data is still an open problem in object-oriented software testing, and many new techniques are being researched. For object-oriented software, automatic test data generation alone is not sufficient: besides the input data used for testing, the generator must also produce the right sequences of method calls and the right artifacts to bring the object under test into the required state. Genetic algorithms have already been used successfully to tackle typical testing problems, but applying genetic programming to automatic test case generation is relatively new and promising. This paper shows how genetic algorithms combined with different types of software analysis can create new unit tests with a high degree of program coverage; together with static analysis, the genetic algorithm is able to generate tests for more real-world programs in a shorter amount of time. This approach is implemented in the presented design.
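The coverage-driven search can be illustrated on a toy scale: a genetic algorithm evolves test inputs (here plain integer arguments rather than full method-call sequences) whose fitness is a classic branch-distance measure for a hard-to-reach branch in the unit under test. The unit under test and all GA parameters below are invented for illustration and are not the paper's implementation.

    import random

    def unit_under_test(a: int, b: int) -> str:
        """Toy unit under test containing a hard-to-reach branch."""
        if a > 100 and b == a - 42:
            return "target branch"
        return "other branch"

    def branch_distance(a: int, b: int) -> float:
        """Search-based-testing fitness: how far (a, b) is from taking the target
        branch `a > 100 and b == a - 42`; 0 means the branch is covered."""
        d = 0.0
        if a <= 100:
            d += 101 - a            # distance to make `a > 100` true
        d += abs(b - (a - 42))      # distance to make `b == a - 42` true
        return d

    def evolve(pop_size=60, generations=300):
        pop = [(random.randint(-1000, 1000), random.randint(-1000, 1000))
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=lambda ind: branch_distance(*ind))      # lower is better
            if branch_distance(*pop[0]) == 0:
                break
            survivors = pop[: pop_size // 2]
            children = []
            while len(survivors) + len(children) < pop_size:
                pa, pb = random.sample(survivors, 2)
                child = (pa[0], pb[1])                            # crossover
                if random.random() < 0.5:                         # mutation
                    child = (child[0] + random.randint(-20, 20),
                             child[1] + random.randint(-20, 20))
                children.append(child)
            pop = survivors + children
        return min(pop, key=lambda ind: branch_distance(*ind))

    best = evolve()
    print(best, unit_under_test(*best), branch_distance(*best))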
This paper proposes a new fuzzy logic approach to intrusion analysis and detection in 802.11 wireless networks. The algorithm consists of the following steps. First, construct the network and generate many instances of daily traffic and intrusions; at the same time, capture different values of system and network parameters and associate with them a potential alarm-severity degree. Second, generate fuzzy rules from the numerical data using the Wang-Mendel method. Third, install the new rule base on each computer and start the system; it cyclically captures the parameters and computes the alarm severity, and if it detects an intrusion it sends a message to every network node. Fourth, if the system stops responding or errors occur, restart the learning mechanism by injecting numerical values and regenerating the fuzzy rules.
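The Wang-Mendel step (generating fuzzy rules from numerical data) can be sketched independently of the 802.11 setting: each sample votes for the fuzzy regions it fits best, forming one rule, and conflicting rules are resolved by keeping the one with the highest degree. The triangular partitions and the synthetic data below are assumptions standing in for the paper's traffic and intrusion parameters.

    import numpy as np

    def triangular_mfs(lo, hi, n=5):
        """n overlapping triangular membership functions spanning [lo, hi]."""
        centers = np.linspace(lo, hi, n)
        width = centers[1] - centers[0]
        def membership(x):
            return np.clip(1 - np.abs(x - centers) / width, 0, 1)  # one value per region
        return membership, centers

    def wang_mendel_rules(X, y, in_mf, out_mf):
        """Wang-Mendel rule generation: each sample produces one rule from its
        best-matching regions; conflicts are resolved by the highest rule degree."""
        rules = {}   # antecedent region tuple -> (consequent region, rule degree)
        for xi, yi in zip(X, y):
            in_memberships = [mf(v) for mf, v in zip(in_mf, xi)]
            antecedent = tuple(int(np.argmax(m)) for m in in_memberships)
            out_m = out_mf(yi)
            consequent = int(np.argmax(out_m))
            degree = float(np.prod([m.max() for m in in_memberships]) * out_m.max())
            if antecedent not in rules or degree > rules[antecedent][1]:
                rules[antecedent] = (consequent, degree)
        return rules

    # Example: learn rules for y = x1 + x2 from synthetic numerical data.
    rng = np.random.default_rng(1)
    X = rng.uniform(0, 1, size=(200, 2))
    y = X[:, 0] + X[:, 1]

    mf1, _ = triangular_mfs(0, 1)
    mf2, _ = triangular_mfs(0, 1)
    out_mf, out_centers = triangular_mfs(0, 2)
    rules = wang_mendel_rules(X, y, [mf1, mf2], out_mf)
    print(len(rules), "rules, e.g.", next(iter(rules.items())))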
Reducing the time required to reach acceptable solutions is the main motivation behind parallel implementations of genetic algorithms (GAs). Starting from this point, the present paper introduces a parallel implementation of multi-population genetic algorithms to tune the fuzzy membership functions of a fuzzy logic controller (FLC), with the goal of improving its performance. The genetically tuned controller is implemented for both linear and nonlinear control systems.
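The tuning idea can be sketched in miniature: a multi-population (island) GA searches over membership-function parameters of a tiny fuzzy controller, with fitness measured by tracking error on a simple first-order plant. Everything below (the 3-rule controller, the plant, population sizes, and running the islands sequentially rather than in parallel) is an illustrative assumption, not the paper's setup.

    import random

    def flc_output(error, w, c):
        """Tiny 3-rule fuzzy controller: triangular MFs (negative/zero/positive error)
        with half-width w, output singletons (-c, 0, +c), weighted-average defuzzification."""
        mu_neg = min(max(-error / w, 0.0), 1.0)
        mu_pos = min(max(error / w, 0.0), 1.0)
        mu_zero = min(max(1.0 - abs(error) / w, 0.0), 1.0)
        s = mu_neg + mu_zero + mu_pos
        return (mu_pos * c - mu_neg * c) / s if s else 0.0

    def fitness(params, setpoint=1.0, dt=0.05, steps=200):
        """Track the setpoint on a first-order plant x' = -x + u; lower error is fitter."""
        w, c = params
        x, total_error = 0.0, 0.0
        for _ in range(steps):
            u = flc_output(setpoint - x, w, c)
            x += dt * (-x + u)
            total_error += abs(setpoint - x)
        return -total_error

    def evolve_island(pop, generations=10):
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            parents = pop[: len(pop) // 2]
            children = [(max(0.01, random.choice(parents)[0] + random.gauss(0, 0.1)),
                         max(0.01, random.choice(parents)[1] + random.gauss(0, 0.2)))
                        for _ in range(len(pop) - len(parents))]
            pop = parents + children
        return pop

    # Two sub-populations with migration of the current best individual
    # (executed one after the other here; in the paper they would run in parallel).
    islands = [[(random.uniform(0.1, 2.0), random.uniform(0.1, 3.0)) for _ in range(20)]
               for _ in range(2)]
    for epoch in range(5):
        islands = [evolve_island(p) for p in islands]
        best = max((max(p, key=fitness) for p in islands), key=fitness)
        for p in islands:                       # migration: share the current best
            p[-1] = best

    print("tuned (MF width, output gain):", best, "fitness:", fitness(best))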