Design and Evaluation of Parallel Processing Techniques for 3D Liver Segmentation and Volume Rendering
Ensuring Software Quality in Engineering Environments
New 3D Face Matching Technique for an Automatic 3D Model Based Face Recognition System
Algorithmic Cost Modeling: Statistical Software Engineering Approach
Prevention of DDoS and SQL Injection Attack By Prepared Statement and IP Blocking
Timing analysis is an important part of the digital logic design process. As system complexity increases, the possibility of timing issues adversely affecting the system's functionality grows, and the designer therefore turns to computer-aided software to help resolve such issues. One major issue encountered in digital electronic systems is the static-1 hazard: the possibility of a zero (0) glitch when a steady logic-1 output is expected. This project entails the development of a Graphical User Interface (GUI) for constructing combinational logic circuits that allows static-1 hazards to be identified and eliminated; the tool is intended as a teaching aid. The user interface is a menu-driven program written in Matlab that lets the user easily identify and eliminate static-1 hazards from digital logic circuits, and the algorithm implementing this feature was adapted from the consensus theorem. The learning is confined to 3-input-variable logic circuit design. The tool allows users to design their own combinational circuits and then generate the corresponding truth table and Karnaugh map with its Sum-of-Products (SOP) expression. All existing static-1 hazards are illustrated on Karnaugh maps, and the elimination of each hazard by adding the consensus term to the SOP expression is demonstrated.
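The consensus-term step described above can also be sketched outside Matlab. The following Python sketch (an illustration only, not the paper's Matlab implementation) represents each product term as a mapping from variable name to polarity (True = uncomplemented) and augments an SOP expression with its consensus terms, which is exactly what removes a static-1 hazard:

```python
def consensus(t1, t2):
    """Return the consensus of two product terms, or None if undefined.

    A term is a dict {variable: polarity}; e.g. A·B' is {'A': True, 'B': False}.
    The consensus exists only when exactly one shared variable appears
    with opposite polarity in the two terms.
    """
    opposed = [v for v in t1 if v in t2 and t1[v] != t2[v]]
    if len(opposed) != 1:
        return None
    v = opposed[0]
    merged = {k: b for k, b in t1.items() if k != v}
    for k, b in t2.items():
        if k == v:
            continue
        if k in merged and merged[k] != b:
            return None  # remaining literals conflict: product is 0
        merged[k] = b
    return merged

def add_consensus_terms(sop):
    """Add every missing consensus term to a list of product terms."""
    terms = list(sop)
    added = []
    for i in range(len(terms)):
        for j in range(i + 1, len(terms)):
            c = consensus(terms[i], terms[j])
            if c is not None and c not in terms + added:
                added.append(c)
    return terms + added
```

For the classic example F = A·B' + B·C, the sketch produces the consensus term A·C; adding it to the SOP expression covers the transition of B and removes the 0-glitch.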
Inter-organizational learning and knowledge sharing management play an important role during requirements collection and implementation for any software system. In Global Software Development (GSD), their significance increases further, as stakeholders are spread across the globe. In GSD, where critical challenges such as language differences, geographical distance, cultural differences and time-zone differences exist, the need for inter-organizational learning and knowledge sharing grows. This study aims to propose an Inter-organizational Learning and Knowledge Sharing Management Model (ILKSM) in Global Software Development to assist vendors in learning and sharing knowledge for the successful implementation of software engineering. As preliminary results, we have identified 13 practices of ILKSM through a Systematic Literature Review (SLR). Among these practices, 'effective communication' is the critical factor with the highest frequency of occurrence, while the other critical practices, in order of frequency, are 'proper negotiations', 'frequent meetings improve awareness among distributed sites', 'clear organizational structure', 'experienced team members should be accessible' and 'modern tools and technologies should be implemented'.
The main goal of software engineering is to develop an economical, high-quality software product. The quality of a software product plays a critical role in decision making and impacts the success of a business. To assess the quality of a software product, qualitative attributes need to be identified and then correlated with quantitative measures; software metric values thus characterize the structural complexity of a software product. In this research paper, six metrics, namely Lines of Code (LOC), Cyclomatic Complexity (MVG), Lines of Comment (COM), Number of Methods (NOM), Coupling Between Objects (CBO) and Halstead Volume (HV), are applied to investigate two searching and five sorting programs written in Java. Three software estimation tools are applied to them to draw conclusions about their implementations with respect to the referenced metrics. Additionally, a Maintainability Index metric is derived from the base metrics to demonstrate the comparative maintainability of the source code. This comparative study presents an analysis of the results for the same programs. Correctness, fault-proneness, modularity and threats to validity are also discussed.
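The abstract does not state which Maintainability Index variant the tools compute. The widely cited three-metric Oman–Hagemeister formula, shown below as a hedged sketch (an assumption for illustration; the paper's tools may use a rescaled variant), derives the index from exactly the kind of base metrics listed above:

```python
import math

def maintainability_index(halstead_volume, cyclomatic_complexity, loc):
    """Classic three-metric Maintainability Index (Oman & Hagemeister).

    Assumed formula for illustration; some tools rescale it to 0-100
    or add a comment-percentage term.
    """
    return (171
            - 5.2 * math.log(halstead_volume)
            - 0.23 * cyclomatic_complexity
            - 16.2 * math.log(loc))
```

Higher values indicate more maintainable code: growing the Halstead Volume, cyclomatic complexity or lines of code all lower the index, which is what makes it usable for the comparative ranking of source programs described here.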
Internet usage increases every day, shrinking the world and making it a smaller place for its users. However, this growth in usage has created problems for users through the rise in cyber-crime. It has therefore induced a need for monitoring and analyzing user and system activities and has enforced the tracking and blocking of malware. This is where Intrusion Detection Systems (IDS) and Intrusion Prevention Systems (IPS) come into the picture. IDS and IPS have a substantial impact on society by reducing the number of cyber-crimes, and they provide a platform for basic security amenities for small and medium enterprises and emerging entrepreneurs. Intrusion prevention systems generally include methodologies such as analyzing signatures, statistical anomalies and fingerprints. These methods detect malware, and further actions are then taken to block it. IPS techniques differ in how they scan data streams to detect a threat or intrusion. The research community studies network security issues through data capture and data control, such as internet worms, spam control, and Denial of Service (DoS) attacks. This paper focuses on the detection and prevention of cyber attacks on websites.
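One common attack on websites, SQL injection, is conventionally prevented with prepared (parameterized) statements, as the related title above suggests. The following Python/sqlite3 sketch is purely illustrative (the table, users and credentials are made up); it shows how binding user input as parameters keeps it from being interpreted as SQL:

```python
import sqlite3

# Toy database for the demonstration (hypothetical schema and data).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, pw TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def login(conn, name, pw):
    # Prepared statement: the '?' placeholders bind user input as data,
    # so it is never interpolated into the SQL text.
    cur = conn.execute(
        "SELECT 1 FROM users WHERE name = ? AND pw = ?", (name, pw)
    )
    return cur.fetchone() is not None
```

A classic injection payload such as `' OR '1'='1` is compared as a literal string and the login fails, whereas string concatenation of the same input into the query text would have bypassed the check.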
In the current electronic era, computer networks have evolved substantially because of the rapid development of electronic communication, the Internet of Things and Cyber-Physical Systems. In electronic communication technologies, large amounts of data are exchanged. As a result, these technologies are prone to electronic attacks, malicious actions and many security threats, which can compromise the integrity and availability of information. To overcome these issues, an intrusion detection system is of significant importance in a computer network; it is used for the security and protection of various communication infrastructures. For evaluating the performance of intrusion detection systems, a suitable technique needs to be identified for the application-specific dataset. It is very important to study the features of the chosen dataset to increase accuracy and decrease the training time of the intrusion detection model. Many researchers use different feature-selection approaches, such as principal component analysis, hybrid techniques and chi-square methods, to decrease training time.
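As an illustration of one of the feature-selection approaches mentioned, the chi-square score of each (nonnegative) feature against the class labels can be computed in a few lines of NumPy. This is a generic sketch of the technique, not the pipeline of any specific study cited here:

```python
import numpy as np

def chi2_scores(X, y):
    """Chi-square statistic of each nonnegative feature vs. class label.

    X: (n_samples, n_features) nonnegative feature matrix.
    y: (n_samples,) class labels. Higher score = stronger dependence,
    so the top-scoring features are kept for training.
    """
    classes = np.unique(y)
    # Observed feature mass per class.
    observed = np.array([X[y == c].sum(axis=0) for c in classes])
    # Expected mass if a feature were independent of the class.
    class_prob = np.array([(y == c).mean() for c in classes])
    expected = np.outer(class_prob, X.sum(axis=0))
    return ((observed - expected) ** 2 / expected).sum(axis=0)
```

A feature concentrated in one class scores high, while a feature distributed identically across classes scores near zero, which is how an intrusion-detection pipeline would rank features before training.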
In this paper, an intelligent Network Intrusion Detection system is implemented using a Support Vector Machine classifier. The NSL-KDD dataset is used for training, with separate test data to evaluate the performance of the trained model. Different hyperparameters of the Support Vector Machine, viz. γ and C, are used to tune the model. The performance of this classifier on the dataset transformed by Principal Component Analysis, as well as on the non-transformed dataset, is studied and compared. The experimental results show that the Support Vector Machine trained on the dataset transformed with Principal Component Analysis exhibits 2% lower accuracy than the classifier trained on the non-transformed dataset. However, the classifier trained on the transformed dataset takes 15% less training time than the classifier trained on the non-transformed dataset. The lower accuracy on the transformed data can be interpreted from the explained variance of the top principal components, which do not clearly capture the linear separation between the two classes.
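The Principal Component Analysis step applied before training can be sketched with a plain SVD. The sketch below (NumPy, illustrative only; the abstract does not specify the paper's actual toolchain) returns both the projected data and the explained-variance ratios referred to at the end of the abstract:

```python
import numpy as np

def pca_transform(X, n_components):
    """Project X onto its top principal components via SVD.

    Returns the reduced data and the fraction of total variance
    explained by each kept component.
    """
    Xc = X - X.mean(axis=0)                      # center the features
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:n_components].T                 # reduced representation
    ratio = S[:n_components] ** 2 / (S ** 2).sum()
    return Z, ratio
```

A classifier (for instance an SVM tuned over γ and C) is then trained on Z instead of X; when the explained-variance ratios of the kept components are low, class-separating directions may be discarded, which is one way the reported 2% accuracy drop can arise alongside the 15% training-time saving.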