Design and Evaluation of Parallel Processing Techniques for 3D Liver Segmentation and Volume Rendering
Ensuring Software Quality in Engineering Environments
New 3D Face Matching Technique for an Automatic 3D Model Based Face Recognition System
Algorithmic Cost Modeling: Statistical Software Engineering Approach
Prevention of DDoS and SQL Injection Attack By Prepared Statement and IP Blocking
In recent years, the shift from multi-core to many-core processors has become highly significant in High Performance Computing (HPC). Increasing parallelism rather than increasing clock rate has become the primary engine of processor performance growth, and this trend is likely to continue. In particular, today's Graphics Processing Units (GPUs) have become important for HPC. General-Purpose GPU (GPGPU) computing has allowed GPUs to emerge as successful co-processors that can improve the performance of many non-graphical applications whose high degree of parallelism makes them suitable for HPC workloads. CUDA and OpenCL offer two different interfaces for programming GPUs: OpenCL is an open standard that can be used to program CPUs, GPUs, and other devices from different vendors, while CUDA is specific to NVIDIA GPUs. In this research paper, the authors explore the contrast between CUDA and OpenCL, which helps HPC programmers become familiar with GPGPU programming.
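As a minimal illustration of the portability contrast discussed above (this is a sketch, not code from the paper), the following Python fragment uses the third-party pyopencl package to add two vectors on whatever OpenCL device is available; an equivalent CUDA kernel would run only on NVIDIA GPUs.

    # Minimal GPGPU sketch using OpenCL through the pyopencl package (assumed installed).
    # The same element-wise addition written in CUDA would target only NVIDIA GPUs,
    # while this kernel can run on any OpenCL device (CPU, GPU, accelerator).
    import numpy as np
    import pyopencl as cl

    a = np.random.rand(1_000_000).astype(np.float32)
    b = np.random.rand(1_000_000).astype(np.float32)

    ctx = cl.create_some_context()          # pick any available OpenCL device
    queue = cl.CommandQueue(ctx)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    program = cl.Program(ctx, """
    __kernel void vadd(__global const float *a,
                       __global const float *b,
                       __global float *out) {
        int gid = get_global_id(0);        /* one work-item per element */
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    program.vadd(queue, a.shape, None, a_buf, b_buf, out_buf)

    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)    # copy result back to the host
    assert np.allclose(out, a + b)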
With the increased use of the Internet, cyber threats have grown exponentially. To protect systems from such threats, an anomaly detection system is needed that inspects all network activity and identifies any suspicious pattern that may indicate a breach of security resulting in damage to computing resources. In this paper, the authors introduce an anomaly detection system based on a multilayer perceptron, a model of Artificial Neural Network (ANN). In this system, the multilayer perceptron is trained with the backpropagation learning algorithm. For training and testing, the NSL-KDD dataset is used. The trained multilayer perceptron model is then applied to real-time anomaly detection on traffic captured with tcpdump (a packet-sniffing tool in Linux). The system achieves a very low false-positive rate.
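A minimal sketch of such a detector, in the spirit of the paper, can be built with scikit-learn's MLPClassifier (which trains with backpropagation). The file names, column layout and preprocessing below are illustrative assumptions, not the authors' exact pipeline.

    # MLP-based anomaly detection sketch on pre-processed NSL-KDD data.
    # Assumed: CSV splits with numeric features and a binary "label" column
    # (0 = normal, 1 = anomaly). These names are hypothetical.
    import pandas as pd
    from sklearn.neural_network import MLPClassifier
    from sklearn.preprocessing import StandardScaler
    from sklearn.metrics import confusion_matrix

    train = pd.read_csv("nslkdd_train.csv")
    test = pd.read_csv("nslkdd_test.csv")

    X_train, y_train = train.drop(columns=["label"]), train["label"]
    X_test, y_test = test.drop(columns=["label"]), test["label"]

    scaler = StandardScaler().fit(X_train)
    X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

    # One hidden layer; weights learned via backpropagation.
    mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=42)
    mlp.fit(X_train, y_train)

    tn, fp, fn, tp = confusion_matrix(y_test, mlp.predict(X_test)).ravel()
    print("false-positive rate:", fp / (fp + tn))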
Software engineering continually gains new methodologies, technologies, applications and processes. Machine learning concerns computer programs that learn from experience to improve their performance at some task. The overlap between machine learning and software engineering has led to machine learning applications that address various problems in software engineering. Faults in software systems are major problems that need to be resolved. Fault prediction in software is significant because it can help direct test effort, reduce cost, and increase software quality and reliability. In this study, the authors analyze various fault prediction techniques and propose a new model named DMM (Decision Making Model), based on decision logic, to develop the prediction hypothesis by introducing a new algorithm called GGA (Genetic Gain Algorithm) for fault prediction.
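To make the genetic-algorithm idea concrete, the sketch below evolves a subset of software metrics for a fault-prediction classifier. It is a generic genetic algorithm for feature selection, not the authors' GGA or DMM, whose details are defined in the paper.

    # Generic genetic-algorithm sketch for choosing software metrics fed to a
    # fault-prediction classifier (decision tree used here purely for illustration).
    import random
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    rng = random.Random(0)

    def fitness(mask, X, y):
        # Fitness = cross-validated accuracy on the selected metric columns.
        if not any(mask):
            return 0.0
        cols = [i for i, keep in enumerate(mask) if keep]
        clf = DecisionTreeClassifier(random_state=0)
        return cross_val_score(clf, X[:, cols], y, cv=3).mean()

    def evolve(X, y, pop_size=20, generations=15):
        n = X.shape[1]
        population = [[rng.random() < 0.5 for _ in range(n)] for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(population, key=lambda m: fitness(m, X, y), reverse=True)
            parents = scored[: pop_size // 2]            # selection of the fittest half
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, n)                # one-point crossover
                child = a[:cut] + b[cut:]
                if rng.random() < 0.1:                   # occasional mutation
                    i = rng.randrange(n)
                    child[i] = not child[i]
                children.append(child)
            population = parents + children
        return max(population, key=lambda m: fitness(m, X, y))

    # Example with synthetic metric data (rows = modules, columns = metrics).
    X = np.random.rand(200, 10)
    y = (X[:, 0] + X[:, 3] > 1.0).astype(int)            # synthetic "faulty" label
    best_mask = evolve(X, y)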
Software engineering applies engineering principles to the design, development and maintenance of software systems. These systems undergo a complete life cycle of initiation, development, implementation, maintenance and retirement before being replaced by a better software system. This paper primarily deals with the vital role of software process management: it identifies and examines the development models, also known as software development life cycles, namely Waterfall, Iterative, V-Shaped and Spiral, and then conceptualizes a better software process model. These software development life cycles provide a conceptual way of managing the development of software systems. The new, more efficient software process model presented in the paper uses the concept of reusability of software components, and its effect on project scheduling, staffing and project size is shown using the well-known COCOMO model. The paper also provides a comparative analysis of all the models together with the proposed model, showing how the drawbacks of the existing models are addressed by the proposed one.
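For readers unfamiliar with COCOMO, the sketch below applies the commonly quoted basic COCOMO constants to estimate effort, schedule and staffing. The "reuse" adjustment (simply shrinking the effective KLOC) is an illustrative assumption, not the paper's exact treatment of reusable components.

    # Basic COCOMO sketch: effort = a * KLOC^b (person-months),
    # schedule = c * effort^d (months), staff = effort / schedule.
    COCOMO = {
        "organic":       (2.4, 1.05, 2.5, 0.38),
        "semi-detached": (3.0, 1.12, 2.5, 0.35),
        "embedded":      (3.6, 1.20, 2.5, 0.32),
    }

    def cocomo(kloc, mode="organic", reuse_fraction=0.0):
        a, b, c, d = COCOMO[mode]
        effective_kloc = kloc * (1.0 - reuse_fraction)   # reused code is not re-developed (assumption)
        effort = a * effective_kloc ** b                  # person-months
        time = c * effort ** d                            # development months
        staff = effort / time                             # average head-count
        return effort, time, staff

    # Example: a 50 KLOC organic-mode project with and without 30% reusable components.
    for reuse in (0.0, 0.3):
        e, t, s = cocomo(50, "organic", reuse)
        print(f"reuse={reuse:.0%}: effort={e:.1f} PM, schedule={t:.1f} months, staff={s:.1f}")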
Software systems are becoming more and more important for organizations and individuals alike, and at the same time they are growing bigger and more complex. Software testing has its own resources, cost and ROI (Return on Investment). Web applications are becoming more and more complex, yet their testing is often badly performed or skipped by professionals. Web application testing may be even more difficult, due to the uniqueness of such applications. Test automation is one robust solution, which has been widely adopted all over the world. It reduces the effort, cost and time of software testing to a reasonable level. The main objective of this paper is to perform test automation for any kind of e-commerce application using the open-source Selenium testing tool. With this web testing tool, the authors use a hybrid test automation framework. The hybrid framework permits data-driven scripts to make use of the libraries and utilities that usually accompany keyword-driven testing. Using this framework, an individual software engineer can easily describe routine testing tasks and schedule them efficiently on a local machine or on cloud computers. Test cases are designed in Excel sheets with corresponding keywords that are normally written in simple English words, so test cases can be designed without knowledge of any programming language. The hybrid framework is able to execute tests, verify results, recover from expected errors and report results; it must also be simple to use and maintain.
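A minimal keyword-driven runner in this spirit can be sketched with the Selenium and openpyxl Python packages. The workbook name, column layout and keyword names below are hypothetical placeholders, not the framework described in the paper.

    # Keyword-driven Selenium sketch: each Excel row holds a keyword, a locator
    # and a value, which are mapped to browser actions.
    from openpyxl import load_workbook
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    def run_suite(workbook_path="testcases.xlsx"):
        driver = webdriver.Chrome()
        sheet = load_workbook(workbook_path).active
        # Assumed columns (row 1 is a header): keyword | element id | value
        for keyword, locator, value in sheet.iter_rows(min_row=2, values_only=True):
            if keyword == "open_url":
                driver.get(value)
            elif keyword == "type":
                driver.find_element(By.ID, locator).send_keys(value)
            elif keyword == "click":
                driver.find_element(By.ID, locator).click()
            elif keyword == "verify_title":
                assert value in driver.title, f"expected '{value}' in page title"
        driver.quit()

    if __name__ == "__main__":
        run_suite()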