Design and Evaluation of Parallel Processing Techniques for 3D Liver Segmentation and Volume Rendering
Ensuring Software Quality in Engineering Environments
New 3D Face Matching Technique for an Automatic 3D Model Based Face Recognition System
Algorithmic Cost Modeling: Statistical Software Engineering Approach
Prevention of DDoS and SQL Injection Attacks by Prepared Statements and IP Blocking
Due to the growing need to transmit images over the Internet, research in the area of image compression has grown significantly. Image compression plays a vital role in digital image processing and is essential for efficient transmission and storage of images. Compression is needed because conventional sampling and quantization produce a large number of bits per image. In this paper, image compression has been implemented using the block truncation method and simulated in MATLAB. The peak signal-to-noise ratio (PSNR) and bitrate of the Block Truncation Coding (BTC) and Enhanced Block Truncation Coding (EBTC) techniques have been studied.
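For context, classic BTC encodes each image block with its mean, standard deviation, and a one-bit-per-pixel map, then reconstructs two moment-preserving gray levels. Below is a minimal Python/NumPy sketch of this standard scheme (not the paper's MATLAB code); the 4x4 block size and the PSNR helper are illustrative.

```python
import numpy as np

def btc_block(block):
    """Encode and decode one block, preserving its mean and variance."""
    m = block.size
    mean, std = block.mean(), block.std()
    bitmap = block >= mean                 # 1 bit per pixel
    q = int(bitmap.sum())                  # pixels at or above the mean
    if q == 0 or q == m:                   # flat block: keep the mean
        return np.full_like(block, mean)
    low = mean - std * np.sqrt(q / (m - q))    # low reconstruction level
    high = mean + std * np.sqrt((m - q) / q)   # high reconstruction level
    return np.where(bitmap, high, low)

def btc(img, n=4):
    """Apply BTC to every n-by-n block of a grayscale image."""
    img = img.astype(float)
    out = np.empty_like(img)
    for i in range(0, img.shape[0], n):
        for j in range(0, img.shape[1], n):
            out[i:i+n, j:j+n] = btc_block(img[i:i+n, j:j+n])
    return out

def psnr(orig, recon, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10*log10(peak^2 / MSE)."""
    mse = np.mean((orig.astype(float) - recon) ** 2)
    return 10 * np.log10(peak ** 2 / mse)
```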
Deep neural networks have produced impressive accuracy in various applications, though this success is often attributed to heavy network architectures. In this setting, a portable student network with significantly fewer parameters can achieve accuracy comparable to that of a teacher network. Beyond accuracy, however, the reliability and timeliness of the learned student network are also important for practical use. In this paper, we use a student-teacher network combination to predict the occurrence of COVID-19 in a patient with maximum efficiency. Three different architectures are compared to find which gives the highest accuracy, and a web application is developed as a frontend for displaying the test results. Different combinations of blocks are trained in parallel and finally concatenated to improve the efficiency of the model, which predicts the presence of COVID-19 among patients with an accuracy of about 90% at low cost.
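As a rough illustration of student-teacher training (commonly called knowledge distillation; the abstract does not give its exact loss), the sketch below, assuming PyTorch, distills a frozen teacher's temperature-softened outputs into a student. The temperature T and weight alpha are illustrative values, not taken from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of soft (teacher-matching) and hard (label) losses."""
    # Soft targets: match the teacher's temperature-smoothed distribution.
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                    F.softmax(teacher_logits / T, dim=1),
                    reduction="batchmean") * (T * T)
    # Hard targets: ordinary cross-entropy against the true labels.
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

def train_step(student, teacher, x, y, optimizer):
    """One optimisation step; the frozen teacher only supplies soft labels."""
    with torch.no_grad():
        teacher_logits = teacher(x)
    loss = distillation_loss(student(x), teacher_logits, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```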
COVID-19 (SARS-CoV-2) has already claimed more than 3.3 million lives worldwide, and many COVID-19 patients also suffer from Mucormycosis. Technology helps to improve accuracy and saves time and energy in making the precise decisions that save patients' lives. As new problems arise day by day, IoT, Cloud Computing, Big Data, Machine Learning, Artificial Intelligence, etc. help analyze them more quickly and support better design strategies. Sensor-based devices feed in the data, which is processed and analyzed to provide a solution. Machine learning algorithms make precise decisions with the help of an existing data repository of patients with similar symptoms, histories of allergies, surgeries, and treatment. The proposed model will help guide medical professionals in solving problems that arise in the treatment of COVID-19 and post-treatment diseases such as Mucormycosis (also known as Zygomycosis).
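To make the idea concrete, the sketch below shows the kind of symptom-history risk classifier the abstract describes, assuming scikit-learn; the feature columns and synthetic patient records are hypothetical stand-ins, not data from any real repository.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
# Hypothetical columns: blood sugar level, days on steroids,
# oxygen-support duration, prior-surgery flag (all scaled to [0, 1]).
X = rng.random((200, 4))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)   # stand-in risk label

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
new_patient = [[0.8, 0.6, 0.2, 1.0]]
print("predicted risk of Mucormycosis:", clf.predict_proba(new_patient)[0, 1])
```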
This paper compiles the aspects of deep learning necessary for future research. Deep learning has gained massive popularity in computing research and application development, and many cognitive problems involving unstructured data are solved using this technology. Deep learning and artificial intelligence are the underlying paradigms of popular applications such as Google Translator. Machine learning and deep learning are both subsets of artificial intelligence. Deep learning algorithms are built from multiple neural network layers that replicate the functioning of the human brain: the network learns iteratively from structured training data and uses the learned representation to make predictions on unstructured data. A deep network consists of three kinds of layers, namely the input layer, hidden layers, and the output layer. Neural networks are commonly used for image recognition. Big Data powered by deep learning can drive innovations beyond imagination in the future.
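The three layer types named above can be shown in a few lines; this minimal PyTorch sketch uses illustrative sizes (a 784-dimensional input such as a flattened 28x28 image, one 128-unit hidden layer, and a 10-class output).

```python
import torch
import torch.nn as nn

# input layer (784 features) -> hidden layer (128 units) -> output layer (10 classes)
model = nn.Sequential(
    nn.Linear(784, 128),   # input -> hidden
    nn.ReLU(),             # nonlinearity applied in the hidden layer
    nn.Linear(128, 10),    # hidden -> output
)

logits = model(torch.randn(1, 784))   # one dummy flattened image
print(logits.shape)                   # torch.Size([1, 10])
```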
New computer-assisted interactive learning methods and devices, such as intelligent tutoring systems, simulations, and games, have increased the possibility of collecting and analysing student data, discovering patterns and trends in that data, and developing and testing new hypotheses about how students learn. In this paper, data mining techniques such as the Naïve Bayes method, Random Forest, J48, Support Vector Machines, and the C4.5 classifier are discussed. Each algorithm has its own advantages and disadvantages: decision tree techniques do not perform well when the data has smooth class boundaries, while the Naïve Bayes classifier works with both continuous and discrete attributes and operates well on real-time problems. The objective of this review is to identify the appropriate technique for mining the databases of computer-assisted learning tools to predict the right career for students from their responses and interactions. The paper focuses on the feasibility of constructing a classification model for identifying student talents. Numerous attributes are tested, and several of them have been found influential for performance identification.
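As a concrete illustration of comparing two of the classifier families discussed (Naïve Bayes and a decision tree, the family that includes J48/C4.5), the sketch below uses scikit-learn, with synthetic data standing in for real student interaction logs.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for student interaction data: 8 numeric attributes,
# binary outcome (e.g. suited / not suited to a given career track).
X, y = make_classification(n_samples=500, n_features=8, random_state=0)

for clf in (GaussianNB(), DecisionTreeClassifier(max_depth=5, random_state=0)):
    scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validation
    print(f"{type(clf).__name__}: mean accuracy {scores.mean():.3f}")
```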