Blockchain Scalability Analysis and Improvement of Bitcoin Network through Enhanced Transaction Adjournment Techniques
Data Lake System for Essay-Based Questions: A Scenario for the Computer Science Curriculum
Creating Secure Passwords through Personalized User Inputs
Optimizing B-Cell Epitope Prediction: A Novel Approach using Support Vector Machine Enhanced with Genetic Algorithm
Gesture Language Translator using Morse Code
Efficient Agent-Based Priority Scheduling and Load Balancing Using Fuzzy Logic in Grid Computing
A Survey of Various Task Scheduling Algorithms in Cloud Computing
Integrated Atlas-Based Localisation Features in Lung Images
A Computational Intelligence Technique for Effective Medical Diagnosis Using Decision Tree Algorithm
A Viable Solution to Prevent SQL Injection Attack Using SQL Injection
As countries around the world face unprecedented challenges from the novel coronavirus (COVID-19) pandemic, the strain on governments has been extreme and has greatly affected people everywhere. In response, industry leaders have implemented pandemic and communicable-disease response plans, assessed or activated business continuity plans, and adopted widespread telework operations. The Nigerian government has likewise introduced numerous guidelines, including social distancing and the shutdown of restaurants, bars, movie theatres, gyms, and even worship centres. As a nation, Nigeria has been affected in many areas: the educational system has been disrupted, the activities of manufacturing industries halted, and technological progress stalled. This paper therefore examines the role of Information Technology (IT) in Nigeria's COVID-19 pandemic recovery strategy. It presents a thorough analysis of the pandemic and its economic impact, and discusses how government and the private sector can help restore the nation's economy by maximizing the use of IT in Nigeria. The spread of the pandemic was studied with emphasis on attributes such as confirmed, recovered, and death cases, using data extracted from UNDP and WHO records as of December 12, 2020. The study shows that IT has grown steadily over the past nine (9) months, with an appreciable increase in teledensity, infodensity, and compudensity.
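As a rough illustration of the kind of case-count analysis summarized above, the hedged Python sketch below aggregates confirmed, recovered, and death cases from a country-level CSV export. The file name who_covid_cases.csv and its column names are assumptions made for illustration only; they are not the dataset actually used in the paper.

```python
# Hedged sketch: summing confirmed/recovered/death cases per country from a
# hypothetical WHO/UNDP-style CSV export (file name and columns are assumed).
import csv
from collections import defaultdict

def summarize_cases(path="who_covid_cases.csv"):
    """Return per-country totals of confirmed, recovered, and death cases."""
    totals = defaultdict(lambda: {"confirmed": 0, "recovered": 0, "deaths": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            country = row["country"]              # assumed column name
            for key in ("confirmed", "recovered", "deaths"):
                totals[country][key] += int(row[key] or 0)
    return dict(totals)

if __name__ == "__main__":
    summary = summarize_cases()
    print("Nigeria as of the export date:", summary.get("Nigeria", {}))
```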
Wireless Sensor Networks (WSNs) are a developing area of research. As sensor network applications grow, monitoring data volume, monitoring the health of wireless sensor nodes, and manipulating and representing data present a range of challenges and have become critical components of sensor networks. WSNs, which consist of spatially distributed, self-configurable sensors, meet these requirements well. Since running real experiments is expensive and time-consuming, simulation is essential for studying WSNs: it is the usual way to test new applications and protocols in the field, but it requires a realistic model based on sound assumptions and a suitable framework to ease implementation. Moreover, simulation results depend on the particular scenario under investigation (environment), the hardware, and the physical layer, which are usually not modelled precisely enough to capture the real behaviour of a WSN, thereby putting the validity of the results at risk. In addition, because of the large number of nodes that must be simulated, depending on the application, candidate simulators must be evaluated for scalability and performance. The goal of this study is to present a detailed review of various simulation tools for WSNs to support further research in the field.
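To make the scalability concern above concrete, here is a minimal, hedged sketch of a discrete-event WSN simulation in plain Python. It is an illustrative toy model, not one of the surveyed simulators; the node counts, reporting period, and lossless delivery to a single sink are assumptions chosen only to show how event load grows with the number of nodes.

```python
# Toy discrete-event sketch: N sensor nodes periodically report to one sink.
import heapq
import random

def simulate(num_nodes=100, sim_time=60.0, period=5.0):
    """Count readings delivered to the sink during sim_time seconds."""
    events = []  # min-heap of (time, node_id)
    for node in range(num_nodes):
        heapq.heappush(events, (random.uniform(0, period), node))
    delivered = 0
    while events:
        t, node = heapq.heappop(events)
        if t > sim_time:
            break
        delivered += 1                                # sink receives a reading
        heapq.heappush(events, (t + period, node))    # schedule next reading
    return delivered

if __name__ == "__main__":
    # Doubling the node count roughly doubles the event load, which is
    # where simulator scalability and performance become a concern.
    for n in (100, 1000, 10000):
        print(n, "nodes ->", simulate(num_nodes=n), "readings delivered")
```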
The design of a real-time operating system (RTOS) is quite critical, particularly when specialized systems choose to use it, since an RTOS must support multiple requirements, including clustering, cohesion, and alternative programs. This paper reviews many publications to track the efficiency of an RTOS under the various constraints to which it is exposed. The research concentrates on a comparative analysis of RTOS models across different computing devices and operating systems. The collected publications were also subjected to a rigorous analysis, identifying many variables that affect device characteristics. The statistics and findings are equally relevant for encouraging a more targeted approach to RTOS adoption. In this analysis, clustering and performance were classified across all applications as the most important RTOS criteria, while alternative programs were viewed as the least significant. The selection of parameters is therefore a major problem to contend with.
Linux is a well-known operating system (OS) in today's world. It has not only made a name for itself but has also created a whole market of its own, and it now competes with Microsoft Windows, a giant of the computing world. Linux is also used in many machines other than desktop computers, for example servers, routers, and automation controllers. Every operating system has always needed, and has always had, a way of securing the data entrusted to it; commonly, this kind of security is provided by a firewall. Firewalls offer different settings to ensure maximum security and to defend against being hacked or, in the case of a server, against a Denial-of-Service (DoS) attack. Linux provides such security in the form of a firewall that filters data packets using rules created by the administrator. The component that handles this filtering is Netfilter/iptables: iptables classifies packets through different hooks, which fall into three main chains, INPUT, FORWARD, and OUTPUT. This study discusses the implementations that have been devised and, in particular, those that provide a notable improvement in securing and handling data packets at a faster pace when dealing with a large number of packet-handling rules.
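For readers unfamiliar with the three chains mentioned above, the following hedged Python sketch applies a minimal filter-table rule set via the iptables command. The specific rules are illustrative assumptions, require root privileges, and are not the rule sets or implementations evaluated in the paper.

```python
# Hedged sketch: a minimal policy for the INPUT, FORWARD, and OUTPUT chains,
# applied through the iptables command. Illustrative only; requires root.
import subprocess

RULES = [
    ["-P", "INPUT", "DROP"],                      # default: drop inbound traffic
    ["-P", "FORWARD", "DROP"],                    # default: drop forwarded traffic
    ["-P", "OUTPUT", "ACCEPT"],                   # allow outbound traffic
    ["-A", "INPUT", "-i", "lo", "-j", "ACCEPT"],  # allow loopback
    ["-A", "INPUT", "-m", "conntrack",
     "--ctstate", "ESTABLISHED,RELATED", "-j", "ACCEPT"],            # allow replies
    ["-A", "INPUT", "-p", "tcp", "--dport", "22", "-j", "ACCEPT"],   # allow SSH
]

def apply_rules():
    """Apply each rule; raises CalledProcessError if iptables rejects one."""
    for rule in RULES:
        subprocess.run(["iptables", *rule], check=True)

if __name__ == "__main__":
    apply_rules()
    print("Applied", len(RULES), "iptables rules")
```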
We have conducted a survey of the field of cluster computing with the intention of examining its applications in data centres, along with its advantages and disadvantages. Although cluster computing is not a novel field, this brief yet comprehensive survey is intended to help novice users gain a fair idea of data centres, cluster computing, its applications, and its advantages and disadvantages. A thorough study of the available resources, i.e., journals, conferences, and websites, was carried out to support the conclusions. A number of free COTS (commercial off-the-shelf) software packages are also available for building clusters in any given environment. There is still a need for more middleware software for cluster computing to cope with heterogeneous hardware environments.