Detection of DDoS Attack in Cloud Computing and its Prevention: A Systematic Review
Awareness of Cloud Computing among the Prospective Teachers in the Modern World
Enhancing Replica Management in a Cloud Environment using Data Mining Based Dynamic Replication Algorithm
Elasticity in the Cloud Related to Database Autonomies and Scalability
Cloud Computing Rising in the Field of Big Data and Artificial Intelligence
A Comprehensive Review of Security Issues in Cloud Computing
An Extended Min-Min Scheduling Algorithm in Cloud Computing
Data Quality Evaluation Framework for Big Data
An Architectural Framework for Ant Lion Optimization-based Feature Selection Technique for Cloud Intrusion Detection System using Bayesian Classifier
Be Mindful of the Move: A SWOT Analysis of Cloud Computing Towards the Democratization of Technology
GridSim Installation and Implementation Process
Genetic Algorithm Using MapReduce - A Critical Review
Encroachment of Cloud Education for the Present Educational Institutions
A Survey on Energy Aware Job Scheduling Algorithms in Cloud Environment
Clustering based Cost Optimized Resource Scheduling Technique in Cloud Computing
This paper discusses the importance of cloud computing in the media and entertainment (M&E) industry and the challenges that M&E companies face. It also lists a few media companies that have shifted towards the cloud. Finally, it considers the future of cloud computing in this field.
Cloud computing is well known for delivering hosted services over the Internet. These services fall into three main categories: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). Cloud computing provides computer system resources, especially storage and computing power, on demand and without direct active management by the user; the term generally describes data centers available to many users over the Internet. Large clouds are predominant today, and data deduplication is one of the strategies used to eliminate redundant copies of data. This paper proposes preventing the replication of documents and media files such as images using the TinEye algorithm. Deduplication strategies are commonly used on cloud servers to reduce storage consumption. To prevent unauthorized access to data and the creation of duplicate data in the cloud, the data are encrypted before being stored on the cloud server. In the proposed system, CloudMe is used for cloud storage, and all files are encrypted with the AES algorithm before being saved in the cloud. A secure system architecture design provides secure deduplication, giving the cloud the deduplication functionality it needs to eliminate excess storage and bandwidth costs. Thus, the proposed system applies deduplication to image files to eliminate redundant storage and bandwidth costs.
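The workflow the abstract describes — fingerprint content, upload the encrypted blob only if that content has not been seen before — can be sketched as follows. This is our own minimal illustration, not the paper's implementation: `DedupStore` is a hypothetical client-side catalog, exact-match SHA-256 hashing stands in for the TinEye near-duplicate image matching, and a SHA-256 counter-mode keystream stands in for AES so the sketch needs only the standard library.

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Content hash used as the deduplication key (assumption: exact-match
    # dedup; the paper's TinEye step for near-duplicate images is not modeled).
    return hashlib.sha256(data).hexdigest()

def keystream_encrypt(data: bytes, key: bytes) -> bytes:
    # Stand-in for AES: XOR with a SHA-256 counter-mode keystream.
    # Symmetric, so calling it twice with the same key decrypts.
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(x ^ y for x, y in zip(data, out))

class DedupStore:
    """Hypothetical store: keep one ciphertext per unique content."""
    def __init__(self):
        self.blobs = {}    # fingerprint -> ciphertext (one copy per content)
        self.catalog = {}  # filename -> fingerprint (many names may share one)

    def put(self, name: str, data: bytes, key: bytes) -> bool:
        """Return True if a new blob was uploaded, False if deduplicated."""
        fp = fingerprint(data)
        self.catalog[name] = fp
        if fp in self.blobs:   # duplicate content: no upload, no extra storage
            return False
        self.blobs[fp] = keystream_encrypt(data, key)
        return True

store = DedupStore()
key = b"shared-secret"
print(store.put("a.jpg", b"same-bytes", key))  # True  (stored)
print(store.put("b.jpg", b"same-bytes", key))  # False (deduplicated)
print(len(store.blobs))                        # 1
```

Both filenames resolve to the same fingerprint, so the server keeps a single ciphertext — the storage and bandwidth saving the abstract targets.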
Fast peer-to-peer real-time data transfer can be achieved in two ways. One is GOOSE, which has very low latency. GOOSE is a relatively mature technology used in substation automation systems, but its configuration is based on Media Access Control (MAC) addresses, which raises the configuration effort for project implementation, and its communication coverage is limited to a single local area network, which constrains distributed control applications. The other is the User Datagram Protocol (UDP), which is widely used for fast data transmission over Ethernet. Theoretical analysis and test results show that the transmission delay of UDP is no larger than that of SMS transmission when network traffic is below 5 Mbps, which fully satisfies the delay requirements of distributed control applications. UDP packets carry the same priority as the communication data (mostly in TCP transmission mode) from the master station. Because network delay depends strongly on network load, UDP packets may be lost under heavy traffic, in which case fast peer-to-peer real-time data transmission between STUs cannot be guaranteed.
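A minimal sketch of the UDP exchange described above, assuming two hypothetical STU endpoints on localhost (the message format and port handling are our illustration, not the paper's protocol). Note the timeout on the receive call: as the abstract warns, UDP gives no delivery guarantee, so the application layer must bound every wait and decide how to handle loss.

```python
import socket
import threading

def echo_server(sock: socket.socket) -> None:
    # Stand-in for the peer STU: echo one datagram back to its sender.
    data, addr = sock.recvfrom(1024)
    sock.sendto(data, addr)

# Peer endpoint bound to an ephemeral port on localhost.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

# Sending endpoint: fire a datagram and wait (bounded!) for the echo.
client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2.0)  # UDP is unreliable: never wait unboundedly
client.sendto(b"stu-1:measurement=42", ("127.0.0.1", port))
reply, _ = client.recvfrom(1024)
print(reply.decode())  # stu-1:measurement=42
client.close()
server.close()
```

In a real deployment, a lost datagram would surface here as `socket.timeout`, which the sender could answer with a retry — the kind of application-level handling that heavy-traffic conditions make necessary.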
Cloud computing shares data and processing among organizations across remote locations. Although authentication systems are in place, hackers can still access secured data and delete or modify it. Current software-based verification typically follows a one-bit-return protocol: the delete sequence merely reports data destruction, and a failure can cause revenue loss or system catastrophe. In some cases, a record deleted in the cloud can still be salvaged by a hacker. This is especially challenging when the deletion database is sealed within a Trusted Platform Module (TPM) and the user does not have access to the private code. This paper discusses how to clear hidden data using public authentication. The key idea in this proposal is a "trust-but-verify" pattern, which is applicable to many security problems but has been largely overlooked in the area of deleting protected data. The objective is that the deleted record must be purged permanently from the database. Finally, some evidence of the feasibility of SSA is also discussed.
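The contrast between a one-bit "trust me" reply and a trust-but-verify deletion can be illustrated with a small sketch. This is our own hypothetical construction, not the paper's SSA protocol: the server must overwrite the deleted record's storage slot with a pseudorandom pattern derived from a fresh public challenge, and the verifier recomputes that pattern independently — so a server that silently kept the old bytes would fail the check.

```python
import hashlib
import os

def erasure_pattern(challenge: bytes, length: int) -> bytes:
    # Deterministic pseudorandom pattern derived from a public challenge.
    out = bytearray()
    counter = 0
    while len(out) < length:
        out.extend(hashlib.sha256(challenge + counter.to_bytes(4, "big")).digest())
        counter += 1
    return bytes(out[:length])

class Server:
    """Hypothetical storage server holding one record in a fixed slot."""
    def __init__(self, record: bytes):
        self.slot = bytearray(record)

    def delete(self, challenge: bytes) -> None:
        # Honest deletion: overwrite the slot with the challenge-derived
        # pattern, destroying the original bytes in place.
        self.slot[:] = erasure_pattern(challenge, len(self.slot))

    def read_slot(self) -> bytes:
        return bytes(self.slot)

record = b"secret-row-0042"
srv = Server(record)
challenge = os.urandom(16)  # fresh, public, unpredictable challenge
srv.delete(challenge)
# Verification: the slot must equal the recomputed pattern, so the old
# record cannot still occupy it; no one-bit reply is taken on trust.
assert srv.read_slot() == erasure_pattern(challenge, len(record))
assert srv.read_slot() != record
```

Because the challenge is chosen after the deletion request, the server cannot precompute a convincing answer while secretly retaining the data — the essence of trust-but-verify as opposed to the one-bit-return protocol.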
This study explores and discusses emerging challenges posed by new technologies in the context of big data. Power utilities collect large amounts of data; however, because of its sheer size and the ambiguity associated with it, this data is rarely used. Condition monitoring of assets gathers huge volumes of data during routine operations. The question "How do we extract information from a huge amount of data?" and the notion of being "data rich but information poor" have received significant attention from analytics experts since the advent of support vector machines. Along with new technologies such as the Internet of Things (IoT), big data analytics will be actively used in power utilities. This study assesses the issues and points out paths and strategies to make asset management practices smarter for future generations.