The process of reconstructing or regenerating a voice from a source sample, or of modifying a source voice toward a desired target voice, is called synthetic voice generation, artificial voice, or voice conversion. The conventional remedy is based on training and applying conversion functions, which generally requires a suitable amount of pre-stored training data from both the source and the target speaker. This paper deals with the crucial issue of achieving the required prosody, timbre, and other distinctive voice characteristics while considerably reducing the dependence on the voice training dataset. We needed a way to obtain templates of the voice to be achieved that are nearly the same parametrically; this is achieved by assigning a marker to the target voice sample used for training. Only with such marked data can the transformation function be estimated properly, and the remaining conversion can be carried out with pre-existing methods. In a nutshell, we propose a system that reaches considerable closeness to the target voice even when the training dataset is scarce. A disadvantage is that higher precision and closer resemblance require a clear understanding of the spelling system of the language in use.
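As an illustration only, and not the method proposed in the paper, the sketch below fits a simple linear transformation between parametrically aligned source and target feature frames; the feature dimension and the synthetic data are assumptions made for the example.

```python
import numpy as np

# Hypothetical illustration: given parametrically aligned feature frames
# (e.g., MFCC-like vectors) from marked source/target templates, estimate a
# linear conversion function W that maps source frames toward the target.
rng = np.random.default_rng(0)
src = rng.normal(size=(200, 13))                       # stand-in source frames
tgt = src @ rng.normal(size=(13, 13)) + 0.1 * rng.normal(size=(200, 13))

# Least-squares estimate of the transformation (with a bias term).
X = np.hstack([src, np.ones((len(src), 1))])
W, *_ = np.linalg.lstsq(X, tgt, rcond=None)

converted = X @ W                                      # frames mapped toward the target
print("mean frame error:", np.mean(np.abs(converted - tgt)))
```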
A Content Distribution Network (CDN) is a collection of a large number of servers deployed across different areas of the Internet. A CDN serves end users with high content availability and high performance. It mainly supports applications such as e-commerce, live streaming media, and on-demand streaming media, in which end users operate with very low data rates, varying communication overhead, and low tolerance for high latency. Existing work uses a single signing/verification operation, via a chaining technique, to sign/verify multiple packets, with the signature generated using the Feige-Fiat-Shamir signature scheme. The proposed work implements a security mechanism known as trapdoor-hash-based signature amortization. It authenticates individual data blocks in a stream, and the signature is generated using the DL-SA signature scheme. The proposed technique provides high tolerance to the loss of intermediate blocks, higher signing and verification rates, and limited communication overhead.
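To illustrate the amortization idea in general terms, the sketch below hash-chains a stream of blocks and signs only the chain head, so one signing operation covers all blocks. The HMAC "signature" is a stand-in for the Feige-Fiat-Shamir or DL-SA schemes named above, and all names in the sketch are hypothetical.

```python
import hashlib, hmac

SECRET = b"demo-key"  # stand-in for a real signing key (FFS/DL-SA in the paper)

def sign(data: bytes) -> bytes:
    # Placeholder for one expensive signature, amortized over many blocks.
    return hmac.new(SECRET, data, hashlib.sha256).digest()

def amortized_sign(blocks):
    """Hash-chain the blocks back to front, then sign only the chain head."""
    chained = b""
    digests = []
    for block in reversed(blocks):
        chained = hashlib.sha256(block + chained).digest()
        digests.append(chained)
    digests.reverse()
    return sign(digests[0]), digests       # one signature covers the whole stream

blocks = [b"block-%d" % i for i in range(4)]
signature, digests = amortized_sign(blocks)

# A receiver verifies the single signature once, then checks each arriving
# block against the chained digests instead of verifying a signature per block.
assert hmac.compare_digest(signature, sign(digests[0]))
```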
Clustering groups data objects that are similar to one another so that they can be treated collectively as one group. The model-based clustering approach uses a model for clustering and optimizes the fit between the data and the model. Evolutionary algorithms have the ability to search the parameter space thoroughly, providing an approach that is inherently more robust with respect to local maxima. The EvolvExpectation-Maximization (EvolvEM) algorithm combines Expectation-Maximization with a genetic algorithm to cluster data and shows higher efficiency than EM clustering alone. Its drawbacks are a higher execution time and a larger number of required parameters. In the proposed approach, bee colony optimization is combined with the Expectation-Maximization algorithm in place of the genetic algorithm in order to improve execution time and clustering efficiency. Hence, it can be used efficiently for clustering.
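For reference, the following is a minimal sketch of plain EM for a one-dimensional, two-component Gaussian mixture, i.e. the baseline that an evolutionary or bee-colony hybrid would seed and refine; the synthetic data and initial parameter values are assumptions made for the example.

```python
import numpy as np

# Minimal 1-D, two-component Gaussian-mixture EM sketch.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 150), rng.normal(3, 0.5, 150)])

mu, sigma, pi = np.array([-1.0, 1.0]), np.array([1.0, 1.0]), np.array([0.5, 0.5])

def gauss(x, mu, sigma):
    # Component densities evaluated for every data point (shape: n_points x 2).
    return np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

for _ in range(50):
    # E-step: posterior responsibility of each component for each point.
    resp = pi * gauss(x, mu, sigma)
    resp /= resp.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixture weights, means, and standard deviations.
    nk = resp.sum(axis=0)
    pi = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print("means:", mu, "stds:", sigma, "weights:", pi)
```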
Grid computing offers the network large-scale computing resources. Load balancing is effective for balancing the load across large-scale heterogeneous grid resources that are typically owned by different organizations. Not all techniques provide the same benefit to users in utilizing the resources with a quick response time; similarly, the profit earned by resource providers differs across load-balancing techniques. We survey the load-balancing and job-migration techniques used in grid computing from its inception until 2013, discuss their advantages and disadvantages, and analyze their suitability for a dynamic grid environment. To the best of our knowledge, no such survey has been conducted in the literature so far. A comparative study of some of these techniques, along with their pitfalls in a huge distributed environment such as a grid, is presented in this paper. We also propose an efficient hierarchical load-balancing algorithm to close the existing gaps.
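The following is a minimal sketch of a two-level (hierarchical) dispatch policy that picks the least-loaded cluster and then its least-loaded node; it illustrates the general idea only, under assumed names and structure, and is not the algorithm proposed in the paper.

```python
# Illustrative two-level (hierarchical) load-balancing sketch: a grid-level
# dispatcher picks the least-loaded cluster, which then picks its least-loaded node.
class Cluster:
    def __init__(self, name, nodes):
        self.name = name
        self.nodes = {n: 0.0 for n in nodes}        # node -> current load

    @property
    def load(self):
        # Average load reported upward to the grid-level dispatcher.
        return sum(self.nodes.values()) / len(self.nodes)

    def dispatch(self, job_cost):
        node = min(self.nodes, key=self.nodes.get)  # least-loaded node
        self.nodes[node] += job_cost
        return node

def grid_dispatch(clusters, job_cost):
    cluster = min(clusters, key=lambda c: c.load)   # least-loaded cluster first
    return cluster.name, cluster.dispatch(job_cost)

clusters = [Cluster("A", ["a1", "a2"]), Cluster("B", ["b1", "b2", "b3"])]
for cost in [3, 1, 4, 2, 5]:
    print(grid_dispatch(clusters, cost))
```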
The expansion of the Internet has made web applications a part of everyday life, and the number of incidents that exploit web application vulnerabilities is increasing day by day. Due to the growth of networks and the Internet, many offline services have been moved online, and most online services today consist of web services. The ability to access the web from any place at any time is a great advantage; however, as the popularity of the web increases, so do attacks against it. Most attacks on the web target vulnerabilities in web applications. Cross-Site Scripting (XSS) attacks are a type of injection problem in which malicious scripts are injected into trusted web sites. These vulnerabilities are researched and analyzed by OWASP (the Open Web Application Security Project) [1], which tracks the most common failures on websites. This paper surveys the most prominent existing XSS-related issues and the detection/prevention techniques and tools proposed in the last decade.
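As a minimal illustration of a standard XSS prevention step, the sketch below HTML-encodes untrusted input before reflecting it into a page, so injected script markup is rendered as text rather than executed; the input string is an assumed example.

```python
import html

# Encode untrusted input before inserting it into HTML output.
untrusted = '<script>alert("XSS")</script>'

page = "<p>Hello, {}</p>".format(html.escape(untrusted, quote=True))
print(page)
# -> <p>Hello, &lt;script&gt;alert(&quot;XSS&quot;)&lt;/script&gt;</p>
```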