An Effective Feature Selection Technique for Mining High Dimensional Data on Bigdata

K. Bhaskar Naik*, S. P. Sindhuja**
* Assistant Professor, Department of Computer Science and Engineering, Sree Vidyanikethan Engineering College, Tirupati, India.
** PG Scholar, Department of Computer Science and Engineering, Sree Vidyanikethan Engineering College, Tirupati, India.
Periodicity: November - January 2016

Abstract

In recent years, many research innovations have come to the fore in the area of big data analytics. Advanced analysis of big data streams is bound to become a key area of data mining research as the number of applications requiring such processing increases. Big data sets are now collected in many fields, e.g., finance, business, medical systems, the Internet and other scientific research. These data sets grow rapidly in size, as they are often generated in the form of incoming streams. Feature selection has been used to lighten the processing load in inducing a data mining model, but mining high dimensional data remains a tough task due to the exponential growth of its size. This paper aims to compare two algorithms, namely Particle Swarm Optimization (PSO) and the FAST algorithm, in the feature selection process. The FAST algorithm is applied to reduce irrelevant and redundant features while streaming high dimensional data, which further improves analytical accuracy within a reasonable processing time.
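
To make the idea concrete, below is a minimal sketch (in Python) of a minimum-spanning-tree-based feature selection procedure in the spirit of FAST. It is not the authors' implementation: the use of symmetric uncertainty (SU) as the relevance/redundancy measure, the relevance threshold, the (1 - SU) edge weights, the edge-cutting rule, and the helper names symmetric_uncertainty and fast_like_selection are all assumptions made for illustration, and feature columns are assumed to be discrete (or pre-discretized).

# Minimal sketch of MST-based feature selection in the spirit of FAST.
# Assumptions (not taken from the paper): symmetric uncertainty (SU) as the
# relevance/redundancy measure, (1 - SU) edge weights, and a simple
# edge-cutting rule; inputs are discrete (integer-coded) feature columns.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

def symmetric_uncertainty(x, y):
    # SU(X, Y) = 2 * I(X; Y) / (H(X) + H(Y)) for discrete x, y.
    xv, xi = np.unique(x, return_inverse=True)
    yv, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xv), len(yv)))
    np.add.at(joint, (xi, yi), 1.0)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    hx = -np.sum(px[px > 0] * np.log2(px[px > 0]))
    hy = -np.sum(py[py > 0] * np.log2(py[py > 0]))
    hxy = -np.sum(pxy[pxy > 0] * np.log2(pxy[pxy > 0]))
    return 0.0 if hx + hy == 0 else 2.0 * (hx + hy - hxy) / (hx + hy)

def fast_like_selection(X, y, relevance_threshold=0.05):
    n_features = X.shape[1]
    # 1) Relevance filtering: drop features weakly related to the class.
    su_class = np.array([symmetric_uncertainty(X[:, i], y) for i in range(n_features)])
    relevant = np.where(su_class > relevance_threshold)[0]
    if relevant.size == 0:
        return []
    # 2) Pairwise SU among relevant features; build an MST over (1 - SU)
    #    so that strongly correlated features become adjacent in the tree.
    k = relevant.size
    su_pair = np.zeros((k, k))
    dist = np.zeros((k, k))
    for a in range(k):
        for b in range(a + 1, k):
            su = symmetric_uncertainty(X[:, relevant[a]], X[:, relevant[b]])
            su_pair[a, b] = su_pair[b, a] = su
            # small floor keeps zero-distance edges visible to the sparse MST
            dist[a, b] = dist[b, a] = max(1.0 - su, 1e-9)
    mst = minimum_spanning_tree(dist).toarray()
    # 3) Cut tree edges where the two features are less related to each other
    #    than each is to the class; the remaining connected components form
    #    redundancy clusters.
    adj = np.zeros((k, k))
    for a, b in zip(*np.nonzero(mst)):
        if su_pair[a, b] >= min(su_class[relevant[a]], su_class[relevant[b]]):
            adj[a, b] = adj[b, a] = 1.0
    n_comp, labels = connected_components(adj, directed=False)
    # 4) Select the most class-relevant feature from each cluster.
    selected = []
    for c in range(n_comp):
        members = relevant[labels == c]
        selected.append(int(members[np.argmax(su_class[members])]))
    return sorted(selected)

Continuous features would need to be discretized (e.g., by equal-width binning) before calling fast_like_selection(X, y), which returns the indices of the selected feature columns.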

Keywords

Feature Selection, Minimum Spanning Tree, Classification

How to Cite this Article?

Naik, K. B., and Sindhuja, S. P. (2016). An Effective Feature Selection Technique for Mining High Dimensional Data on Bigdata. i-manager’s Journal on Cloud Computing, 3(1), 18-23.
