Feature Selection using a Novel Particle Swarm Optimization and Its Variants

Full Text (PDF, 186KB), PP.16-24


Author(s)

R. Parimala 1,*, R. Nallaswamy 2

1. National Institute of Technology, Tiruchirappalli

2. Department of Mathematics, Tiruchirappalli

* Corresponding author.

DOI: https://doi.org/10.5815/ijitcs.2012.05.03

Received: 6 Aug. 2011 / Revised: 23 Dec. 2011 / Accepted: 13 Feb. 2012 / Published: 8 May 2012

Index Terms

Feature selection, Support Vector Machine, Particle Swarm Optimization

Abstract

Feature selection (FS) has been a keen area of research in classification problems. Most researchers concentrate mainly on statistical measures to select the feature subset. These methods do not provide a suitable solution because the search space grows with the number of features. FS is therefore a very popular area for the application of population-based stochastic techniques. This paper proposes a swarm optimization technique, binary particle swarm optimization (BPSO) and its variants, to select the optimal feature subset. The main task of the BPSO is the selection of the features used by the SVM in the classification of the spambase data set. The results of our experiments show a very strong relation between the number of features and accuracy. Comparison of the optimized and un-optimized results showed that the BPSO-MS method can significantly reduce the computation cost while improving the classification accuracy.
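The binary PSO described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: each particle is a 0/1 mask over the features, real-valued velocities are squashed through a sigmoid into bit probabilities (the standard Kennedy-Eberhart binary PSO update), and a toy fitness function stands in for the SVM cross-validation accuracy on spambase, which would require the data set and an SVM library. The function and parameter names (`bpso_feature_select`, `toy_fitness`, the swarm sizes and coefficients) are illustrative assumptions.

```python
import random
import math

def bpso_feature_select(num_features, fitness, n_particles=10, n_iter=30,
                        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Binary PSO: maximise `fitness` over 0/1 feature masks."""
    rng = random.Random(seed)
    # Initialise bit-mask positions and real-valued velocities.
    pos = [[rng.randint(0, 1) for _ in range(num_features)]
           for _ in range(n_particles)]
    vel = [[rng.uniform(-1, 1) for _ in range(num_features)]
           for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]

    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(num_features):
                r1, r2 = rng.random(), rng.random()
                # Velocity update pulled toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Sigmoid of the velocity gives the probability the bit is 1.
                pos[i][d] = 1 if rng.random() < 1.0 / (1.0 + math.exp(-vel[i][d])) else 0
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit

# Toy stand-in for SVM cross-validation accuracy: reward selecting the
# first five "informative" features, penalise the subset size.
def toy_fitness(mask):
    return sum(mask[:5]) - 0.1 * sum(mask)

best_mask, best_fit = bpso_feature_select(20, toy_fitness)
```

In the paper's setting, `toy_fitness` would be replaced by training an SVM on the features selected by the mask and returning its cross-validation accuracy, so that the swarm jointly favours small subsets and high accuracy.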

Cite This Paper

R. Parimala, R. Nallaswamy, "Feature Selection using a Novel Particle Swarm Optimization and Its Variants", International Journal of Information Technology and Computer Science (IJITCS), vol. 4, no. 5, pp. 16-24, 2012. DOI: 10.5815/ijitcs.2012.05.03

Reference

[1] C.J.C. Burges, "A tutorial on support vector machines for pattern recognition", Data Mining and Knowledge Discovery, 2(2): 121-167, 1998.

[2] V.N. Vapnik, The Nature of Statistical Learning Theory, Springer, Berlin, 1995.

[3] N. Cristianini and J. Shawe-Taylor, "Support Vector and Kernel Methods", in Intelligent Data Analysis: An Introduction, Springer-Verlag, 2003.

[4] N. Cristianini and J. Shawe-Taylor, An Introduction to Support Vector Machines, Cambridge University Press, Cambridge, UK, 2004.

[5] B. Schölkopf, C.J.C. Burges, and A.J. Smola (Eds.), Advances in Kernel Methods: Support Vector Learning, MIT Press, 1998.

[6] A.J. Smola and B. Schölkopf, Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond, MIT Press, Cambridge, MA, 2002.

[7] R.C. Eberhart and J. Kennedy, "A new optimizer using particle swarm theory", Proceedings of the Sixth International Symposium on Micro Machine and Human Science, pp. 39-43, IEEE Service Center, Piscataway, NJ, Nagoya, Japan, 1995.

[8] R.C. Eberhart and Y. Shi, "Particle swarm optimization: developments, applications and resources", Proc. Congress on Evolutionary Computation 2001, IEEE Service Center, Piscataway, NJ, Seoul, Korea, 2001.

[9] Y. Shi and R.C. Eberhart, "Parameter selection in particle swarm optimization", Evolutionary Programming VII: Proc. EP 98, pp. 591-600, Springer-Verlag, New York, 1998.

[10] M. Carvalho and T.B. Ludermir, "Particle swarm optimization of neural network architectures and weights", in Proc. of the 7th Int. Conf. on Hybrid Intelligent Systems, pp. 336-339, 2007.

[11] M. Meissner, M. Schmuker, and G. Schneider, "Optimized particle swarm optimization (OPSO) and its application to artificial neural network training", BMC Bioinformatics, 7, 125, 2006.

[12] J. Yu, L. Xi, and S. Wang, "An improved particle swarm optimization for evolving feedforward artificial neural networks", Neural Processing Letters, 26(3), 217-231, 2007.

[13] J. Salerno, "Using the particle swarm optimization technique to train a recurrent neural model", IEEE International Conference on Tools with Artificial Intelligence, pp. 45-49, 1997.

[14] M. Settles, B. Rodebaugh, and T. Soule, "Comparison of genetic algorithm and particle swarm optimizer when evolving a recurrent neural network", Lecture Notes in Computer Science (LNCS): Vol. 2723, Proc. of the Genetic and Evolutionary Computation Conference, pp. 151-152, 2003.

[15] M.E.H. Pedersen, "Tuning and Simplifying Heuristical Optimization", PhD dissertation, University of Southampton, 2010.

[16] M.E.H. Pedersen and A.J. Chipperfield, "Simplifying particle swarm optimization", Applied Soft Computing, 10(2): 618-628, 2010.

[17] A. Karatzoglou, A. Smola, K. Hornik, and A. Zeileis, "kernlab - Kernel Methods", R package, version 0.6-2, 2005. Available from http://cran.R-project.org.

[18] A. Karatzoglou and I. Feinerer, "Kernel-based machine learning for fast text mining in R", Computational Statistics & Data Analysis, 54(2): 290-297, February 2010.

[19] C.J. van Rijsbergen, Information Retrieval, Butterworths, London, 1979.

[20] S.-W. Lin, K.-C. Ying, S.-C. Chen, and Z.-J. Lee, "Particle swarm optimization for parameter determination and feature selection of support vector machines", Expert Systems with Applications, 35: 1817-1824, 2008.

[21] M.A. Esseghir, G. Goncalves, and Y. Slimani, "Adaptive particle swarm optimizer for feature selection", in Proceedings of the 11th International Conference on Intelligent Data Engineering and Automated Learning (IDEAL 2010), pp. 226-233, Springer-Verlag, 2010.