Feature Selection based on Hybrid Binary Cuckoo Search and Rough Set Theory in Classification for Nominal Datasets

Full Text (PDF, 652KB), pp.63-72

Author(s)

Ahmed F. Alia 1,*, Adel Taweel 2

1. An-Najah National University, Palestine (Master of Computing, Birzeit University, Palestine)

2. Department of Computer Science, Birzeit University, Palestine

* Corresponding author.

DOI: https://doi.org/10.5815/ijitcs.2017.04.08

Received: 20 Apr. 2016 / Revised: 23 Aug. 2016 / Accepted: 18 Nov. 2016 / Published: 8 Apr. 2017

Index Terms

Feature Selection, Rough Set Theory, Cuckoo Search, Binary Cuckoo Search, Nature Inspired Algorithms, Meta-heuristic Algorithms

Abstract

Feature Selection (FS) is the process of finding a minimal subset of features from the original data by removing redundant and irrelevant features, with the aim of improving the efficiency of classification algorithms. Rough Set Theory (RST) is an effective approach to feature selection: it evaluates candidate feature subsets using the dependency measure, but it relies on a complete search over all subsets of features, which is computationally expensive and may be infeasible for large datasets. Therefore, meta-heuristic algorithms, especially Nature Inspired Algorithms (NIAs), have been widely used to replace the reduction part of RST. This paper develops a new feature selection algorithm based on a hybrid of Binary Cuckoo Search and rough set theory for classification on nominal datasets. The developed algorithm is evaluated on five nominal datasets from the UCI repository against a number of similar NIA-based algorithms. The results show that our algorithm achieves better feature selection than two well-known NIAs, in fewer iterations, without significantly reducing classification accuracy.
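To make the evaluation step concrete, the following minimal Python sketch (an illustration, not the paper's exact algorithm) uses the rough-set dependency degree gamma_B(D) = |POS_B(D)| / |U| as the fitness of a simple binary cuckoo search over 0/1 feature masks. The nest count, Levy-flight parameters, sigmoid binarisation, the 0.9/0.1 fitness weighting, and the synthetic data at the end are assumptions chosen for readability.

# Minimal sketch: rough-set dependency degree as the fitness of a binary
# cuckoo search over feature subsets (illustrative assumptions throughout).
import math
from collections import defaultdict

import numpy as np


def dependency_degree(X, y, mask):
    """gamma_B(D) = |POS_B(D)| / |U| for the feature subset selected by mask."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return 0.0
    groups = defaultdict(list)                  # equivalence classes of IND(B)
    for i, row in enumerate(X[:, cols]):
        groups[tuple(row)].append(y[i])
    # a class lies in the positive region iff all its objects share one decision
    pos = sum(len(d) for d in groups.values() if len(set(d)) == 1)
    return pos / len(y)


def fitness(X, y, mask, w=0.9):
    """Reward dependency, lightly reward smaller subsets (assumed weighting)."""
    return w * dependency_degree(X, y, mask) + (1 - w) * (1 - mask.sum() / mask.size)


def levy_flight(shape, beta=1.5, rng=None):
    """Levy-stable step sizes via Mantegna's algorithm."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = rng.normal(0.0, sigma, shape)
    v = rng.normal(0.0, 1.0, shape)
    return u / np.abs(v) ** (1 / beta)


def binary_cuckoo_search(X, y, n_nests=10, n_iter=50, pa=0.25, step=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    binarise = lambda p: (rng.random(p.shape) < 1 / (1 + np.exp(-p))).astype(int)
    pos = rng.uniform(-4, 4, (n_nests, n_feat))      # continuous nest positions
    nests = binarise(pos)                            # 0/1 feature masks
    fit = np.array([fitness(X, y, m) for m in nests])
    for _ in range(n_iter):
        # new candidate nests via Levy flights biased toward the current best
        best = pos[fit.argmax()]
        cand = pos + step * levy_flight((n_nests, n_feat), rng=rng) * (pos - best)
        cand_nests = binarise(cand)
        cand_fit = np.array([fitness(X, y, m) for m in cand_nests])
        better = cand_fit > fit
        pos[better], nests[better], fit[better] = cand[better], cand_nests[better], cand_fit[better]
        # abandon a fraction pa of the worst nests and re-seed them at random
        n_bad = int(pa * n_nests)
        worst = fit.argsort()[:n_bad]
        pos[worst] = rng.uniform(-4, 4, (n_bad, n_feat))
        nests[worst] = binarise(pos[worst])
        fit[worst] = [fitness(X, y, m) for m in nests[worst]]
    best = fit.argmax()
    return nests[best], fit[best]


# Toy usage on synthetic nominal data (a stand-in for a UCI nominal dataset).
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.integers(0, 3, size=(120, 8))   # 8 nominal features coded 0..2
    y = (X[:, 0] + X[:, 2]) % 2             # decision depends on features 0 and 2
    mask, score = binary_cuckoo_search(X, y)
    print("selected features:", np.flatnonzero(mask), "fitness:", round(score, 3))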

Cite This Paper

Ahmed F. Alia, Adel Taweel, "Feature Selection based on Hybrid Binary Cuckoo Search and Rough Set Theory in Classification for Nominal Datasets", International Journal of Information Technology and Computer Science (IJITCS), Vol.9, No.4, pp.63-72, 2017. DOI:10.5815/ijitcs.2017.04.08
