Adjustive Reciprocal Whale Optimization Algorithm for Wrapper Attribute Selection and Classification


Author(s)

Heba F. Eid 1,*, Azah Kamilah Muda 2

1. Al-Azhar University, Faculty of Science, Cairo, Egypt

2. Faculty of Information and Communication Technology (FTMK), Universiti Teknikal Malaysia Melaka, Malaysia

* Corresponding author.

DOI: https://doi.org/10.5815/ijigsp.2019.03.03

Received: 29 Oct. 2018 / Revised: 12 Dec. 2018 / Accepted: 17 Jan. 2019 / Published: 8 Mar. 2019

Index Terms

Bio-inspired algorithm, Whale Optimization, Reciprocal spiral, Information Gain, Attribute selection, Classification

Abstract

One of the most difficult challenges in machine learning is the data attribute selection process. The main disadvantages of attribute selection based on classical optimization algorithms are local optima stagnation and slow convergence, which makes bio-inspired optimization algorithms a reliable alternative for alleviating these drawbacks. The whale optimization algorithm (WOA) is a recent bio-inspired algorithm that is competitive with other swarm-based algorithms. In this paper, a modified WOA is proposed to enhance the performance of the basic WOA. Furthermore, a wrapper attribute selection algorithm is proposed by integrating information gain as a preprocessing initialization phase. Experimental results on twenty mathematical optimization functions demonstrate the stability and effectiveness of the modified WOA compared to the basic WOA and three other well-known algorithms. In addition, experimental results on nine UCI datasets show the ability of the novel wrapper attribute selection algorithm to select the most informative attributes for classification tasks.
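As a rough illustration of the information-gain preprocessing step mentioned in the abstract, the sketch below ranks categorical attributes by information gain on a toy dataset. The function names and the data are illustrative assumptions, not taken from the paper, and the paper's actual initialization scheme may differ:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy H(Y) of a label sequence, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(values, labels):
    """IG(A) = H(Y) - sum over values v of p(v) * H(Y | A = v)."""
    n = len(labels)
    conditional = 0.0
    for v in set(values):
        subset = [y for a, y in zip(values, labels) if a == v]
        conditional += (len(subset) / n) * entropy(subset)
    return entropy(labels) - conditional

# Toy dataset: rows are samples, columns are attributes.
data = [
    ["sunny", "hot"],
    ["sunny", "mild"],
    ["rain", "mild"],
    ["rain", "hot"],
]
labels = ["no", "no", "yes", "yes"]

# Information gain of each attribute column; higher means more informative.
gains = [information_gain([row[j] for row in data], labels)
         for j in range(len(data[0]))]
```

Here the first attribute perfectly separates the classes (gain of 1 bit) while the second carries no information about the label (gain of 0), so a gain-based initialization would favor the first attribute.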

Cite This Paper

Heba F. Eid, Azah Kamilah Muda, "Adjustive Reciprocal Whale Optimization Algorithm for Wrapper Attribute Selection and Classification", International Journal of Image, Graphics and Signal Processing (IJIGSP), Vol.11, No.3, pp. 18-26, 2019. DOI: 10.5815/ijigsp.2019.03.03
