Book Details

Hybrid Optimization Feature Selection for Predicting Student Performance

Sri Vasavi College, Erode (Self-Finance Wing), 3rd February 2017. National Conference on Computer and Communication (NCCC'17). International Journal of Computer Science (IJCS), published by SK Research Group of Companies (SKRGC).



Education plays a vital role in shaping society and undergoes continual change. As a result, education-related digital data has been growing rapidly, prompting the application of data mining approaches to educational data, a field known as Educational Data Mining (EDM). The discipline focuses on investigating educational data to build models for enhancing learning experiences and improving institutional effectiveness. In this paper, data mining techniques are used to predict student performance at different educational levels. Irrelevant and redundant features severely affect the accuracy of student-performance classification; therefore, feature selection should detect and eliminate both irrelevant and redundant features as far as possible. A hybrid Artificial Fish Swarm-Cuckoo Search optimization technique is introduced for feature (attribute) selection. After the feature-selection process, two effective classification techniques, Prism and J48, are used to predict student performance. Experimental results show that the proposed feature selection method is highly effective.
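The pipeline described above, wrapper-style feature selection followed by classification, can be sketched in miniature as follows. This is an illustrative Python toy only: a simple population-based subset search stands in for the paper's Artificial Fish Swarm-Cuckoo Search hybrid, and leave-one-out 1-nearest-neighbour accuracy stands in for the Prism/J48 classifiers. The synthetic "student" data, feature names, and search parameters are all assumptions, not the authors' actual method or dataset.

```python
import random

# Toy student records: [attendance, prior_score, noise1, noise2] -> pass(1)/fail(0).
# The last two columns are irrelevant noise; a good wrapper should drop them.
random.seed(0)

def make_record():
    attendance = random.uniform(0, 1)
    prior = random.uniform(0, 1)
    label = 1 if attendance + prior > 1.0 else 0
    return [attendance, prior, random.uniform(0, 1), random.uniform(0, 1)], label

data = [make_record() for _ in range(120)]

def loo_1nn_accuracy(features):
    """Score a feature subset by leave-one-out 1-NN classification accuracy."""
    if not features:
        return 0.0
    correct = 0
    for i, (xi, yi) in enumerate(data):
        best_d, pred = float("inf"), None
        for j, (xj, yj) in enumerate(data):
            if i == j:
                continue
            d = sum((xi[f] - xj[f]) ** 2 for f in features)
            if d < best_d:
                best_d, pred = d, yj
        correct += pred == yi
    return correct / len(data)

def random_subset(n):
    s = [f for f in range(n) if random.random() < 0.5]
    return s or [random.randrange(n)]

def wrapper_search(n_features=4, pop=6, iters=30):
    """Population-based subset search: each iteration replaces the worst
    candidate with a mutation of the best (a crude stand-in for the
    cuckoo-search "abandon worst nests" step)."""
    swarm = [random_subset(n_features) for _ in range(pop)]
    for _ in range(iters):
        swarm.sort(key=loo_1nn_accuracy, reverse=True)
        mutant = sorted(set(swarm[0]) ^ {random.randrange(n_features)})
        swarm[-1] = mutant or [random.randrange(n_features)]
    return max(swarm, key=loo_1nn_accuracy)

best = wrapper_search()
print("selected features:", best, "accuracy:", round(loo_1nn_accuracy(best), 3))
```

In this sketch the classifier's accuracy is the fitness function, which is what makes it a wrapper method: the search evaluates each candidate subset by actually training/testing on it, rather than by a filter statistic such as symmetric uncertainty.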


[1] E. Baker, International Encyclopedia of Education (3rd edition), Oxford, UK: Elsevier, (In Press).

[2] P. Mitra, C. A. Murthy and S. K. Pal. “Unsupervised feature selection using feature similarity,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 3, pp. 301–312, 2002.

[3] A. Miller, Subset Selection in Regression, 2nd ed., Chapman & Hall/CRC, 2002.

[4] H. Almuallim and T. G. Dietterich, "Learning Boolean concepts in the presence of many irrelevant features," Artificial Intelligence, vol. 69, no. 1-2, pp. 279–305, 1994.

[5] D. Koller and M. Sahami, “Toward optimal feature selection,” In Proceedings of the Thirteenth International Conference on Machine Learning, pp. 284–292, 1996.

[6] R. Kohavi and G. H. John, "Wrappers for Feature Subset Selection," Artificial Intelligence, vol. 97, no. 1-2, pp. 273–324, 1997.

[7] M. Dash and H. Liu, "Feature Selection for Classification," Intelligent Data Analysis: An International Journal, vol. 1, no. 3, pp. 131–156, 1997.

[8] W. Duch, T. Winiarski, J. Biesiada and A. Kachel, "Feature Ranking, Selection and Discretization," Int. Conf. on Artificial Neural Networks (ICANN) and Int. Conf. on Neural Information Processing (ICONIP), pp. 251–254, 2003.

[9] P. Langley, “Selection of Relevant Features in Machine Learning,” Proceedings of AAAI Fall Symp. Relevance, pp. 140-144, 1994.

[10] I. Guyon and A. Elisseeff, "An Introduction to Variable and Feature Selection," Journal of Machine Learning Research, vol. 3, pp. 1157–1182, 2003.

[11] M. Ramaswami and R. Bhaskaran, "Student Performance Forecasting: A Study of Classification Models," Intelligent Computing Models, Narosa Publishing House, New Delhi, pp. 38–45, 2009.

[12] M. K. Cope, H. H. Baker, R. Fisk, J. N. Gorby and R. W. Foster, "Prediction of Student Performance on the Comprehensive Osteopathic Medical Examination Level Based on Admission Data and Course Performance," Journal of the American Osteopathic Association, vol. 101, no. 2, pp. 84–90, 2001.

[13] N. T. Nghe, P. Janecek and P. Haddawy, “A Comparative Analysis of Techniques for Predicting Academic Performance,” Paper presented at 37th ASEE/IEEE Frontiers in Education Conference, Milwaukee, WI, October 10 – 13, 2007.

[14] W. R. Veitch, "Identifying Characteristics of High School Dropouts: Data Mining with a Decision Tree Model," Paper Presented at Annual Meeting of the American Educational Research Association, San Diego, CA, 2004 (ERIC Document No. ED490086).

[15] I. H. Witten, E. Frank and M. A. Hall, Data Mining: Practical Machine Learning Tools and Techniques, 3rd ed., Morgan Kaufmann.

[16] M. A. Hall, Correlation-based Feature Selection for Machine Learning, Dept. of Computer Science, University of Waikato.

[17] J. Han and M. Kamber, Data Mining: Concepts and Techniques. San Francisco: Morgan Kaufmann Publishers, 2001.

[18] K. Kira and L. Rendell, "A practical approach to feature selection," in Proceedings of the Ninth International Conference on Machine Learning, pp. 249–256, Morgan Kaufmann, 1992.


Educational Data Mining (EDM), Feature Selection, Symmetric Uncertainty (SU), Classification, Artificial Fish Swarm, Cuckoo Search Optimization.

  • Format Volume 5, Issue 1, No 9, 2017
  • Copyright All Rights Reserved ©2017
  • Year of Publication 2017
  • Author R. Sasiregha, Dr R. Umarani
  • Reference IJCS-198
  • Page No 1210-1216
