Viral Hepatitis Detection by Artificial Bee Colony Optimization and Tree Classifier

$49

Description

Viral hepatitis is inflammation of the liver caused by a viral infection. In 2013, about 1.5 million people died from viral hepatitis, most of them from hepatitis B and hepatitis C. East Asia is the region of the world most affected.

This MATLAB code presents several models for hepatitis virus detection.

In the first group of models, a single classifier is used on its own (SVM or decision tree).
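As a rough illustration of the single-classifier baseline, the sketch below trains a one-level decision stump, a minimal stand-in for the decision tree used in the MATLAB code (the function names, 0/1 labels, and data here are illustrative assumptions, not taken from the original package):

```python
def train_stump(X, y):
    """Fit a one-level decision stump on 0/1 labels: pick the
    (feature, threshold, sign) split with the fewest training errors."""
    best = None  # (errors, feature index, threshold, sign)
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            for sign in (1, -1):
                pred = [1 if sign * (row[j] - t) >= 0 else 0 for row in X]
                errors = sum(p != lab for p, lab in zip(pred, y))
                if best is None or errors < best[0]:
                    best = (errors, j, t, sign)
    return best[1:]

def stump_predict(model, row):
    """Predict 1 if the chosen feature is on the positive side of the split."""
    j, t, sign = model
    return 1 if sign * (row[j] - t) >= 0 else 0
```

A full decision tree simply applies this split search recursively to each resulting subset; the stump is enough to show the mechanics.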

In the second group, an evolutionary algorithm is combined with the classifiers for wrapper feature selection (binary genetic algorithm feature selection with SVM, binary genetic algorithm feature selection with decision tree, ABC feature selection with SVM, ABC feature selection with decision tree).
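A wrapper selector of this kind can be sketched as a binary genetic algorithm whose individuals are feature masks and whose fitness function scores each mask with a classifier. This is a simplified illustration, not the MATLAB implementation; `fitness` is a user-supplied callback (e.g. cross-validated SVM or decision-tree accuracy on the masked features), and the operator choices are assumptions of the sketch:

```python
import random

def ga_feature_selection(X, y, fitness, n_iter=30, pop_size=10, p_mut=0.1, seed=0):
    """Minimal binary GA: each individual is a 0/1 feature mask;
    fitness(mask, X, y) returns a score to maximize."""
    rng = random.Random(seed)
    d = len(X[0])  # number of features (assumed >= 2)
    pop = [[rng.randint(0, 1) for _ in range(d)] for _ in range(pop_size)]
    for _ in range(n_iter):
        scored = sorted(pop, key=lambda m: fitness(m, X, y), reverse=True)
        elite = scored[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, d)            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]  # bit-flip mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=lambda m: fitness(m, X, y))
```

The ABC variant differs only in how new candidate masks are generated (employed/onlooker/scout bee phases instead of crossover and mutation); the mask-plus-classifier fitness idea is the same.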

Then, filter feature selection and the evolutionary algorithm are combined for better feature selection (GA SVM Gini, GA SVM Kruskal-Wallis, GA SVM Fisher, GA Decision Tree Gini, GA Decision Tree Kruskal-Wallis, GA Decision Tree Fisher, BCO SVM Gini, BCO SVM Kruskal-Wallis, BCO SVM Fisher, BCO Decision Tree Gini, BCO Decision Tree Kruskal-Wallis, BCO Decision Tree Fisher).
The filter feature selection methods used are Gini, Fisher, and Kruskal-Wallis.
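The Fisher score filter, for instance, ranks each feature by its between-class scatter relative to its within-class scatter, so the evolutionary search can be restricted to highly ranked features. A minimal sketch (illustrative Python, not the shipped MATLAB):

```python
def fisher_score(X, y):
    """Fisher score per feature: sum_c n_c*(mu_cj - mu_j)^2 / sum_c n_c*var_cj."""
    classes = sorted(set(y))
    n = len(y)
    scores = []
    for j in range(len(X[0])):
        col = [row[j] for row in X]
        mu = sum(col) / n                         # overall mean of feature j
        num = den = 0.0
        for c in classes:
            vals = [row[j] for row, lab in zip(X, y) if lab == c]
            nc = len(vals)
            mc = sum(vals) / nc                   # class mean
            var = sum((v - mc) ** 2 for v in vals) / nc  # class variance
            num += nc * (mc - mu) ** 2
            den += nc * var
        scores.append(num / den if den > 0 else 0.0)
    return scores
```

Higher scores mean better class separation; Gini index and Kruskal-Wallis rankings plug into the same slot.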

The input data are from the UCI Machine Learning Repository (https://archive.ics.uci.edu/ml/datasets/Hepatitis).
The best performance was achieved by BCO Decision Tree Fisher.

The outputs of the code are the following performance measures:

Accuracy

FNR

FPR

Precision

Recall

F_measure
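All of these metrics follow from the binary confusion-matrix counts (TP, FP, FN, TN). A small reference sketch of the definitions (the function name is illustrative):

```python
def evaluate(tp, fp, fn, tn):
    """Compute the reported metrics from binary confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)               # sensitivity / true positive rate
    return {
        'Accuracy':  (tp + tn) / (tp + fp + fn + tn),
        'FNR':       fn / (fn + tp),      # miss rate
        'FPR':       fp / (fp + tn),      # false alarm rate
        'Precision': precision,
        'Recall':    recall,
        'F_measure': 2 * precision * recall / (precision + recall),
    }
```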

References

[1] Karaboga, Dervis, and Bahriye Basturk. “A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm.” Journal of Global Optimization 39.3 (2007): 459-471.

[2] Akay, Bahriye, and Dervis Karaboga. “A survey on the applications of an artificial bee colony in signal, image, and video processing.” Signal, Image and Video Processing 9.4 (2015): 967-990.

[3] Suykens, Johan A.K., and Joos Vandewalle. “Least squares support vector machine classifiers.” Neural Processing Letters 9.3 (1999): 293-300.

[4] Furey, Terrence S., et al. “Support vector machine classification and validation of cancer tissue samples using microarray expression data.” Bioinformatics 16.10 (2000): 906-914.

[5] Yadav, Sanjeev Kumar, Amit Bhola, and Arvind Kumar Tiwari. “Classification of enzyme functional classes and subclasses using support vector machine.” Futuristic Trends on Computational Analysis and Knowledge Management (ABLAZE), 2015 International Conference on. IEEE, 2015.

[6] Bolaños, R., M. Echeverry, and J. Escobar. “A multiobjective non-dominated sorting genetic algorithm (NSGA-II) for the Multiple Traveling Salesman Problem.” Decision Science Letters 4.4 (2015): 559-568.

[7] Rastogi, Rohit, et al. “Unsupervised Classification of Mixed Data Type of Attributes Using Genetic Algorithm (Numeric, Categorical, Ordinal, Binary, Ratio-Scaled).” Proceedings of the Third International Conference on Soft Computing for Problem Solving. Springer India, 2014.

[8] Johnson, Piers, et al. “Genetic algorithm with logistic regression for prediction of progression to Alzheimer’s disease.” BMC bioinformatics 15.Suppl 16 (2014): S11.

[9] Gini, Corrado. “On the Measure of Concentration with Special Reference to Income and Wealth.” Papers Presented at the Cowles Commission Research Conference on Economics and Statistics (Colorado College Publication, 1936). 1936.

[10] Wei, L. J. “Asymptotic conservativeness and efficiency of Kruskal-Wallis test for k dependent samples.” Journal of the American Statistical Association 76.376 (1981): 1006-1009.

[11] R.O. Duda, P.E. Hart, and D.G. Stork, “Pattern Classification.” Journal of Classification, Volume 24, Issue 2, September 2007, Pages 305-307.

[12] Segen, Jakub. “Feature selection and constructive inference.” Proc. Seventh Int’l Conf. Pattern Recognition. 1984.

[13] Koller, Daphne, and Mehran Sahami. “Toward optimal feature selection.” Proceedings of International Conference on Machine Learning (1996).

[14] Sheinvald, Jacob, Byron Dom, and Wayne Niblack. “A modeling approach to feature selection.” Pattern Recognition, 1990. Proceedings., 10th International Conference on. Vol. 1. IEEE, 1990.

[15] Mucciardi, Anthony N., and Earl E. Gose. “A comparison of seven techniques for choosing subsets of pattern recognition properties.” IEEE Transactions on Computers 9 (1971): 1023-1031.

[16] Schlimmer, J.C., Efficiently inducing determinations: A complete and systematic search algorithm that uses optimal pruning. In: Proceedings of Tenth International Conference on Machine Learning, 284–290, (1993).

[17] Devijver, P.A. and Kittler, J., “Pattern Recognition: A Statistical Approach”. Prentice Hall, 1982.

[18] Doak, Justin. An evaluation of feature selection methods and their application to computer security. University of California, Computer Science, 1992.

[19] KC Tan, EJ Teoh, Q Yu, and KC Goh, “A hybrid evolutionary algorithm for attribute selection in data mining,” Expert Syst. Appl., vol. 36, no. 4, pp. 8616-8630, 2009.

[20] Hongyu Guo and Herna L. Viktor, “Learning from imbalanced data sets with boosting and data generation: the DataBoost-IM approach,” ACM SIGKDD Explorations-Special issue on learning from imbalanced datasets, vol. 6, no. 1, pp. 30-39, 2004.

[21] Jiang, Zheng, et al. “Support vector machine-based feature selection for classification of liver fibrosis grade in chronic hepatitis C.” Journal of medical systems 30.5 (2006): 389-394.

[22] Awad, AbuBakr, et al. “Performance evaluation of decision tree classifiers for the prediction of response to treatment of hepatitis C patients.” Proceedings of the 8th International Conference on Pervasive Computing Technologies for Healthcare. ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering), 2014.

[23] Tan, Kay Chen, et al. “A hybrid evolutionary algorithm for attribute selection in data mining.” Expert Systems with Applications 36.4 (2009): 8616-8630.


SKU: b2017_0043