A Comparison of Extreme Learning Machines to Assist in Predicting Maternal Mortality and Morbidity with Imbalanced Data

กษมา ดอกดวง
ศรุติ อัศวเรืองสุข
ปิยภัทร โกษาพันธุ์

Abstract

This paper aims to find an algorithm for predicting maternal mortality and morbidity cases, using data from the WHO Global Survey 2007-2008, which are imbalanced. Imbalanced data pose a well-known challenge in data mining, known as class imbalanced learning (CIL): they complicate machine learning and reduce the efficiency of classifiers. We therefore compared several algorithms for handling the imbalanced classification problem. The hybrid synthetic minority over-sampling and Tomek-link under-sampling technique (SMOTE+TL) was used to resample the data before training classification models with C4.5, MLP, Naïve Bayes, and several kinds of Extreme Learning Machine (ELM): the standard ELM, Circular ELM (CELM), q-Gaussian ELM (QELM), and q-Gaussian CELM (QCELM). Performance in classifying maternal mortality and morbidity outcomes was evaluated using Accuracy and G-mean. Our results show that ELM outperformed the other algorithms on Accuracy, while QELM was best on G-mean. These findings may be helpful when performing clinical assessments.
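For readers unfamiliar with the two techniques named in the abstract, the following minimal Python sketch (an illustration under simplifying assumptions, not the authors' implementation) shows the core interpolation step of SMOTE and the G-mean metric; all function names and confusion-matrix counts are hypothetical.

```python
import math
import random

def smote_sample(x, neighbor):
    """SMOTE's core step: create a synthetic minority-class point by
    interpolating between a minority sample x and one of its minority
    neighbors. (Tomek-link removal would then clean borderline pairs.)"""
    gap = random.random()  # uniform in [0, 1)
    return [xi + gap * (ni - xi) for xi, ni in zip(x, neighbor)]

def g_mean(tp, fn, tn, fp):
    """Geometric mean of sensitivity (minority-class recall) and
    specificity (majority-class recall), from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return math.sqrt(sensitivity * specificity)

# Hypothetical counts for an imbalanced test set: overall accuracy can
# look high while minority recall is poor, which G-mean exposes.
print(round(g_mean(tp=8, fn=2, tn=90, fp=10), 4))  # → 0.8485
```

Unlike Accuracy, G-mean drops toward zero whenever either class is classified poorly, which is why the abstract reports both metrics for this imbalanced problem.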

Article Details

Section
Research Article

References

[1] World Health Organization (WHO). 1993. International Statistical Classification of Diseases and Related Health Problems: 10th Revision.
[2] World Health Organization (WHO). 1996. Maternal Health and Safe Motherhood Program. Revised 1990 Estimates of Maternal Mortality: A New Approach by WHO and UNICEF. Geneva: WHO.
[3] Walker G.J., Ashley D.E., McCaw A.M., and Bernard G.W. 1986. Maternal mortality in Jamaica. The Lancet. 327(8479), 486-488.
[4] Chawla N.V., Bowyer K.W., Hall L.O., and Kegelmeyer W.P. 2002. "SMOTE: Synthetic Minority Over-sampling Technique." Journal of Artificial Intelligence Research 16, 321-357.
[5] Tomek I. 1976. An experiment with the edited nearest neighbor rule. IEEE Transactions on Systems, Man, and Cybernetics, 6, 448-452.
[6] Sangaroon S. and Wattana M. 2015. A comparison of data mining technique for risk classification model of maternal mortality and morbidity from WHO global survey 2007-2008. Paper presented at National Computer Science and Engineering Conference 2015, Chiang Mai University, Thailand, 23-26 Nov 2015.
[7] Huang G.-B., Zhu Q.-Y., and Siew C.-K. 2006. "Extreme learning machine: Theory and applications." Neurocomputing 70(1-3), 489-501.
[8] Huang G.-B., Wang D.H., and Lan Y. 2011. “Extreme learning machines: A survey.” International Journal of Machine Learning and Cybernetics. 2(2), 107–122.
[9] Huang G.-B., Wang D.H., and Lan Y. 2012. Extreme learning machines for regression and multiclass classification. IEEE Transactions on Systems, Man, and Cybernetics, 42(2), 513-519.
[10] Padmavati J. 2011. "A comparative study on breast cancer prediction using RBF and MLP." International Journal of Scientific & Engineering Research 2(1), 1-5.
[11] Witten I.H. and Frank E. 2005. Data Mining: Practical Machine Learning Tools and Techniques. 2nd edition. Morgan Kaufmann, San Francisco.
[12] Batista G.E.A.P.A., Prati R.C., and Monard M.C. 2004. A study of the behavior of several methods for balancing machine learning training data. ACM SIGKDD Explorations Newsletter 6(1), 20-29.
[13] Quinlan, J. R. 1992. C4.5: programs for machine learning. Morgan Kaufmann Publishers Inc., San Francisco, CA, USA.
[14] Tang J., Deng C., and Huang G.-B. 2015. Extreme learning machine for multilayer perceptron. IEEE Transactions on Neural Networks and Learning Systems, 27(4), 809-821.
[15] John G.H. and Langley P. 1995. Estimating continuous distributions in Bayesian classifiers. In: Proceedings of the 11th Conference on Uncertainty in Artificial Intelligence, pp. 338-345. Morgan Kaufmann Publishers Inc.
[16] Decherchi S., Gastaldo P., Zunino R., Cambria E., and Redi J. 2013. Circular-ELM for the reduced-reference assessment of perceived image quality. Neurocomputing 102, 78-89.
[17] Ridella S., Rovetta S. and Zunino R. 1997. Circular Backpropagation Networks for Classification. IEEE Transactions on Neural Networks, 8, 81-97.
[18] Gastaldo P., Zunino R., Heynderickx I., and Vicario E. 2002. Circular backpropagation networks for measuring displayed image quality. In: Proceedings of ICANN 2002. LNCS, vol. 2415, pp. 1219-1224. Springer, Heidelberg.
[19] Stosic D., Zanchettin C., Ludermir T., and Stosic B. 2016. QRNN: q-generalized random neural network. IEEE Transactions on Neural Networks and Learning Systems, 1-8.
[20] Atsawaraungsuk S. 2016. q-Gaussian activation function Circular Extreme Learning Machine for classification problems. In: International Conference on Information Technology and Electrical Engineering (ICITEE) 2016, pp. 125-129, IEEE, Yogyakarta, Indonesia.