Fungus Classification in Peanuts from Smart Lens Imagery Using Convolutional Neural Network


Kwankamon Dittakan
Jirawat Thaenthong
Sulakkana Rodkuen
Phutphisit Thungklang

Abstract

Classification of fungi in peanuts remains a critical challenge because fungi are microscopic and require specialized inspection methods. Without proper classification tools, consumers risk eating contaminated peanuts, which can cause severe health effects, particularly in people with fungal allergies. Traditional approaches, such as microscope examination or visual inspection by experts, are impractical because the instruments are bulky and expensive, the process is time-consuming, and human error is possible. This research addresses these limitations by proposing an efficient method for classifying fungi in peanuts using convolutional neural networks (CNNs) and image processing techniques. The system uses a portable smart lens, an imaging device with high magnification (up to 50x), to capture detailed peanut images and pairs it with three CNN architectures: MobileNetV2, DenseNet121, and NASNetMobile. The experiments identified optimal parameters for each peanut type: for ground peanuts, the system achieved 97.13% accuracy using 500x500-pixel grayscale images, and for peanut seeds it achieved 96.09% accuracy using 500x500-pixel RGB color images. This approach offers a practical, portable, and cost-effective solution for reliable fungal classification in peanuts.
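The abstract pairs smart-lens images with three pretrained CNN architectures at a 500x500 input size. The sketch below is a minimal, illustrative example (not the authors' exact pipeline) of how one of those backbones, MobileNetV2, could be set up in Keras as a transfer-learning classifier on 500x500 RGB peanut images; the dataset directory, class count, and training settings are assumptions for illustration only.

```python
# Minimal transfer-learning sketch: MobileNetV2 backbone on 500x500 RGB images.
# Assumes a hypothetical folder layout peanut_images/<class_name>/*.jpg.
import tensorflow as tf
from tensorflow.keras import layers, models

IMG_SIZE = (500, 500)   # image size reported in the abstract
NUM_CLASSES = 2         # assumed classes: fungus-contaminated vs. clean

# ImageNet-pretrained backbone, classification head removed.
base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet")
base.trainable = False  # freeze backbone weights for the initial training stage

model = models.Sequential([
    layers.Input(shape=IMG_SIZE + (3,)),
    layers.Rescaling(1.0 / 127.5, offset=-1),   # MobileNetV2 expects [-1, 1] inputs
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.2),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Load labelled smart-lens images from disk (integer labels by subfolder).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "peanut_images", image_size=IMG_SIZE, batch_size=16)
model.fit(train_ds, epochs=10)
```

The same pattern applies to the other two architectures by swapping the backbone for tf.keras.applications.DenseNet121 or tf.keras.applications.NASNetMobile; a grayscale variant would repeat the single channel to three channels before feeding the pretrained network.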

Article Details

How to Cite
Dittakan, K., Thaenthong, J., Rodkuen, S., & Thungklang, P. (2025). Fungus Classification in Peanuts from Smart Lens Imagery Using Convolutional Neural Network. CURRENT APPLIED SCIENCE AND TECHNOLOGY, e0265905. https://doi.org/10.55003/cast.2025.265905
Section
Original Research Articles
