Deep convolutional neural networks with transfer learning for estimation of Nile Tilapia mass
Keywords:
Computer vision, Deep learning, Mass estimation, Nile Tilapia, Transfer learning

Abstract
Importance of the work: Fish mass is one of the important traits for selective breeding. Acquiring the mass of an individual live fish directly with a digital balance is sensitive to vibration caused by the respiratory movements of the fish being measured. This time-consuming and labor-intensive process also has adverse effects on sensitive, stress-prone fish.
Objectives: To develop deep convolutional neural networks (ConvNets) with transfer learning to estimate the mass of Nile Tilapia from the image of the fish.
Materials & Methods: In total, 3,832 images were captured and individually paired with measured mass values to create the dataset. The dataset was divided into three groups: training, validation and testing. Several state-of-the-art ConvNets (AlexNet, GoogLeNet, VGG-16, VGG-19, InceptionV3, InceptionResNetV2, NASNetMobile and NASNetLarge) were modified to estimate the mass of the samples.
Results: The modified VGG-19 model provided the lowest values for the root mean square error (3.59 g), mean absolute error (2.27 g), mean relative error (0.05%) and mean absolute percentage error (4.09%), along with the highest coefficient of determination (0.99). However, reflection from the water film in the background had a negative impact on the mass estimation. The processing times per image on the central processing unit and the graphics processing unit were 0.177 s and 0.053 s, respectively.
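The error metrics reported above are standard regression measures and can be computed directly from paired true and predicted masses. A minimal sketch with NumPy, using small illustrative arrays rather than the paper's data:

```python
import numpy as np

# Illustrative evaluation metrics for mass regression (all masses in grams).
def rmse(y, yhat):
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def mae(y, yhat):
    return float(np.mean(np.abs(y - yhat)))

def mape(y, yhat):
    # Mean absolute percentage error, in percent
    return float(np.mean(np.abs((y - yhat) / y)) * 100)

def r2(y, yhat):
    # Coefficient of determination: 1 minus residual over total sum of squares
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return float(1 - ss_res / ss_tot)

# Hypothetical true and predicted fish masses
y_true = np.array([50.0, 80.0, 120.0, 200.0])
y_pred = np.array([52.0, 78.0, 125.0, 195.0])
scores = (rmse(y_true, y_pred), mae(y_true, y_pred),
          mape(y_true, y_pred), r2(y_true, y_pred))
```

Lower RMSE, MAE and MAPE indicate better accuracy, while R² closer to 1 indicates a better fit, which is why the paper reports the lowest error values together with the highest R².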
Main finding: The modified VGG-19 model was suitable for real-time mass estimation of Nile Tilapia for timely selection of the best broodstocks.
Copyright (c) 2022 Kasetsart University. Online ISSN 2452-316X, print ISSN 2468-1458. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/); production and hosting by the Kasetsart University Research and Development Institute on behalf of Kasetsart University.