AUTOMATIC FISH IDENTIFICATION USING SINGLE SHOT DETECTOR
Keywords: Fish Detection, SSD, Bengkulu, Deep Learning, Sorting Machine
Fish identification is the classification of fish based on distinctive characteristics, such as body shape, body pattern, color, or other features. For effective fish handling, computer-assisted identification of marine fish genera is needed so that identification can be performed automatically. Deep learning is an analytical approach for analyzing large datasets in greater depth, and it has produced the best results over the last 10 years. This study uses one deep learning model, the Single Shot Detector (SSD), a relatively simple object detection algorithm, combined with the MobileNet architecture. The study identified 10 genera of marine fish using a dataset of 1,000 images, split into 90% training data and 10% validation data; each genus has 100 images taken from different shooting angles and against different backgrounds. The results show that the SSD model with the MobileNet architecture achieved an accuracy of 52.48% for identifying the 10 marine fish genera.
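The 90/10 train/validation split described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' actual pipeline; the file naming scheme and the use of a fixed random seed are assumptions added for reproducibility.

```python
import random

def split_dataset(image_paths, train_fraction=0.9, seed=42):
    """Shuffle image paths reproducibly and split them into
    training and validation subsets."""
    paths = list(image_paths)
    random.Random(seed).shuffle(paths)  # seeded shuffle for repeatability
    cut = int(len(paths) * train_fraction)
    return paths[:cut], paths[cut:]

# 10 genera x 100 images each = 1,000 images, as in the study.
# The path pattern below is hypothetical.
images = [f"genus_{g:02d}/img_{i:03d}.jpg"
          for g in range(10) for i in range(100)]
train, val = split_dataset(images)
print(len(train), len(val))  # 900 100
```

Shuffling before slicing ensures each genus is represented in both subsets in roughly its overall proportion, rather than whole genera falling into only one split.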
Copyright (c) 2022 Arie Vatresia, Ruvita Faurina, Vivin Purnamasari, Indra Agustian
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
Authors who publish with this journal agree to the following terms:
a. Authors retain copyright and grant the journal the right of first publication, with the work simultaneously licensed under a Creative Commons Attribution-ShareAlike License that allows others to share the work with an acknowledgment of the work's authorship and its initial publication in this journal.
b. Authors are able to enter into separate, additional contractual arrangements for the non-exclusive distribution of the journal's published version of the work (e.g., post it to an institutional repository or publish it in a book), with an acknowledgment of its initial publication in this journal.
c. Authors are permitted and encouraged to post their work online (e.g., in institutional repositories or on their website) prior to and during the submission process, as it can lead to productive exchanges, as well as earlier and greater citation of published work.
All articles published Open Access will be immediately and permanently free for everyone to read and download. We are continuously working with our author communities to select the best choice of license options, currently defined for this journal as the Creative Commons Attribution-ShareAlike (CC BY-SA) license.