Field Evaluation of a YOLOv8-Based Drone Video Prototype for Real-Time Tiger Detection and Early Warning

Authors

  • Rangga Rafandi, Universitas Pembangunan Panca Budi, Indonesia
  • Aisyah Nabilla, Universitas Pembangunan Panca Budi, Indonesia
  • Dwi Azzahra Siregar, Universitas Pembangunan Panca Budi, Indonesia
  • Eko Hariyanto, Universitas Pembangunan Panca Budi, Indonesia

DOI:

10.33395/sinkron.v10i2.16146

Keywords:

real-time tiger detection, drone, YOLOv8, early warning, computer vision

Abstract

Human–tiger conflict in plantation landscapes remains a critical safety and conservation issue because encounters between workers and tigers endanger humans while increasing pressure on endangered tiger populations. This study designs and conducts a baseline field evaluation of a YOLOv8-based drone video prototype for real-time tiger detection and early warning. The prototype integrates drone-based RGB video acquisition, wireless video transmission, edge-based visual inference, detection logging, and warning output into a single workflow. The study combined a systems engineering approach with applied experimentation. The YOLOv8 model was trained on annotated tiger image data and then integrated into the prototype. Field testing was conducted in an open-field baseline scenario using six tiger replicas under two lighting conditions, daytime and evening, to support safety, ethical control, and experimental consistency. System performance was evaluated using precision, recall, F1-score, false negatives, detection range, confidence score, time-to-first-alert, and bounding-box stability. The results show that the prototype performed better during daytime testing, achieving 96.90% precision, 85.62% recall, a 90.91% F1-score, a 35 m maximum detection range, 0.60–0.75 average confidence, and a time-to-first-alert of less than 1 s. In evening testing, performance decreased to 93.57% precision, 55.36% recall, a 69.57% F1-score, a 7 m maximum range, 0.40–0.55 average confidence, and a 1.8–2.5 s time-to-first-alert. These findings indicate that the prototype provides an initial technical basis for drone-based early warning, but further validation is required using real tiger data, complex plantation environments, higher occlusion levels, and improved low-light sensing before operational deployment can be claimed.
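As a quick consistency check on the reported metrics, the F1-scores in the abstract can be reproduced from the stated precision and recall via the standard harmonic-mean formula. This is a minimal sketch using the abstract's numbers; the function name `f1_score` is ours, not part of the prototype:

```python
# Recompute the abstract's F1-scores from the reported precision and recall.
# F1 is the harmonic mean of precision (P) and recall (R): 2*P*R / (P + R).

def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

day_f1 = f1_score(0.9690, 0.8562)      # daytime scenario
evening_f1 = f1_score(0.9357, 0.5536)  # evening scenario

print(f"daytime F1: {day_f1:.2%}")     # ~90.91%, matching the reported value
print(f"evening F1: {evening_f1:.2%}") # ~69.56%, vs. the reported 69.57% (rounding)
```

Both values agree with the reported F1-scores to within rounding, confirming the precision/recall/F1 triples are internally consistent.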


References

Aliane, N. (2025). Drones and AI-Driven Solutions for Wildlife Monitoring. Drones, 9(7). https://doi.org/10.3390/drones9070455

AlZubi, A. A., & Alkhanifer, A. (2024). Application of Machine Learning in Drone Technology for Tracking of Tigers. Indian Journal of Animal Research, 58(9), 1614–1621. https://doi.org/10.18805/IJAR.BF-1759

Axford, D., Sohel, F., Vanderklift, M. A., & Hodgson, A. J. (2024). Collectively advancing deep learning for animal detection in drone imagery: Successes, challenges, and research gaps. Ecological Informatics, 83, 102842. https://doi.org/10.1016/j.ecoinf.2024.102842

Chen, L., Li, G., Zhang, S., Mao, W., & Zhang, M. (2024). YOLO-SAG: An improved wildlife object detection algorithm based on YOLOv8n. Ecological Informatics, 83, 102791. https://doi.org/10.1016/j.ecoinf.2024.102791

Figel, J. J., Safriansyah, R., Baabud, S. F., & Herman, Z. (2023). Clustered Conflicts in Disturbed Lowlands Characterize Human–tiger Interactions in Aceh, Indonesia. Wildlife Letters, 1(2), 83–91. https://doi.org/10.1002/wll2.12016

Goodrich, J., Wibisono, H., Miquelle, D., Lynam, A. J., Sanderson, E., Chapman, S., Gray, T. N. E., Chanchani, P., & Harihar, A. (2022). Panthera tigris: The IUCN Red List of Threatened Species. https://doi.org/10.2305/IUCN.UK.2022

Hariyanto, E., Iqbal, M., Siahaan, A. P. U., Saragih, K. S., & Batubara, S. (2019). Comparative Study of Tiger Identification Using Template Matching Approach based on Edge Patterns. Journal of Physics: Conference Series, 1196(1), 012025. https://doi.org/10.1088/1742-6596/1196/1/012025

Iglay, R. B., Jones, L. R., Elmore, J. A., Evans, K. O., Samiappan, S., Pfeiffer, M. B., & Blackwell, B. F. (2024). Wildlife monitoring with drones: A survey of end users. Wildlife Society Bulletin, 48(3), e1533. https://doi.org/10.1002/wsb.1533

Li, S., Li, J., Tang, H., Qian, R., & Lin, W. (2020). ATRW: A Benchmark for Amur Tiger Re-identification in the Wild. Proceedings of the 28th ACM International Conference on Multimedia, MM ’20, 2590–2598. https://doi.org/10.1145/3394171.3413569

Ma, Y., Tan, M., Liu, X., Zhang, Y., Xu, Z., Sun, W., Ge, J., & Feng, L. (2025). Deep learning for Amur tiger re-identification in camera traps: A tool assisting population monitoring and spatio-temporal analysis. Ecological Indicators, 171, 113227. https://doi.org/10.1016/j.ecolind.2025.113227

Neo, W. H. Y., Lubis, M. I., & Lee, J. S. H. (2022). Settlements and Plantations Are Sites of Human–tiger Interactions in Riau, Indonesia. Oryx, 57(4), 476–480. https://doi.org/10.1017/s0030605322000667

Norouzzadeh, M. S., Nguyen, A., Kosmala, M., Swanson, A., Palmer, M. S., Packer, C., & Clune, J. (2018). Automatically identifying, counting, and describing wild animals in camera-trap images with deep learning. Proceedings of the National Academy of Sciences, 115(25), E5716–E5725. https://doi.org/10.1073/pnas.1719367115

Patana, P., Alikodra, H. S., Mawengkang, H., & Harahap, R. H. (2023). State of Human Tiger Conflict Around Gunung Leuser National Park in Langkat Landscape, North Sumatra, Indonesia. Biodiversitas Journal of Biological Diversity, 24(2). https://doi.org/10.13057/biodiv/d240220

Redmon, J., Divvala, S., Girshick, R., & Farhadi, A. (2016). You Only Look Once: Unified, Real-Time Object Detection. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 779–788. https://doi.org/10.1109/CVPR.2016.91

Roy, A. M., Bhaduri, J., Kumar, T., & Raj, K. (2022). A computer vision-based object localization model for endangered wildlife detection. Ecological Economics, Forthcoming. https://doi.org/10.2139/ssrn.4315295

Schneider, S., Taylor, G. W., Linquist, S., & Kremer, S. C. (2019). Past, present and future approaches using computer vision for animal re-identification from camera trap data. Methods in Ecology and Evolution, 10(4), 461–470. https://doi.org/10.1111/2041-210X.13133

Shi, C., Liu, D., Cui, Y., Xie, J., Roberts, N. J., & Jiang, G. (2020). Amur tiger stripes: Individual identification based on deep convolutional neural network. Integrative Zoology, 15(6), 461–470. https://doi.org/10.1111/1749-4877.12453

Szeliski, R. (2022). Computer Vision: Algorithms and Applications (2nd ed.). Springer.

Wang, T., Ma, B., Zhao, X., Mou, C., & Fan, J. (2025). Pose-Guided Re-Identification of Amur Tigers Under Wild Environmental Constraints. IET Image Processing, 19(1), e70160. https://doi.org/10.1049/ipr2.70160

Widiastuti, G. (2016). Model Pengembangan Desa Penyangga Berbasis Kearifan Lokal Sebagai Upaya Penurunan Frekuensi Konflik Manusia Dan Harimau Sumatera Di Taman Nasional Bukit Barisan Selatan (TNBBS) [A local-wisdom-based buffer village development model to reduce the frequency of human–Sumatran tiger conflict in Bukit Barisan Selatan National Park (TNBBS)]. Universitas Lampung.

Willi, M., Pitman, R. T., Cardoso, A. W., Locke, C., Swanson, A., Boyer, A., Veldthuis, M., & Fortson, L. (2019). Identifying animal species in camera trap images using deep learning and citizen science. Methods in Ecology and Evolution, 10(1), 80–91. https://doi.org/10.1111/2041-210X.13099

Wu, L., Jinma, Y., Wang, X., Yang, F., Xu, F., Cui, X., & Sun, Q. (2024). Amur Tiger Individual Identification Based on the Improved InceptionResNetV2. Animals, 14(16). https://doi.org/10.3390/ani14162312

Zhai, X., Huang, Z., Li, T., Liu, H., & Wang, S. (2023). YOLO-Drone: An Optimized YOLOv8 Network for Tiny UAV Object Detection. Electronics (Switzerland), 12(17). https://doi.org/10.3390/electronics12173664


How to Cite

Rafandi, R., Nabilla, A., Siregar, D. A., & Hariyanto, E. (2026). Field Evaluation of a YOLOv8-Based Drone Video Prototype for Real-Time Tiger Detection and Early Warning. Sinkron: Jurnal Dan Penelitian Teknik Informatika, 10(2), 1231–1242. https://doi.org/10.33395/sinkron.v10i2.16146