Real-time early detection of weed plants in pulse crop field using drone with IoT
Abstract
The real-time detection of parthenium weed plants in a pulse crop field was carried out using a low-altitude flying drone. Fully convolutional semantic segmentation models have been shown to perform object segmentation accurately, but at high time complexity. In this research, the LinkNet model with a ResNet-34 encoder was used for real-time detection of weed plants from the video feed of a low-altitude flying drone. Experimental results show that LinkNet-34 can detect overlapping and irregularly shaped weed objects with a mean pixel accuracy of 0.86 and a mean IoU of 0.598 at 0.217 s per frame. Its processing speed was better than that of the LinkNet and U-Net models. The detected weed images were stitched together to create a weed site map, which is automatically uploaded to Google Cloud for further site analysis.
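The evaluation metrics reported above (mean pixel accuracy and mean IoU) can be computed from predicted and ground-truth segmentation masks as in the minimal sketch below. This is an illustrative implementation only, assuming integer-labelled masks (e.g. 0 = background, 1 = weed); the function names and labelling scheme are not from the paper.

```python
import numpy as np

def pixel_accuracy(pred, gt):
    # Fraction of pixels whose predicted class label matches ground truth.
    return float(np.mean(pred == gt))

def mean_iou(pred, gt, num_classes=2):
    # Per-class intersection-over-union, averaged over classes
    # that appear in either mask (classes absent from both are skipped).
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))
```

Averaging these per-image values over a test set gives the mean pixel accuracy and mean IoU figures quoted in the abstract.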
Article Details

This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.