Automatic estimation of heading date of paddy rice using deep learning
Abstract
Accurate estimation of the heading date of paddy rice greatly helps breeders understand the adaptability of different crop varieties in a given location. The heading date also plays a vital role in determining grain yield in research experiments. Visual examination of the crop is laborious and time-consuming, so a quick and precise method for estimating the heading date of paddy rice is essential. In this work, we propose a simple pipeline to detect regions containing flowering panicles in ground-level RGB images of paddy rice. Given a fixed region size for an image, the number of regions containing flowering panicles is directly proportional to the number of flowering panicles present. Consequently, we use the flowering-panicle region counts to estimate the heading date of the crop. The method is based on image classification using Convolutional Neural Networks. We evaluated the performance of our algorithm on five time-series image sequences of three different rice varieties. Compared to previous work on this dataset, the accuracy and general versatility of the method have been improved, and the heading date has been estimated with a mean absolute error of less than 1 day. An efficient heading date estimation method is thus described for rice crops using time-series RGB images acquired under natural field conditions. This study demonstrates that our method can reliably be used as a replacement for manual observation to detect the heading date of rice crops.
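The abstract does not include code, but the tile-and-classify counting idea it describes can be illustrated with a minimal sketch. The snippet below assumes a binary (flowering / non-flowering) patch classifier fine-tuned from an ImageNet-pretrained ResNet-50; the patch size, class indices, weight file, and the helper `count_flowering_regions` are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the patch-counting idea, not the authors' code.
# Assumptions: binary patch classifier (ResNet-50 backbone), 224x224 patches,
# weights saved at an assumed path, one RGB field image on disk.
import torch
import torchvision.transforms as T
from torchvision import models
from PIL import Image

PATCH = 224  # assumed fixed region size


def load_classifier(weights_path: str) -> torch.nn.Module:
    model = models.resnet50()
    model.fc = torch.nn.Linear(model.fc.in_features, 2)  # 2 classes: flowering / background
    model.load_state_dict(torch.load(weights_path, map_location="cpu"))
    return model.eval()


def count_flowering_regions(image_path: str, model: torch.nn.Module) -> int:
    """Tile the image into fixed-size patches and count those classified as flowering."""
    tfm = T.Compose([
        T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])
    img = Image.open(image_path).convert("RGB")
    w, h = img.size
    count = 0
    with torch.no_grad():
        for top in range(0, h - PATCH + 1, PATCH):
            for left in range(0, w - PATCH + 1, PATCH):
                patch = img.crop((left, top, left + PATCH, top + PATCH))
                logits = model(tfm(patch).unsqueeze(0))
                if logits.argmax(dim=1).item() == 1:  # assumed: class 1 = flowering panicles
                    count += 1
    return count
```

Running such a counter over the time-series images yields one count per date; the heading date could then be read off the resulting curve (for example, the first date on which the count crosses a chosen threshold), though the specific criterion used by the authors is described in the full paper rather than the abstract.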