Posted on Mar 07, 2023 03:00:34

Researchers in China and Singapore recently developed an approach to replace manual rice counting

Photo: Collected 

Rice, a major food crop, is cultivated on about 162 million hectares of land worldwide. One of the most widely used methods for estimating paddy production is counting rice plants, a technique used to estimate yield, diagnose growth and evaluate damage in paddy fields. Most rice counting around the world is still done manually. However, manual counting is exhausting, laborious and time-consuming, which points to the need for a faster, more efficient machine-based solution.

Researchers from China and Singapore have recently developed a method to replace manual rice counting with a far more sophisticated approach involving unmanned aerial vehicles (UAVs), or drones.

According to Professor Jianguo Yao from Nanjing University of Posts and Telecommunications in China, who led the study, "The new technique uses UAVs to capture RGB images—images composed primarily of red, green, and blue light—of the paddy field. These images are then processed using a deep learning network that we have developed, called RiceNet, which can accurately identify the density of rice plants in the field, as well as provide higher-level semantic features, such as crop location and size."

Their paper has been published in Plant Phenomics.

The RiceNet network architecture consists of one feature extractor, at the front end, that analyzes the input images, and three feature decoder modules responsible, respectively, for estimating the density, location and size of the plants in the paddy field. The latter two features are particularly important for future research on automated crop management techniques, such as fertilizer spraying.
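The paper itself gives the full architecture; purely to illustrate this one-extractor, three-decoder layout, here is a minimal PyTorch sketch. The layer sizes and head designs below are invented for the example and are not taken from RiceNet.

```python
import torch
import torch.nn as nn

class RiceNetSketch(nn.Module):
    """Illustrative one-extractor, three-decoder layout (layer choices invented)."""
    def __init__(self):
        super().__init__()
        # Shared front-end feature extractor that analyzes the input RGB image.
        self.extractor = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
        )
        # Three decoder heads: plant density, plant location, plant size.
        self.density_head = nn.Conv2d(64, 1, kernel_size=1)   # per-pixel density
        self.location_head = nn.Conv2d(64, 1, kernel_size=1)  # plant-center confidence
        self.size_head = nn.Conv2d(64, 1, kernel_size=1)      # per-plant size estimate

    def forward(self, rgb):
        features = self.extractor(rgb)
        return (self.density_head(features),
                self.location_head(features),
                self.size_head(features))

# Example: one 256x256 RGB patch in, three spatial maps out.
model = RiceNetSketch()
density, location, size = model(torch.randn(1, 3, 256, 256))
print(density.shape, location.shape, size.shape)  # each torch.Size([1, 1, 256, 256])
```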

As a part of the study, the research team deployed a camera-equipped UAV over rice fields in the Chinese city of Nanchang and subsequently analyzed the acquired data using a sophisticated image analysis technique. Next, the researchers employed a training dataset and a test dataset. The former was used as a reference to train the system and the latter was used to validate the analytical findings.

More specifically, out of the 355 images with 257,793 manually labeled points, 246 were randomly selected and used as training images, whereas the remaining 109 were used as test images. Each image contained an average of 726 rice plants.
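To make that split concrete, here is a minimal Python sketch of such a random 246/109 partition; the file names are hypothetical placeholders, not the study's actual data.

```python
import random

# Hypothetical identifiers for the 355 labeled images described in the study.
image_ids = [f"paddy_{i:03d}.png" for i in range(355)]

random.seed(42)            # fixed seed so the split is reproducible
random.shuffle(image_ids)  # random selection, as in the study

train_ids = image_ids[:246]  # 246 randomly selected training images
test_ids = image_ids[246:]   # remaining 109 test images
print(len(train_ids), len(test_ids))  # 246 109
```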

According to the team, the RiceNet technique used for image analysis has a good signal-to-noise ratio. In other words, it is able to efficiently distinguish rice plants from the background, thus improving the quality of the generated plant density maps.
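The article does not spell out how RiceNet's density maps are built; as background, counting networks commonly turn point annotations into density maps by placing a unit impulse at each labeled plant and smoothing with a Gaussian kernel, so that the map's sum approximates the plant count. A minimal sketch of that standard construction, assuming SciPy is available:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def density_map_from_points(points, height, width, sigma=4.0):
    """Place a unit impulse at each labeled plant, then blur with a Gaussian.
    Summing the resulting map approximately recovers the plant count."""
    impulse = np.zeros((height, width), dtype=np.float32)
    for x, y in points:
        impulse[int(y), int(x)] += 1.0
    return gaussian_filter(impulse, sigma=sigma)

# Hypothetical annotations: three plant centers in a 64x64 patch.
points = [(10, 12), (30, 40), (50, 20)]
dmap = density_map_from_points(points, 64, 64)
print(round(float(dmap.sum()), 2))  # ~3.0, the number of labeled plants
```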

The results of the study showed that the mean absolute error and root mean square error of the RiceNet technique were 8.6 and 11.2, respectively. In other words, the density maps generated using RiceNet were in good agreement with those generated using manual methods.
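For reference, both metrics compare per-image predicted counts against the manual reference counts; the sketch below uses hypothetical numbers, not the study's data.

```python
import numpy as np

def mae_rmse(predicted, manual):
    """Mean absolute error and root mean square error between per-image
    predicted plant counts and manual reference counts."""
    diff = np.asarray(predicted, dtype=float) - np.asarray(manual, dtype=float)
    mae = np.abs(diff).mean()
    rmse = np.sqrt((diff ** 2).mean())
    return mae, rmse

# Hypothetical per-image counts: model estimate vs. manual reference.
predicted = [712, 731, 705, 748]
manual = [720, 726, 698, 735]
print(mae_rmse(predicted, manual))  # (8.25, ~8.76); the paper reports 8.6 and 11.2
```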

Moreover, based on their observations, the team also shared a few key recommendations. For instance, the team does not recommend acquiring images on rainy days. It also suggests collecting UAV-based images within four hours of sunrise, so as to minimize fog as well as rice leaf curling, both of which adversely affect output quality.

"In addition to this, we further validated the performance of our technique using two other popular crop datasets. The results showed that our method significantly outperforms other state-of-the-art techniques. This underscores the potential of RiceNet to replace the traditional method of manual rice counting," concludes Professor Yao.

RiceNet further paves the way toward other UAV- and deep learning-based crop analysis techniques, which can in turn guide decisions and strategies to improve the production of food and cash crops worldwide.

Source:
Online/GFMM
