Using a standard RGB camera and AI to capture vegetation data

A standard RGB camera attached to a drone, combined with deep learning artificial intelligence, can provide color maps of crop health, a new study shows. Photo: Diane Godwin.

Aerial photography is a valuable component of precision agriculture, providing farmers with important information about crop health and yield. The images are usually taken with an expensive multispectral camera attached to a drone. But a new study from the University of Illinois and Mississippi State University (MSU) shows that images from a standard red-green-blue (RGB) camera combined with deep learning artificial intelligence can provide equivalent crop prediction tools for a fraction of the cost.

Multispectral cameras provide color maps representing vegetation to help farmers monitor plant health and identify problem areas. Vegetation indices such as Normalized Difference Vegetation Index (NDVI) and Normalized Difference Red Edge Index (NDRE) show healthy areas as green, while problem areas are shown in red.
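Both indices follow the same normalized-difference formula: NDVI = (NIR − Red) / (NIR + Red), and NDRE swaps the red band for the red-edge band. A minimal sketch of how these indices are computed from per-pixel band values (the band arrays here are toy numbers, not data from the study):

```python
import numpy as np

def normalized_difference(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Generic normalized-difference index: (a - b) / (a + b), in [-1, 1]."""
    a = band_a.astype(float)
    b = band_b.astype(float)
    return (a - b) / np.clip(a + b, 1e-6, None)  # clip denominator to avoid divide-by-zero

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    # NDVI = (NIR - Red) / (NIR + Red); higher values indicate healthier canopy
    return normalized_difference(nir, red)

def ndre(nir: np.ndarray, red_edge: np.ndarray) -> np.ndarray:
    # NDRE = (NIR - RedEdge) / (NIR + RedEdge)
    return normalized_difference(nir, red_edge)

# Toy one-row "field": a healthy pixel (high NIR reflectance) next to a stressed one
nir = np.array([[0.60, 0.20]])
red = np.array([[0.10, 0.15]])
print(ndvi(nir, red))  # healthy pixel scores much higher than the stressed pixel
```

In a color map, high index values are rendered green and low values red, which is what makes problem areas jump out at a glance.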

“Typically, to do this you would need a near-infrared (NIR) camera that costs about $5,000. But we’ve demonstrated that we can train AI to create NDVI-like imagery using an RGB camera attached to a low-cost drone, and that significantly reduces the cost,” says Girish Chowdhary, associate professor in the Department of Agricultural and Biological Engineering at the U of I and co-author of the paper.

For this study, the research team collected aerial images of corn, soybean, and cotton fields at various stages of growth with both multispectral and RGB cameras. They used Pix2Pix, a neural network designed for image transformation, to translate the RGB images into NDVI and NDRE color maps with red and green areas. After first training the network with a large number of multispectral and normal images, they tested its ability to generate NDVI/NDRE images from another set of normal images.
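The study's network is Pix2Pix, a conditional GAN, which is too involved for a short sketch. But the core workflow — fit a model on paired (RGB, NDVI) data, then predict NDVI-like values for held-out RGB imagery — can be illustrated with a deliberately simplified per-pixel least-squares baseline. Everything below is a hypothetical stand-in: the data is synthetic (a hidden NIR band correlated with the green channel, as the study describes), and the linear model is not the authors' network.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_pixels(n: int):
    """Synthetic paired data: RGB pixels plus a hidden NIR band and true NDVI."""
    rgb = rng.uniform(0.05, 0.9, size=(n, 3))                    # columns: R, G, B
    nir = 0.1 + 0.8 * rgb[:, 1] + rng.normal(0, 0.02, n)         # NIR tracks green
    ndvi = (nir - rgb[:, 0]) / (nir + rgb[:, 0])                 # ground-truth index
    return rgb, ndvi

# "Training": least-squares fit from RGB (plus a bias term) to NDVI
rgb_train, ndvi_train = make_pixels(5000)
X_train = np.hstack([rgb_train, np.ones((len(rgb_train), 1))])
weights, *_ = np.linalg.lstsq(X_train, ndvi_train, rcond=None)

# "Inference": predict NDVI for held-out RGB pixels the model never saw
rgb_test, ndvi_test = make_pixels(1000)
ndvi_pred = np.hstack([rgb_test, np.ones((len(rgb_test), 1))]) @ weights

rmse = np.sqrt(np.mean((ndvi_pred - ndvi_test) ** 2))
print(f"held-out NDVI RMSE: {rmse:.3f}")
```

Pix2Pix replaces the linear map with a convolutional generator that also uses spatial context, and an adversarial discriminator that pushes the generated maps to be indistinguishable from real multispectral-derived ones — which is exactly what the expert panel later tested.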

“There’s a reflectance in the photos that indicates photosynthetic performance. Healthy vegetation reflects a little in the green channel and a lot in the near-infrared channel. We’ve built a network that can extract that signal from the green channel by training it against the NIR channel. That means we only need the green channel, along with contextual information from the red and blue channels,” explains Chowdhary.

To test the accuracy of the AI-generated images, the researchers asked a panel of crop experts to view side-by-side images of the same areas, one AI-generated and one taken with a multispectral camera. The experts indicated whether they could tell which was the true multispectral image and whether they noticed differences that would influence their decision making.

The experts found no observable differences between the two sets of images and indicated that they would make similar predictions from both. The research team also compared the images statistically, confirming that there were essentially no measurable differences between them.

Joby Czarnecki, associate research professor at MSU and co-author of the paper, cautions that this does not mean the two sets of images are identical.

“Although we cannot say that the images will provide the same information under all circumstances, for this particular issue they allow similar decisions. Near-infrared reflectance can be very critical for some plant decisions. However, in this particular case, it is exciting that our study shows you can replace an expensive technology with cheap artificial intelligence and come to the same decision,” Czarnecki explains.

The aerial view can provide information that is difficult to obtain from the ground. For example, areas of storm damage or nutrient deficiencies may not be easily visible at eye level, but can be easily identified from the air. Farmers with the proper authorizations can choose to fly their own drones or outsource to a private company to do so. Either way, color maps provide important crop health information needed for management decisions.

The AI software and processes used in the study are available for companies that want to implement them or expand their use by training the network on additional datasets.

“There is a lot of potential for AI to help reduce costs, which is a key driver for many applications in agriculture. If you can make a $600 drone more useful, then everyone can have access. And the information will help farmers improve yield and be better stewards of their land,” concludes Chowdhary.

The Department of Agricultural and Biological Engineering is located in the College of Agricultural, Consumer and Environmental Sciences and the Grainger College of Engineering at the University of Illinois.

The paper, “NDVI/NDRE prediction from standard RGB aerial imagery using deep learning,” is published in Computers and Electronics in Agriculture.

More information:
Corey Davidson et al, NDVI/NDRE prediction from standard RGB aerial imagery using deep learning, Computers and Electronics in Agriculture (2022). DOI: 10.1016/j.compag.2022.107396

Provided by the University of Illinois at Urbana-Champaign

Reference: Using a standard RGB camera and artificial intelligence to capture vegetation data (2023, March 11). Retrieved March 11, 2023.

This document is subject to copyright. Except for any fair dealing for purposes of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.
