By Dusty Sonnenberg, CCA, Ohio Field Leader: a project of the Ohio Soybean Council and soybean checkoff.

Photo Credit: Christopher Wiegman, The Ohio State University

In precision agriculture, as in life, having an idea of how something could work is one thing; actually making the idea work is another. “Chris Wiegman, a PhD student in our department, working with students from electrical and computer engineering at Ohio State, set to work using a Graphics Processing Unit (GPU) with several cameras to try to make this all happen,” said Dr. Scott Shearer, Professor and Chair of the Department of Food, Agricultural and Biological Engineering at The Ohio State University. Shearer is working with Dr. Mark Loux, Extension Professor in Horticulture and Crop Science, also at Ohio State. “The GPU we are using measures about 4”x4”x1” and costs about $100. We also mounted multiple small cameras in a shell with the GPU, suspended from the stinger on the drone. Each camera only cost about $27, so the entire set-up is under $500. We then train the GPU to analyze these images. The image throughput rate for this GPU is up to 38 images per second. As we fly, we are taking multiple images, and those images can be processed at a rate that is near real-time. The drone we have has a flight time of 15 minutes. We estimate we can cover 40-100 acres in a single flight, depending on the elevation and the type of resolution needed.”
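The figures Shearer quotes can be sanity-checked with some back-of-the-envelope arithmetic. The costs and rates below come from the article; the function names and the five-camera configuration are illustrative assumptions, not details from the project.

```python
# Figures quoted in the article; everything else is simple arithmetic.
GPU_COST = 100          # dollars, ~4"x4"x1" GPU module
CAMERA_COST = 27        # dollars per small camera
IMAGES_PER_SEC = 38     # GPU image throughput
FLIGHT_MINUTES = 15     # drone flight time

def rig_cost(num_cameras):
    """Total hardware cost for the GPU plus a given number of cameras."""
    return GPU_COST + CAMERA_COST * num_cameras

def max_images_per_flight():
    """Upper bound on images processed in one flight at full throughput."""
    return IMAGES_PER_SEC * FLIGHT_MINUTES * 60

print(rig_cost(5))              # 5 cameras -> $235, well under the $500 cap
print(max_images_per_flight())  # 34200 images in one 15-minute flight
```

Even a dozen cameras stays under the $500 figure Shearer cites, and at full throughput the GPU could in principle process tens of thousands of frames in a single flight.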

Putting all the pieces together is where the actionable information becomes real for farmers. All the images taken are geo-tagged (they have GPS-referenced coordinates), so the areas of concern identified by the GPU can be mapped and returned to accurately for treatment. “We are using all this technology, flying over the crop, and using the artificial intelligence to tell us if we have glyphosate-resistant weeds that have escaped.

Photo Credit: Christopher Wiegman, The Ohio State University

The hope is once a map is built, we can program a drone to go back and fly with this limited amount of spray mixture and treat those weed escapes,” said Shearer. “The hope is that if we can do it early enough before harvest, that sets the farmer up for success the following year to still be able to plant glyphosate-resistant crops. This is one way we could extend the life of herbicide trait technologies. It is not the silver bullet, but it is another tool for farmers. This really represents some of the first tangible results of what we can actually do with UAVs to affect a farmer’s bottom line.”
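The mapping step Shearer describes can be sketched in a few lines: geo-tagged detections come back from the A.I., and the confident weed escapes are collected into waypoints a spray drone could revisit. The field names, confidence threshold, and sample coordinates here are invented for illustration; this is not the project’s code.

```python
# Hypothetical sketch: turn geo-tagged A.I. detections into a treatment map.
def build_treatment_map(detections, confidence_threshold=0.8):
    """Keep only confident weed-escape detections; return their GPS waypoints."""
    waypoints = []
    for d in detections:
        if d["label"] == "weed_escape" and d["confidence"] >= confidence_threshold:
            waypoints.append((d["lat"], d["lon"]))
    return waypoints

# Invented sample detections with GPS-referenced coordinates.
detections = [
    {"label": "weed_escape",  "confidence": 0.93, "lat": 40.0012, "lon": -83.0302},
    {"label": "clean_canopy", "confidence": 0.97, "lat": 40.0013, "lon": -83.0305},
    {"label": "weed_escape",  "confidence": 0.55, "lat": 40.0015, "lon": -83.0309},
]
print(build_treatment_map(detections))  # one confident escape to fly back to
```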

As the technology improves, there will still be a need for human interaction. “An important piece to all of this is having a knowledgeable person, most likely a certified crop advisor, to help expand our capabilities with respect to A.I.,” said Shearer. “Using a stinger on a drone with the GPU and cameras, we are able to get into the crop canopy and differentiate about 8-10 different crop stresses. Prior to this, many of those stresses could not be identified until they reached the top of the plant. By that time, it was often too late, and it was more of a postmortem diagnosis,” said Shearer. “Using nitrogen in corn as an example, previously we would only detect areas of the field where the upper portion of the canopy did not have the chlorophyll or the vigor, and we would know it was an N deficiency, and that was why that area did not yield as well. By being able to identify the issues early, when they begin to show up at the bottom of the plant within the lower canopy, we can potentially have time to go back in and take corrective action to benefit that crop.”

The goal of this research is to help farmers better manage cropping issues in an efficient way that is time sensitive and cost effective. “We want to expand the number of tools that farmers have at their disposal to address the issues they face,” said Shearer. “In the case of weed escapes, we are mitigating crop loss in the out years. Spending a little money in 2020 may save much more in 2021. We need to make sure the business model works and that the application of this technology returns the needed value to the farmer.”

As more experience is gained with the technology, and more is learned about its capabilities, more possibilities are arising. The technology being used has become more readily available, and its cost has decreased with time. “The military term used for the technology being employed is C.O.T.S. (commercial off the shelf), meaning the various components can be purchased from many retail technology stores,” said Shearer. The ability to make everything work together and achieve the desired results is where the human interaction comes into play. “With some really talented students and some coding, they make the systems work,” said Shearer. “The technology is readily available and the cost has come down. We are now getting closer to a cost structure that makes sense to agriculture and fits a business model that returns value to the farmer.”

The human interaction is important in classifying the data sets. “We are collecting data sets of 10,000 images, and A.I. works pretty well there,” said Shearer. “But someone needs to look at and classify each image and catalogue those as to what the specific issues are. We are building libraries of these classified images. While the technology costs have become very modest, the majority of the expense will be in the human interaction to classify the images. Once the images are classified and organized in libraries, they can be used to retrain the neural network classifiers that are a part of the A.I. systems.”
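The project retrains neural network classifiers from human-labeled image libraries. As a much simpler stand-in for that retraining loop, the sketch below “retrains” a nearest-centroid classifier from a labeled library of toy feature vectors; the labels, vectors, and two-feature representation are invented for illustration and are far simpler than the real imagery.

```python
# Stand-in for retraining a classifier from a human-labeled library:
# a nearest-centroid classifier over toy two-number feature vectors.
from math import dist

def train(library):
    """Compute one centroid per label from the classified image library."""
    centroids = {}
    for label, vectors in library.items():
        n = len(vectors)
        centroids[label] = tuple(sum(v[i] for v in vectors) / n
                                 for i in range(len(vectors[0])))
    return centroids

def classify(centroids, vector):
    """Assign the label of the nearest centroid."""
    return min(centroids, key=lambda label: dist(centroids[label], vector))

# Invented labeled library; in the project these would be classified images.
library = {
    "nitrogen_deficiency": [(0.2, 0.9), (0.3, 0.8)],
    "healthy":             [(0.9, 0.1), (0.8, 0.2)],
}
centroids = train(library)
print(classify(centroids, (0.25, 0.85)))  # nitrogen_deficiency
```

Each time humans classify more images into the library, re-running `train` folds the new examples into the classifier, which is the essence of the retraining cycle Shearer describes.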

Dr. Scott Shearer

Finding additional applications that can provide actionable information to the farmer in a cost-effective way is the next goal. “A question I ask is: ‘At what stages do we need to fly a crop to help farmers make decisions about how they are going to manage inputs?’” said Shearer. “One of the first questions that farmers across the state ask is: ‘When I planted the crop, did I get a good enough stand, or do I need to go back and replant? And do I need to replant the entire field or only portions?’ We can retrain drones with an imaging system to recognize just about anything that we want.”

“One of the things we will be looking at in the future is whether we can help a farmer by mapping the crop stand after planting, and help him make his replant decisions. If we fly and map a crop stand after planting, we can use the technology to help a farmer make an informed decision regarding the emerged population and whether replanting is necessary. We can map the field and identify those areas that a farmer needs to replant. The UAV has a better view of the field, and the technology is improving to help identify if a field needs to be replanted, and where those areas are,” said Shearer.
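The replant decision Shearer outlines amounts to comparing the mapped emerged population in each part of the field against a target stand. The zone names, populations, and the 70% replant cutoff below are illustrative assumptions, not agronomic recommendations or project code.

```python
# Hypothetical sketch: flag field zones whose emerged stand (plants/acre,
# from a drone-built stand map) falls below a fraction of the target.
def replant_zones(stand_by_zone, target_population, replant_fraction=0.7):
    """Return the zones whose emerged stand is below the replant cutoff."""
    cutoff = target_population * replant_fraction
    return [zone for zone, stand in stand_by_zone.items() if stand < cutoff]

# Invented emerged populations by field zone.
stand_by_zone = {"NW": 98_000, "NE": 140_000, "SW": 65_000, "SE": 150_000}
print(replant_zones(stand_by_zone, target_population=150_000))  # ['NW', 'SW']
```

Only the flagged zones would need a replant pass, which is the partial-field answer to the “entire field or only portions” question Shearer raises.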

“If we can look all through the growing season and identify stage gates where farmers have to make decisions, and then we can bring this A.I. to bear, I think it changes how we look at UAVs and the value they can bring to agriculture,” said Shearer.
