ONTARIO GRAIN FARMER COVER STORY

But AI can also take new visual data and analyze it on the fly to help farmers (and, in some cases, breeders and researchers) in exciting new ways. That's the current angle in the work of Riley McConachie, a PhD student with Dr. Helen Booker in the Department of Plant Agriculture at the University of Guelph.

McConachie recently completed his master's degree, where he successfully demonstrated that AI can accurately count the number of wheat heads in an image. He also trained the AI to produce a Fusarium head blight (FHB) severity index. The associated app he created, called WheatScanR, was released in August as a free download.

McConachie has already won numerous awards for his hard work, but he's not stopping there. Now, in his PhD, he is retraining the AI to improve it, working to add a yield prediction capability, and more. The scope of the research includes using images taken by grain farmers who are part of the Great Lakes Yield Enhancement Network (YEN).

YEN participants have had to do head counts manually, as researchers and breeders do, which takes a long time and isn't highly accurate (results can vary from person to person). With McConachie's work, farmers will be able to simply take pictures, and the app will count the heads quickly and objectively.

LEARNING TO COUNT

To "train" any AI to do a task, in this case counting how many of a given item are present in an image, you follow the same steps you'd use to train a person. First, the AI must learn to recognize the item of interest.

"We put boxes around wheat heads in an image to single them out, and it learns that these are wheat heads through identifying their shape, features, and textures," McConachie explains. "About 70 per cent of total annotated data is devoted to that task, building a solid foundation. Then the AI uses another roughly 20 per cent of the data to apply what it learned to validation.
It will look at new images, do a head count for each image, and determine how accurate that is by comparing its results to the actual results, where the number of wheat heads in the same images has been tagged by me or someone else. It will then improve its own accuracy as needed by going back to the images where accuracy was not high."

The other 10 per cent of the data is used for "true testing." This occurs after training is essentially complete, and the AI is presented with new images with no results to compare against. (And in case you're wondering, the underlying AI, YOLO, is available free to anyone. Processing time for this project is purchased from a remote server through a plan, as that's much more economical than a university installing an appropriate server of its own.)

At this point, farmers can use WheatScanR to obtain a head count per area pictured, and they can also use those results to make their own yield predictions (using either existing images or new ones, all of which remain private). However, McConachie is developing the AI's capability to do early yield estimation of its own.

"Of course, there are many other things, like weather and disease, that can affect yield, so our yield estimator will be simple," he explains. "But we actually need to keep it simple for it to work well. I think we'll use three variables: the number of heads present, the number of kernels on the heads, and the historical test weight for the variety."

As mentioned, WheatScanR also already has an instantaneous measure of FHB severity, where the AI counts infected heads versus non-infected heads in the provided images. Yes, this severity index could also be used in the yield estimation, says McConachie, but you'd want to make sure it was very accurate before integrating it with a yield estimation tool that's already working well.
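The roughly 70/20/10 split McConachie describes (training, validation, and "true testing") can be sketched in a few lines of Python. The function name, directory labels, and seed below are illustrative assumptions, not part of WheatScanR:

```python
import random

def split_annotated_images(image_ids, seed=42):
    """Split annotated wheat-head images into the roughly 70/20/10
    train/validation/test proportions described in the article.
    (Hypothetical helper; not from the WheatScanR codebase.)"""
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    ids = list(image_ids)
    rng.shuffle(ids)
    n_train = int(len(ids) * 0.7)
    n_val = int(len(ids) * 0.2)
    return {
        "train": ids[:n_train],                # ~70%: learn what a wheat head looks like
        "val": ids[n_train:n_train + n_val],   # ~20%: compare counts to human-tagged images
        "test": ids[n_train + n_val:],         # ~10%: "true testing" on unseen images
    }
```

Keeping the test set untouched until training is essentially complete is what makes its accuracy numbers an honest estimate of how the model will do on farmers' new photos.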
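To see why three variables can be enough for a simple estimator, consider a standard agronomic rule of thumb that combines the same kinds of inputs McConachie names. The exact formula his estimator will use is not published; this sketch substitutes average kernel weight for the variety's historical test weight and should be read as an illustration only:

```python
def estimate_yield_t_per_ha(heads_per_m2, kernels_per_head, kernel_weight_mg):
    """Rough wheat yield estimate in tonnes per hectare from head density,
    kernels per head, and average kernel weight (mg). Illustrative only;
    not the formula WheatScanR will use."""
    grams_per_m2 = heads_per_m2 * kernels_per_head * kernel_weight_mg / 1000.0
    return grams_per_m2 / 100.0  # 1 g/m² equals 0.01 t/ha
```

For example, 500 heads/m² with 30 kernels per head at 35 mg per kernel works out to 5.25 t/ha, which is why an objective, image-based head count is such a useful first input.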
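The FHB severity measure described, counting infected versus non-infected heads, amounts to a simple incidence ratio. How WheatScanR actually weights severity internally is not stated, so the percentage below is an assumed, minimal version:

```python
def fhb_severity_index(infected_heads, total_heads):
    """Assumed incidence-style index: percentage of heads in the image
    showing Fusarium head blight symptoms. WheatScanR's internal
    calculation may differ."""
    if total_heads == 0:
        return 0.0  # no heads detected, so no severity to report
    return 100.0 * infected_heads / total_heads
```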