Xiaoyue (Zoe) Cheng
Department of Mathematics
University of Nebraska at Omaha
Recognition of Crop Alignment for Unmanned Aerial Vehicle Imagery Data
Recent advances in Unmanned Aerial Vehicle (UAV) technologies have spurred aerial imagery data collection and information extraction in a wide range of areas, including high-throughput phenotyping in plant breeding. At present, most numerical measurement of phenotypic traits is still performed manually by human experts, making the process expensive, time-consuming, and hence difficult to scale up. A modern direction is to use high-resolution UAV imagery to infer crop phenotypic variation in a timely manner. To convert the imagery data into phenotypic/genotypic data, we need to label each crop with its genotype and measure the traits of interest. One of the challenges in this new approach is matching the crops in the image with their genotypes based on the experimental layout of the field. Because all the crops are planted in rows, and every few rows of crops belong to one genotype, row identification is crucial to genotype annotation. The goal of this research is to develop methods for automatically recognizing crop alignment from aerial images. Unlike pixel-wise semantic image segmentation or region-based object detection in machine learning, our task is to recognize a queue of conceptual shapes (segments) without pre-labelled annotations. We propose a test-based approach to find the best linear separators between crops and develop a window-shifting method to locate the rows of crops. This work is a collaboration with the Yang Lab and the Schnable Lab at the University of Nebraska-Lincoln.
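To illustrate the general idea behind a window-shifting row locator, the sketch below is a minimal, hypothetical implementation, not the method developed in this work. It assumes crop centroids have already been extracted from the imagery and projected onto the axis perpendicular to the planting rows; a fixed-width window is then shifted along that axis, and dense bands of projected points are taken as rows. The function name `locate_rows` and all parameter values are illustrative assumptions.

```python
# Hypothetical window-shifting sketch (not the authors' actual algorithm):
# slide a fixed-width window along projected crop positions and report
# the center of each dense band as an estimated row location.

def locate_rows(ys, window=1.0, step=0.25, min_count=3):
    """Estimate row centers from 1-D projected crop positions `ys`.

    window    -- width of the shifting window (field-coordinate units)
    step      -- shift amount between consecutive window positions
    min_count -- minimum points inside a window to count as a row
    """
    if not ys:
        return []
    lo, hi = min(ys), max(ys)
    rows, pos = [], lo
    while pos <= hi:
        hits = [y for y in ys if pos <= y < pos + window]
        if len(hits) >= min_count:
            center = sum(hits) / len(hits)
            # successive overlapping windows see the same band;
            # merge their centers instead of emitting duplicate rows
            if rows and abs(center - rows[-1]) < window:
                rows[-1] = (rows[-1] + center) / 2
            else:
                rows.append(center)
        pos += step
    return rows

# Synthetic example: nine crop centroids in three rows near y = 0, 2, 4
ys = [0.0, 0.1, -0.1, 2.0, 2.05, 1.95, 4.0, 3.9, 4.1]
print(locate_rows(ys))
```

In practice the projection axis itself must first be estimated (for instance from the dominant planting direction), and the window width tuned to the row spacing of the field; the sketch assumes both are known.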