KEYWORDS: Photogrammetry, RGB color model, Detection and tracking algorithms, Algorithm development, Agriculture, Systems modeling, Near infrared, Cameras, Accuracy assessment, Point clouds, Genetics
Lodging has been recognized as one of the major destructive factors for crop quality and yield, particularly in corn. A variety of contributing causes, e.g. disease and/or pests, weather conditions, excessive nitrogen, and high plant density, may lead to lodging before the harvest season. Traditional lodging detection strategies rely mainly on ground data collection, which is insufficient in efficiency and accuracy. To address this problem, this research focuses on the use of unmanned aircraft systems (UAS) for automated detection of crop lodging. The study was conducted over an experimental corn field at the Texas A&M AgriLife Research and Extension Center at Corpus Christi, Texas, during the growing season of 2016. Nadir-view images of the corn field were taken weekly by small UAS platforms equipped with consumer-grade RGB and NIR cameras, enabling timely observation of plant growth. 3D structural information of the plants was reconstructed using structure-from-motion photogrammetry. The structural information was then applied to calculate crop height and growth rate. A lodging index for detecting corn lodging was then proposed. Ground truth data of lodging were collected on a per-row basis and used for assessment and tuning of the detection algorithm. Results show the UAS-measured height correlates well with the ground-measured height. More importantly, the lodging index can effectively reflect the severity of corn lodging and yield after harvest.
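The abstract above derives crop height from structure-from-motion products and then computes a lodging index from it. The exact index formulation is not given in the abstract, so the sketch below is only a plausible illustration, not the authors' method: it assumes a canopy height model obtained as the difference between a digital surface model (DSM) and a digital terrain model (DTM), and a hypothetical per-row lodging index defined as the relative height drop from a pre-lodging reference height.

```python
import numpy as np

def canopy_height(dsm, dtm):
    """Canopy height model: surface elevation minus bare-earth elevation,
    clipped at zero to suppress negative differencing noise."""
    return np.clip(np.asarray(dsm, float) - np.asarray(dtm, float), 0.0, None)

def lodging_index(height_now, height_ref):
    """Hypothetical lodging index: relative drop from a reference
    (pre-lodging) height. 0 = fully upright, 1 = completely flattened."""
    ref = np.maximum(np.asarray(height_ref, float), 1e-6)  # avoid divide-by-zero
    return np.clip(1.0 - np.asarray(height_now, float) / ref, 0.0, 1.0)

# Toy example: three rows with a 2.5 m pre-lodging reference height,
# measured again after a lodging event.
ref = np.array([2.5, 2.5, 2.5])
now = np.array([2.4, 1.2, 0.5])
li = lodging_index(now, ref)  # per-row index, higher = more severe lodging
```

Here `li` is roughly [0.04, 0.52, 0.8], so the third row would be flagged as severely lodged; a threshold on such an index could then be tuned against the per-row ground truth described above.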
Unmanned aerial vehicles (UAVs) have advantages over manned vehicles for agricultural remote sensing. Flying UAVs is less expensive, offers more flexible scheduling, and allows lower altitudes and slower speeds, which yield better spatial resolution for imaging. The main disadvantage is that, at lower altitudes and speeds, only small areas can be imaged. However, on large farms with contiguous fields, high-quality images can be collected regularly by using UAVs with appropriate sensing technologies that enable high-quality image mosaics to be created with sufficient metadata and ground-control points. In the United States, rules governing the use of aircraft are promulgated and enforced by the Federal Aviation Administration (FAA), and rules governing UAVs are currently in flux. Operators must apply for appropriate permissions to fly UAVs. In the summer of 2015, Texas A&M University's agricultural research agency, Texas A&M AgriLife Research, embarked on a comprehensive program of remote sensing with UAVs at its 568-ha Brazos Bottom Research Farm. This farm comprises numerous fields where various crops are grown in plots or complete fields. The crops include cotton, corn, sorghum, and wheat. After gaining FAA permission to fly at the farm, the research team used multiple fixed-wing and rotary-wing UAVs along with various sensors to collect images over all parts of the farm at least once per week. This article reports on details of flight operations and sensing and analysis protocols, and it includes some lessons learned in the process of developing a UAV remote-sensing effort of this sort.
Recent development of unmanned aerial systems has created opportunities in automation of field-based high-throughput phenotyping by lowering flight operational cost and complexity and allowing flexible re-visit time and higher image resolution than satellite or manned airborne remote sensing. In this study, flights were conducted over corn and sorghum breeding trials in College Station, Texas, with a fixed-wing unmanned aerial vehicle (UAV) carrying two multispectral cameras and a high-resolution digital camera. The objectives were to establish the workflow and investigate the ability of UAV-based remote sensing for automating data collection of plant traits to develop genetic and physiological models. Most important among these traits were plant height and number of plants which are currently manually collected with high labor costs. Vegetation indices were calculated for each breeding cultivar from mosaicked and radiometrically calibrated multi-band imagery in order to be correlated with ground-measured plant heights, populations and yield across high genetic-diversity breeding cultivars. Growth curves were profiled with the aerial measured time-series height and vegetation index data. The next step of this study will be to investigate the correlations between aerial measurements and ground truth measured manually in field and from lab tests.
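The abstract above computes vegetation indices from radiometrically calibrated multi-band imagery for correlation with ground-measured traits. The specific indices used are not named, so the sketch below assumes the most common one, NDVI (Normalized Difference Vegetation Index), computed per pixel from NIR and red reflectance and then averaged over a plot, purely as an illustration of the workflow.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel from
    calibrated reflectance bands; eps guards against division by zero."""
    nir = np.asarray(nir, float)
    red = np.asarray(red, float)
    return (nir - red) / (nir + red + eps)

# Toy reflectance values: dense canopy, moderate canopy, bare soil.
nir_band = np.array([0.60, 0.50, 0.10])
red_band = np.array([0.10, 0.20, 0.08])
v = ndvi(nir_band, red_band)
plot_mean_ndvi = v.mean()  # a plot-level value for correlating with traits
```

Values near 1 indicate dense green vegetation and values near 0 indicate soil, so per-plot mean NDVI tracked over weekly flights gives the kind of growth-curve profile the abstract describes.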
Seth Murray, Leighton Knox, Brandon Hartley, Mario Méndez-Dorado, Grant Richardson, J. Alex Thomasson, Yeyin Shi, Nithya Rajan, Haly Neely, Muthukumar Bagavathiannan, Xuejun Dong, William Rooney
The next generation of plant breeding progress requires accurate estimates of plant growth and development parameters to be made at routine intervals within large field experiments. Hand measurements are laborious and time consuming, and the most promising tools under development are sensors carried by ground vehicles or unmanned aerial vehicles, with each specific vehicle having unique limitations. Previously available ground vehicles have primarily been restricted to monitoring shorter crops or early growth in corn and sorghum, since plants taller than a meter could be damaged by a tractor or spray rig passing over them. Here we have designed two self-propelled ground vehicles, and already constructed one, with adjustable heights that can clear mature corn and sorghum without damage (over three meters of clearance); these will work for shorter row crops as well. In addition to regular RGB image capture, sensor suites are incorporated to estimate plant height, vegetation indices, canopy temperature, and photosynthetically active solar radiation, all referenced to individual plots using RTK GPS. These ground vehicles will be useful to validate data collected from unmanned aerial vehicles and to support hand measurements taken on plots.
Conference Committee Involvement (10)
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping X
13 April 2025 | Orlando, Florida, United States
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IX
22 April 2024 | National Harbor, Maryland, United States
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VIII
1 May 2023 | Orlando, Florida, United States
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VII
4 April 2022 | Orlando, Florida, United States
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping VI
12 April 2021 | Online Only, Florida, United States
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping V
27 April 2020 | Online Only, California, United States
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping IV
15 April 2019 | Baltimore, MD, United States
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping III
18 April 2018 | Orlando, FL, United States
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping II
10 April 2017 | Anaheim, CA, United States
Autonomous Air and Ground Sensing Systems for Agricultural Optimization and Phenotyping