
Aggie Research Programs

Texas A&M University


Summer 2024: Comparing the Performance of Multi-modal Deep Learning for Cover Crop and Weed Biomass Estimation

Affiliations: Aggie Research Mentoring Program
Project Leader: Joe Johnson

joejohnson2905@tamu.edu

Soil and Crop Sciences

Faculty Mentor: Muthukumar Bagavathiannan, Ph.D.
Meeting Times:
TBA
Team Size:
8
Open Spots: 0
Special Opportunities:
Students will learn new skills, including: 1. the basics of robotics and machine learning; 2. the workings of different robotic sensors; 3. the basics of computer vision and 3D data collection.
Team Needs:
Willingness to work in agricultural fields for data collection and to drive a vehicle; basic knowledge of computer operation and programming; familiarity with mechanical tools.
Description:
This project is a great opportunity to learn the basics of robotics and machine learning in agriculture. Participating students will be involved in data collection, field trips, troubleshooting, and much more. Further details of the project follow.

Cover crop biomass production can be highly variable under field conditions due to microsite variability. Effective estimation and mapping of cover crop performance and biomass production across large field areas is highly valuable for predicting areas of poor weed suppression and planning subsequent management in a site-specific fashion. In this regard, sensors and object localization applications can be beneficial; two prominent data sources for this purpose are optical imagery and Light Detection and Ranging (LiDAR) point clouds. Each data source has unique characteristics that make it useful in specific field applications, and deep learning methods suited to exploiting their complementary characteristics were used to fuse the multi-modal data effectively.

In this research, an autonomous Cartesian robotic system was designed and developed for multi-modal data collection aboard a high-performance field-based wheeled robotic platform. Using these data, features such as canopy spectral reflectance, structure, texture, and category information derived from multiple sensors are investigated for plant biomass prediction within a framework of multi-modal data fusion and deep learning. The proposed data collection pipeline and processing framework achieve satisfactory performance. Here, we successfully demonstrate the potential of multi-modal deep learning for cover crop and weed biomass estimation in agricultural fields; such information can be useful for site-specific management.
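The core idea above, combining per-plot features from two modalities (e.g., spectral reflectance from imagery and canopy structure from LiDAR) and regressing biomass from the fused representation, can be sketched in simplified form. The sketch below uses synthetic data, a simple concatenation ("late fusion") of the two feature vectors, and an ordinary least-squares head in place of a deep network; all feature names, dimensions, and values are illustrative assumptions, not the project's actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-plot feature vectors from the two modalities.
n_plots = 100
img_features = rng.normal(size=(n_plots, 4))    # e.g. spectral reflectance, texture
lidar_features = rng.normal(size=(n_plots, 3))  # e.g. canopy height, point density

# Simple late fusion: concatenate both modalities into one feature vector.
fused = np.hstack([img_features, lidar_features])

# Synthetic ground-truth biomass: a linear mix of both modalities plus noise.
true_w = np.array([0.5, -0.2, 0.3, 0.1, 0.8, 0.4, -0.1])
biomass = fused @ true_w + rng.normal(scale=0.05, size=n_plots)

# Fit a linear regression head on the fused features (least squares).
X = np.hstack([fused, np.ones((n_plots, 1))])  # append a bias column
w, *_ = np.linalg.lstsq(X, biomass, rcond=None)

# Evaluate the fit with root-mean-square error on the training plots.
pred = X @ w
rmse = np.sqrt(np.mean((pred - biomass) ** 2))
print(f"RMSE: {rmse:.3f}")
```

In the project itself, the linear head would be replaced by a deep network, and raw sensor data (images, point clouds) would be processed by modality-specific branches before fusion; the concatenation step shown here is only the simplest possible fusion strategy.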

Written by:
América Soto-Arzat
Published on:
April 10, 2024

Categories: Full
Tags: Summer 2024
