Affiliations: | Vehicle Systems & Control Laboratory (VSCL) – Department of Aerospace Engineering |
Project Leader: | Vinicius Guimaraes Goecks, vinicius.goecks@tamu.edu, Aerospace Engineering |
Faculty Mentor: | Dr. John Valasek, Ph.D. |
Meeting Times: | Wed 11am-noon (subject to change to fit team members' needs) |
Team Size: | 7 (Team Full) |
Open Spots: | 0 |
Special Opportunities: | |
Team Needs: |
There are two main development areas at the moment:
1) Scale down the UAS platform.
Description: We currently fly a vehicle that requires a runway to take off and land, and we would like to replicate the current system on a small aircraft. We already have the small airframe, but we need to investigate how to design a flight computer system that fits within the aircraft's maximum payload.
Needs: Familiarity with (or a desire to learn about) the Raspberry Pi or other single-board computers, UAS hardware, Python, and Linux.
2) Improve the simulation environment and the training of intelligent agents.
Description: Our current agent controls only the roll rate of the aircraft; the ideal learning agent would also control throttle and pitch rate. To make this work, we need a simulated environment that models linear aircraft dynamics and a learning algorithm that can communicate with it (a minimal illustrative sketch of such an environment follows this entry).
Needs: Familiarity with (or a desire to learn about) learning algorithms, Python, and Linux.
|
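To give a flavor of the second development area, here is a minimal sketch of a Gym-style simulation environment around a single-axis linear roll model. Everything in it is an assumption for illustration only: the class name LinearRollEnv, the stability and control derivatives Lp and Lda, the time step, the actuator limit, and the tracking-error reward are placeholders, not the team's actual model or interface. Extending the action vector to include throttle and pitch rate would follow the same pattern with a larger linear state-space model.

# Minimal illustrative sketch (all names and numbers are assumptions,
# not the team's actual model): a Gym-style environment around a
# single-axis linear roll model, so a learning agent can command aileron
# deflection and be rewarded for tracking a commanded roll rate.
import numpy as np

class LinearRollEnv:
    """Linear roll-rate dynamics: p_dot = Lp * p + Lda * da."""

    def __init__(self, dt=0.02, Lp=-4.0, Lda=25.0, p_ref=0.5):
        self.dt = dt        # integration time step [s] (assumed)
        self.Lp = Lp        # roll damping derivative [1/s] (assumed)
        self.Lda = Lda      # aileron effectiveness (assumed)
        self.p_ref = p_ref  # commanded roll rate [rad/s]
        self.p = 0.0        # current roll rate [rad/s]
        self.t = 0.0

    def reset(self):
        self.p, self.t = 0.0, 0.0
        return np.array([self.p, self.p_ref - self.p])

    def step(self, aileron):
        # Euler-integrate the linear model one step, clipped to an actuator limit.
        aileron = float(np.clip(aileron, -0.35, 0.35))
        p_dot = self.Lp * self.p + self.Lda * aileron
        self.p += p_dot * self.dt
        self.t += self.dt
        error = self.p_ref - self.p
        reward = -error ** 2            # penalize roll-rate tracking error
        done = self.t >= 10.0           # fixed-length episode
        return np.array([self.p, error]), reward, done, {}

# Example rollout with a random policy standing in for the learning agent.
env = LinearRollEnv()
obs, done = env.reset(), False
while not done:
    obs, reward, done, info = env.step(np.random.uniform(-0.35, 0.35))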
Description: |
Surveillance and visual tracking of ground targets using Unmanned Air Systems (UAS) is challenging when the camera is strapdown, i.e., fixed to the airframe without pan-and-tilt capability, rather than gimbaled, so the vehicle itself must be steered to orient the camera field of view. Visual tracking is even more difficult when the target follows an unpredictable path. In this project, we fly a fixed-wing small UAS with a fixed, strapdown camera to detect and follow a target (generally a moving vehicle) based on camera information. We use machine learning algorithms (specifically, reinforcement learning) to train an intelligent agent to control the UAS so that the target stays in the image frame (an illustrative sketch of one possible camera-based reward follows this entry). The agent is trained in a simulated environment and later deployed to the real UAS. Please check our website for videos and more information: https://sites.google.com/tamu.edu/vscl-tracking
|
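As a rough illustration of how camera information can drive the learning agent, the sketch below turns a target detection into a reward by penalizing how far the target sits from the image center. The function name, bounding-box format, and image size are assumptions for illustration, not the team's actual vision pipeline or reward.

# Illustrative sketch only (names and formats are assumptions): reward the
# agent for keeping a detected target near the center of the strapdown
# camera's image frame.
import numpy as np

def tracking_reward(bbox, image_width=640, image_height=480):
    """Reward in [-1, 0] from a target bounding box (x_min, y_min, x_max, y_max) in pixels.

    Returns -1 if the detector lost the target (bbox is None).
    """
    if bbox is None:                      # target left the frame
        return -1.0
    x_min, y_min, x_max, y_max = bbox
    cx = 0.5 * (x_min + x_max)            # target centroid in pixels
    cy = 0.5 * (y_min + y_max)
    # Offset from the image center, normalized to [-1, 1] per axis.
    dx = (cx - image_width / 2) / (image_width / 2)
    dy = (cy - image_height / 2) / (image_height / 2)
    # Negative normalized radial distance: 0 when centered, near -1 at the corners.
    return -float(np.clip(np.hypot(dx, dy) / np.sqrt(2.0), 0.0, 1.0))

# Example: a target detected slightly right of center in a 640x480 frame.
print(tracking_reward((330, 200, 370, 260)))   # close to 0 (well centered)
print(tracking_reward(None))                   # -1.0 (target out of frame)

In practice a reward like this would be combined with the simulated flight dynamics during training and with the onboard detector after deployment; the exact shaping used by the team may differ.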