CSCE Capstone
Student Site for Individual and Collaborative Activities
Team 8 – IEEE Robotics
Team Members:
Project Summary:
Every year, the IEEE holds a regional robotics competition for colleges and universities. The University of Arkansas competes in Region 5, and the upcoming 2023 competition will be held in Denver. Our capstone team will compete alongside the Robotics Interdisciplinary Organization of Teams (RIOT), a Registered Student Organization at the University of Arkansas. RIOT wants to compete in the IEEE Regional Competition and needs help designing the software aspects of its project. Our objective is to help RIOT build a functioning robot and drone pair that can compete in the IEEE competition in April 2023. We will approach this by working cohesively as a capstone group and cooperating with RIOT to determine the best solutions to the problems we encounter. Our goal is to make a significant impact on the engineering community at the University of Arkansas by collaborating with engineers of different disciplines and showcasing the ability of its students to design and implement a functional autonomous system.
Reports and Proposals:
Presentations:
Preliminary Proposal Presentation
Preliminary Report Presentation
GitHub and Source Code:
GitHub (README with instructions in the repository)
Team Schedule:
Team Attendance and Notes:
Team Tasks and Dates:
- Send intent to compete to IEEE. 10/2
- Understand the rules of the competition and brainstorm possible solutions. 10/2-10/15
- Create an initial design to pick and order the parts necessary. 10/16-10/29
- Gain an understanding of drone operation, such as how to use the SDK to control its movements. 10/31-11/6
- Learn about physical robot designs. 11/14-11/18
- Learn ROS for robotic programming. 11/21-12/2
- Test stereo camera being used for the robot. 12/5-12/9
- Read QR codes from the drone. 1/26-1/29
- Control the drone from a Python file. 1/29
- Look into interfacing with the ultrasonic sensor. 1/27-2/3
- Detect objects from the ground robot. 1/28-2/3
- Operate the stereo camera in Linux. 1/29-2/3
- Create an FSM for the drone. 1/29-2/3
- Get a version of object detection working on the drone. 1/31-2/6
- Migrate all progress onto the Nvidia Jetson. 2/7
- Have the Nvidia Jetson running Python. 2/3-2/10
- Fly the drone around the room and read a QR code. 1/29-2/12
- Center objects in the camera’s frame using the object detection and rotation function. 2/3-2/15
- Connect the drone and the ground robot using the Nvidia Jetson. 2/6-2/17
- Connect the ultrasonic sensor to the Nvidia Jetson. 2/8-2/22
- Find the openings of the box using the stereo camera. 2/14-2/22
- Connect the stereo camera to the Nvidia Jetson. 2/9-2/23
- Save the vector location of a box and QR code. 2/10-2/24
- Install all Python packages and standardize a version of Python on Jetson. 2/12-2/24
- Have the drone fly around the room and record a video. 2/15-2/24
- Fine-tune the object detection model for the box. 2/16-3/2
- Connect the Jetson to the Pi Camera. 2/16-3/3
- Run the object detection model on the stereo camera. 2/17-3/3
- Read a QR code from the ground robot. 2/17-3/3
- Send data from the drone camera to the Nvidia Jetson. 2/20-3/3
- Practice controlling motors from a robot supplied by our sponsor. 2/24-3/3
- Measure the side of a box using the stereo camera. 2/27-3/10
- Create an object detection model for the poison drone. 2/28-3/14
- Fly the drone and perform flips with the camera on. 3/1-3/15
- Get the distance of an object that is in front of the stereo camera. 3/2-3/16
- Use GPIOs on the practice robot to read data from the sensors. 3/5-3/17
- Transpose a picture of the top of the box. 3/6-3/17
- Connect ODrive to the Jetson. 3/17-3/31
- Start on the hierarchy of the overall program for both the drone and robot. 3/17-4/7
- Integrate all drone functions into one class. 3/28-4/7
- Integrate all ground robot and sensor functions into one class. 3/28-4/7
- Control motors through the ODrive using the Jetson. 4/2-4/16
- Rotate the image of the QR code using the transpose approach for a higher QR read rate. 4/2-4/16
- Control motors through the ODrive. 3/29-4/12
- Complete assembly of the ground robot. 4/18
- Integrate the sensors onto the ground robot. 4/18
- Do final testing and debugging of our software. 4/1-4/20
- Complete functionality of the ground robot. 4/15-4/20
- Compete 4/22
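One of the tasks above is creating an FSM for the drone (later folded into a single drone class). As a minimal sketch of that idea in Python, the states and transition events below are hypothetical placeholders, not the team's actual competition logic:

```python
# Minimal drone finite-state-machine sketch. States and events are
# illustrative assumptions, not the team's real mission plan.
from enum import Enum, auto


class DroneState(Enum):
    IDLE = auto()
    TAKEOFF = auto()
    SEARCH = auto()   # fly around the room looking for QR codes
    READ_QR = auto()
    LAND = auto()


class DroneFSM:
    # (current state, event) -> next state
    TRANSITIONS = {
        (DroneState.IDLE, "start"): DroneState.TAKEOFF,
        (DroneState.TAKEOFF, "airborne"): DroneState.SEARCH,
        (DroneState.SEARCH, "qr_found"): DroneState.READ_QR,
        (DroneState.READ_QR, "qr_decoded"): DroneState.SEARCH,
        (DroneState.SEARCH, "mission_done"): DroneState.LAND,
        (DroneState.LAND, "landed"): DroneState.IDLE,
    }

    def __init__(self) -> None:
        self.state = DroneState.IDLE

    def handle(self, event: str) -> DroneState:
        """Advance the FSM; unknown events leave the state unchanged."""
        self.state = self.TRANSITIONS.get((self.state, event), self.state)
        return self.state
```

Keeping the transition table as data makes it easy to review against the rule book and to extend as new behaviors (flips, video recording) are added.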
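For the ultrasonic-sensor tasks, the standard calculation for echo-style rangefinders converts the round-trip echo pulse duration into a distance. The sensor model and GPIO wiring are not specified above, so this is a hedged sketch of just the conversion math:

```python
# Convert an ultrasonic echo pulse duration into a distance estimate.
# Assumes an echo-style rangefinder (e.g. an HC-SR04-class sensor);
# the pulse would be timed via the Jetson's GPIO pins.
SPEED_OF_SOUND_M_S = 343.0  # in dry air at roughly 20 degrees C


def echo_to_distance_cm(pulse_duration_s: float) -> float:
    """Distance to the obstacle in centimeters.

    The sound travels out to the obstacle and back, so the one-way
    distance is half the total path length.
    """
    return (pulse_duration_s * SPEED_OF_SOUND_M_S / 2.0) * 100.0
```

For example, a 10 ms echo pulse corresponds to about 1.7 m, comfortably within the few-meter range typical of these sensors.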