Document Type

Restricted Campus Only

Publication Date

4-23-2024

Abstract

The Autonomous Rover Team (ART) was tasked with installing an autonomous navigation system onto an inherited rover frame. The navigation system must be able to plan a course between waypoints in a mapped area while avoiding unmapped obstacles. The final deliverables for this project are a physical rover with limited movement capabilities that can be controlled autonomously or manually, and a simulated rover with complete movement capabilities. Both rovers should be able to complete a 1600 ft² course by traveling between three waypoints in a mapped area. The camera system should detect all obstacles in the rover's path in real time so that they can be avoided.

As inherited, the rover frame was unstable and lacked proper control mechanisms. The first goal of the ART was to bring the rover to an operable condition so that it could withstand the final obstacle course. The rocker-bogie suspension was removed and the legs were attached to the frame, which increased torsional rigidity. Other minor modifications were made to the mechanical and electrical subsystems to increase reliability. To verify that the electro-mechanical upgrades increased stability, the ART installed a joystick and wrote a program for manual operation. The joystick-control test showed that all motors responded correctly to the joystick and that the rover could drive without failure over short distances. After repeated turns, however, the front and rear legs began to bow out under the weight of the frame, forcing the operator to pause the rover and readjust the legs. This problem should be resolved when future teams install a more robust suspension and reinforce the chassis. The ART recommends shrinking the frame and constructing the arms from a more rigid material.
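As a rough illustration of how such a manual-control program can be structured, the sketch below maps two joystick axes to left and right wheel speeds in a ROS2 node. The topic names, axis indices, message types, and scale factors are placeholders for illustration only, not the team's actual configuration.

# Illustrative joystick teleoperation sketch (topic names, axes, and
# scaling are assumptions, not the team's actual values).
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Joy
from std_msgs.msg import Float64

class JoystickTeleop(Node):
    def __init__(self):
        super().__init__('joystick_teleop')
        # Joystick axes arrive from the standard joy driver node.
        self.create_subscription(Joy, 'joy', self.joy_callback, 10)
        # Publish one speed per side for a skid-steer drive.
        self.left_pub = self.create_publisher(Float64, 'left_wheel_speed', 10)
        self.right_pub = self.create_publisher(Float64, 'right_wheel_speed', 10)

    def joy_callback(self, msg):
        forward = 0.5 * msg.axes[1]   # vertical stick axis -> forward speed
        turn = 0.5 * msg.axes[0]      # horizontal stick axis -> turn rate
        left, right = Float64(), Float64()
        left.data = forward - turn    # simple skid-steer mixing
        right.data = forward + turn
        self.left_pub.publish(left)
        self.right_pub.publish(right)

def main():
    rclpy.init()
    rclpy.spin(JoystickTeleop())
    rclpy.shutdown()

if __name__ == '__main__':
    main()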

The rover is controlled by a single-board computer, the NVIDIA Jetson Nano, which uses ROS2 (Robot Operating System 2) to interface external components with the navigation system. The Jetson runs an older version of ROS2 that most current libraries no longer support. So far, the team has interfaced the Jetson with the ZED2 camera, which detects objects and produces odometry data through a ROS2 node. The ART also interfaced the Jetson with the motors through a ROS2 node that translates keyboard inputs into motor commands. Although the waypoint-follower code is not complete, the NAV2 library contains a script for creating a waypoint-follower node. If future teams can interface this node with the camera and motor-control nodes, the autonomous navigation system will be complete.
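A hedged sketch of how future teams might drive NAV2's waypoint following from Python is shown below, assuming a ROS2 distribution recent enough to include the nav2_simple_commander package and assuming the camera and motor nodes already supply the odometry and velocity interfaces NAV2 expects. The three waypoint coordinates are placeholders, not the coordinates of the actual course.

# Minimal waypoint-following sketch using nav2_simple_commander
# (coordinates and frame setup are illustrative assumptions).
import rclpy
from geometry_msgs.msg import PoseStamped
from nav2_simple_commander.robot_navigator import BasicNavigator

def make_pose(navigator, x, y):
    pose = PoseStamped()
    pose.header.frame_id = 'map'
    pose.header.stamp = navigator.get_clock().now().to_msg()
    pose.pose.position.x = x
    pose.pose.position.y = y
    pose.pose.orientation.w = 1.0  # face along the map x-axis
    return pose

def main():
    rclpy.init()
    navigator = BasicNavigator()
    navigator.waitUntilNav2Active()  # wait for the NAV2 stack to come up

    # Three placeholder waypoints standing in for the course waypoints.
    waypoints = [make_pose(navigator, 1.0, 0.0),
                 make_pose(navigator, 2.0, 2.0),
                 make_pose(navigator, 0.0, 3.0)]
    navigator.followWaypoints(waypoints)

    while not navigator.isTaskComplete():
        feedback = navigator.getFeedback()  # reports the current waypoint index

    print(navigator.getResult())
    rclpy.shutdown()

if __name__ == '__main__':
    main()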

The simulated rover is not constrained by the same software limitations as the Jetson, but the computers available in the lab space lack the proprietary hardware needed to simulate the ZED2 camera. The ART has simulated both simplified and complete models of the rover, neither of which includes the camera, and added a script to move the simplified rover with the arrow keys. Updated libraries should allow future teams to develop the simulated navigation system quickly once the hardware has been successfully installed and tested. In conclusion, the ART has not completed the autonomous navigation system, but it has created a foundation on which future teams can build their software.
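The sketch below illustrates one way such an arrow-key script can be written, reading raw keypresses from the terminal and publishing velocity commands to the simulated rover. The command topic name, speeds, and quit key are placeholders, not the team's actual values.

# Illustrative arrow-key teleoperation script for the simulated rover.
import sys, termios, tty
import rclpy
from geometry_msgs.msg import Twist

# Arrow keys arrive as the escape sequence ESC [ A-D in raw terminal mode.
ARROWS = {'A': (0.5, 0.0),   # up: drive forward
          'B': (-0.5, 0.0),  # down: drive backward
          'C': (0.0, -1.0),  # right: turn right
          'D': (0.0, 1.0)}   # left: turn left

def read_key():
    fd = sys.stdin.fileno()
    old = termios.tcgetattr(fd)
    try:
        tty.setraw(fd)
        ch = sys.stdin.read(1)
        if ch == '\x1b':              # escape sequence: read '[' and the letter
            ch = sys.stdin.read(2)[-1]
        return ch
    finally:
        termios.tcsetattr(fd, termios.TCSADRAIN, old)

def main():
    rclpy.init()
    node = rclpy.create_node('arrow_key_teleop')
    pub = node.create_publisher(Twist, 'cmd_vel', 10)
    while True:
        key = read_key()
        if key == 'q':                # placeholder quit key
            break
        cmd = Twist()
        cmd.linear.x, cmd.angular.z = ARROWS.get(key, (0.0, 0.0))
        pub.publish(cmd)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()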

Comments

Dr. Keith Bartels, Team Adviser
