Laboratory Validation of Vision Based Grasping, Guidance and Control with Two Nanosatellite Models

Session V: Guidance and Control
Aug 7th, 9:45 AM

Abstract

The goal of this work is to demonstrate the autonomous proximity operation capabilities of a 3U scale cubesat in performing the simulated tasks of docking, charging, relative navigation, and deorbiting of space debris, as a step towards designing a fully robotic cubesat. The experiments were performed on an air-bearing testbed, using an engineering model of a 3U scale cubesat equipped with cold-gas propulsion. An appendage with a gripper is integrated into the model to enable grasping. Onboard vision and control algorithms are employed to perform precise navigation and manipulation tasks. Three experiments incorporating the tasks above have been successfully demonstrated.

Hardware: The experimental setup consists of two 3U cubesat engineering models, an air-bearing testbed, and a motion capture system. The current cubesat model is derived from a previous version that was used to demonstrate autonomous point-to-point navigation and obstacle avoidance. The model comprises three main subsystems: 3D-printed cold-gas propulsion, sensing and computing, and power. In addition, we developed and integrated an appendage with a multipurpose end effector capable of grasping objects and of docking to and charging a second cubesat model. The sensor suite consists of pressure sensors, an inertial measurement unit (IMU), short-range IR sensors, and a camera. An Odroid XU4 computer with an octa-core processor was chosen to satisfy the computational, power, and form-factor constraints of the model.
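The sensing stack runs under ROS (described in the Software paragraph below). As a rough, hypothetical illustration of how such a sensor suite might be exposed to the rest of the system, the following minimal rospy sketch publishes IMU, IR-range, and tank-pressure readings; the topic names, the 50 Hz rate, and the stubbed read_* functions are assumptions for illustration and do not come from the model's actual software.

#!/usr/bin/env python
# Hypothetical sketch only: a minimal rospy node exposing the sensor suite.
# Topic names, rates, and the stubbed read_* functions are illustrative.
import rospy
from sensor_msgs.msg import FluidPressure, Imu, Range

def read_imu():
    # Placeholder for the IMU driver; a real node would fill in orientation,
    # angular velocity, and linear acceleration here.
    return Imu()

def read_ir_range():
    # Placeholder for a short-range IR distance measurement.
    return Range(radiation_type=Range.INFRARED)

def read_tank_pressure():
    # Placeholder for a cold-gas tank pressure reading.
    return FluidPressure()

def main():
    rospy.init_node("cubesat_sensing")
    imu_pub = rospy.Publisher("imu/data_raw", Imu, queue_size=10)
    ir_pub = rospy.Publisher("ir/range", Range, queue_size=10)
    tank_pub = rospy.Publisher("propulsion/tank_pressure", FluidPressure, queue_size=10)
    rate = rospy.Rate(50)  # assumed 50 Hz sensing loop
    while not rospy.is_shutdown():
        imu_pub.publish(read_imu())
        ir_pub.publish(read_ir_range())
        tank_pub.publish(read_tank_pressure())
        rate.sleep()

if __name__ == "__main__":
    main()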

Software: The perception and control algorithms for the proximity operations were developed and implemented using the open-source Robot Operating System (ROS) framework as the communication middleware. The perception algorithm estimates the 3D pose, and its rate of change, of the cubesat and of objects of interest in its vicinity. Object detection requires a textured 3D model of each object and works by matching SURF features extracted from a camera image to features generated from the 3D model; object tracking employs KLT tracking with outlier rejection to obtain robust estimates. The textured 3D model is constructed from multi-view images, although it can also be generated from a CAD model. A state machine automatically switches between the desired control behaviors.
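A minimal sketch of this detection-and-tracking step, using OpenCV's Python bindings, is shown below: SURF descriptors from a camera frame are matched against descriptors precomputed from the textured 3D model, a pose is recovered with a RANSAC PnP solve, and tracked points are filtered with a forward-backward consistency check. The thresholds, the model_descriptors and model_points_3d inputs, and the choice of solvePnPRansac are assumptions for illustration; the paper does not describe the implementation at this level of detail.

import cv2
import numpy as np

# Illustrative only: detection by SURF matching against a textured 3D model,
# pose recovery via RANSAC PnP, and KLT tracking with a forward-backward check.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)  # requires opencv-contrib
matcher = cv2.BFMatcher(cv2.NORM_L2)

def detect_target(frame, model_descriptors, model_points_3d, K, dist_coeffs):
    """Return (rvec, tvec) of the target in the camera frame, or None."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    keypoints, descriptors = surf.detectAndCompute(gray, None)
    if descriptors is None:
        return None
    # Lowe's ratio test discards ambiguous correspondences.
    knn = matcher.knnMatch(descriptors, model_descriptors, k=2)
    good = [p[0] for p in knn if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]
    if len(good) < 6:
        return None
    img_pts = np.float32([keypoints[m.queryIdx].pt for m in good])
    obj_pts = np.float32([model_points_3d[m.trainIdx] for m in good])
    # RANSAC PnP rejects outlier matches while estimating the 3D pose.
    ok, rvec, tvec, _ = cv2.solvePnPRansac(obj_pts, img_pts, K, dist_coeffs)
    return (rvec, tvec) if ok else None

def track_points(prev_gray, gray, prev_pts):
    """KLT (pyramidal Lucas-Kanade) tracking with forward-backward rejection.

    prev_pts is a float32 array of shape (N, 1, 2); returns the surviving
    points in the new frame and the boolean mask of inliers.
    """
    nxt, st_fwd, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    back, st_bwd, _ = cv2.calcOpticalFlowPyrLK(gray, prev_gray, nxt, None)
    fb_error = np.linalg.norm(prev_pts - back, axis=-1).reshape(-1)
    keep = (st_fwd.reshape(-1) == 1) & (st_bwd.reshape(-1) == 1) & (fb_error < 1.0)
    return nxt[keep], keep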

Experiment: The system's performance is validated through three experiments showcasing precise relative navigation, docking, and reconfiguration. The first experiment is a simple docking and reconfiguration maneuver in which a "primary" cubesat detects and navigates to the closest face of a passive "secondary" cubesat, then deploys its appendage and docks. The primary then navigates the joined system to a final goal position. In a variation of this experiment, the primary transmits power to the secondary after docking, with the transfer indicated by an LED on the secondary. The second experiment explores debris deorbiting: the same docking procedure is performed, followed by unlatching and releasing the secondary along a desired velocity vector. In the last experiment, the primary and secondary execute relative navigation along a set path while maintaining formation.
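As a hypothetical illustration of the state machine mentioned in the Software paragraph, the following Python sketch captures a simplified version of the docking, reconfiguration, and deorbit sequence; the mode names, the 0.3 m capture range, and the transition conditions are assumptions for illustration rather than the actual flight logic.

from enum import Enum, auto

class Mode(Enum):
    SEARCH = auto()      # detect the secondary with the vision pipeline
    APPROACH = auto()    # navigate toward its closest face
    DEPLOY = auto()      # deploy the appendage and close the gripper
    DOCKED = auto()      # latched; the primary now controls the joined system
    RELOCATE = auto()    # move the joined stack toward the goal pose
    RELEASE = auto()     # unlatch and impart the desired velocity (deorbit case)
    DONE = auto()

def step(mode, target_visible, range_m, latched, at_goal, deorbit):
    """One transition of a simplified mission state machine (illustrative)."""
    if mode is Mode.SEARCH and target_visible:
        return Mode.APPROACH
    if mode is Mode.APPROACH and range_m < 0.3:   # assumed capture range in meters
        return Mode.DEPLOY
    if mode is Mode.DEPLOY and latched:
        return Mode.DOCKED
    if mode is Mode.DOCKED:
        return Mode.RELOCATE
    if mode is Mode.RELOCATE and at_goal:
        return Mode.RELEASE if deorbit else Mode.DONE
    if mode is Mode.RELEASE and not latched:
        return Mode.DONE
    return mode

With deorbit=False the sequence ends after relocating the joined system, as in the first experiment; deorbit=True adds the release step used in the deorbit experiment.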

Additional details can be found here: https://asco.lcsr.jhu.edu/nanosatellite-guidance-navigation-and-control
