Session

Technical Session III: Advanced Sensor Concepts

Abstract

Close-proximity operations, in which a satellite manoeuvres within a very small distance of another spacecraft, are a topic of increasing interest. Conducting such manoeuvres safely requires highly accurate estimates of the other spacecraft's relative position and orientation. Traditionally, active systems such as radar, and more recently Differential GPS, have been used for relative position estimation, yet these give little information on the orientation of the other satellite. Passive imaging can provide a large amount of information on the location and orientation of the Target, with high spatial resolution. Imaging requires only low-powered cameras, which can be carried by a wider range of satellites, and demands no functionality from the other spacecraft. A robust, autonomous, computer-vision-based system for close-range relative orientation and location (pose) estimation is proposed. Using a single image and knowledge of the Target spacecraft, the system estimates the Target's six relative rotation and translation parameters from a distance on the order of 10 metres. Accumulated over time, such position and rotation estimates will allow relative orbit parameters to be estimated and enable close-proximity operations such as docking and remote inspection.
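
The abstract does not name a specific algorithm, but one common way to recover six relative rotation and translation parameters from a single image of a spacecraft with a known geometric model is to solve a Perspective-n-Point (PnP) problem. The sketch below illustrates that formulation with OpenCV's solvePnP; the 3D feature coordinates, 2D pixel detections and camera intrinsics are hypothetical placeholder values, not data from this paper, and the method shown is an assumption rather than the authors' own pipeline.

```python
import numpy as np
import cv2

# Hypothetical 3D coordinates (metres) of known features on the Target
# spacecraft, expressed in the Target body frame (placeholder values).
model_points = np.array([
    [ 0.5,  0.5, 0.0],
    [-0.5,  0.5, 0.0],
    [-0.5, -0.5, 0.0],
    [ 0.5, -0.5, 0.0],
    [ 0.0,  0.0, 0.3],
    [ 0.0,  0.0, -0.3],
], dtype=np.float64)

# Hypothetical pixel locations of the same features detected in a single
# image taken from roughly 10 m away (placeholder values).
image_points = np.array([
    [420.0, 310.0],
    [360.0, 308.0],
    [362.0, 372.0],
    [422.0, 374.0],
    [392.0, 340.0],
    [390.0, 344.0],
], dtype=np.float64)

# Assumed pinhole camera intrinsics (focal length and principal point in pixels).
K = np.array([[800.0,   0.0, 400.0],
              [  0.0, 800.0, 300.0],
              [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)  # assume negligible lens distortion

# Solve the PnP problem: recover the three rotation and three translation
# parameters of the Target relative to the camera from one image.
ok, rvec, tvec = cv2.solvePnP(model_points, image_points, K, dist_coeffs,
                              flags=cv2.SOLVEPNP_ITERATIVE)
if ok:
    R, _ = cv2.Rodrigues(rvec)  # 3x3 relative rotation matrix
    print("Relative rotation:\n", R)
    print("Relative translation (m):", tvec.ravel())
```

In a full system the 2D detections would come from a feature-detection front end matched against the known Target model, and the resulting single-image pose estimates could be filtered over time to support relative orbit parameter estimation.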

Aug 22nd, 7:29 PM

Pose Estimation of Target Satellite for Proximity Operations
