Session

Technical Session I: Advanced Technologies I

SSC13-I-7.pdf (713 kB)

Abstract

The Smartphone Video Guidance Sensor (SVGS) is a miniature, self-contained autonomous rendezvous and docking sensor developed using a commercial off-the-shelf Android-based smartphone. SVGS enables proximity operations and formation flying capabilities for small satellite platforms by determining the relative position and orientation between two CubeSats or other small satellites. The sensor performs pose estimation by using the smartphone camera and flash to illuminate and capture an image of retroreflective targets mounted in a known pattern on the target spacecraft. The resulting image is then processed using a modification of algorithms originally developed at NASA Marshall Space Flight Center for the Advanced Video Guidance Sensor (AVGS), which successfully flew on the Demonstration for Autonomous Rendezvous Technology (DART) and Orbital Express missions. These algorithms use simple geometric photogrammetry techniques to determine the six-degree-of-freedom state of one spacecraft relative to the other. All image processing and computation are performed on the smartphone itself, and the calculated relative state is then provided to the host spacecraft's other guidance, navigation, and control subsystems.
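The geometric idea behind the photogrammetry step can be illustrated with a minimal pinhole-camera sketch. This is not the flight algorithm (AVGS-derived code solves the full six-DOF problem from a larger target pattern); it is a hypothetical two-target example showing how range and line-of-sight bearing fall out of the known target spacing and the measured image centroids. All names and parameters below are illustrative assumptions.

```python
import math

def estimate_range_and_bearing(centroids_px, target_spacing_m,
                               focal_length_px, principal_point_px):
    """Illustrative pinhole-model sketch (not the SVGS flight code):
    given image centroids of two retroreflective targets a known
    distance apart, recover range and line-of-sight bearing angles.
    """
    (u1, v1), (u2, v2) = centroids_px
    # Apparent separation of the two target blobs in the image, in pixels.
    sep_px = math.hypot(u2 - u1, v2 - v1)
    # Pinhole model: range = focal length * true spacing / apparent spacing.
    range_m = focal_length_px * target_spacing_m / sep_px
    # Bearing of the pattern midpoint relative to the camera boresight.
    cu, cv = principal_point_px
    mid_u = (u1 + u2) / 2.0
    mid_v = (v1 + v2) / 2.0
    azimuth_rad = math.atan2(mid_u - cu, focal_length_px)
    elevation_rad = math.atan2(mid_v - cv, focal_length_px)
    return range_m, azimuth_rad, elevation_rad
```

For example, two targets 0.1 m apart that appear 20 px apart through a 1000 px focal length imply a 5 m range. Recovering full relative attitude as well requires at least three or four non-coplanar targets and a perspective-n-point style solution, which is what the pattern-based AVGS-heritage algorithms provide.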

Aug 12th, 3:44 PM

Smartphone Video Guidance Sensor for Small Satellites
