Technical Session X: Communications

Abstract

The Galileo mission to Jupiter is now faced with extreme limitations in its communication downlink capability as a result of the inability of ground controllers to open its main antenna. Consequently, the rate at which the Galileo spacecraft can acquire data will exceed its capability to transmit those data back by several orders of magnitude. Several proposed low-cost missions to the outer planets face similar constraints. Because of the very large disparity between the collection data rate and the downlink data rate for such missions, data must be accumulated in mass memory for later transmission. This paper addresses the situation in which the total number of transmitted bits (downlink data rate × total downlink time) is far less than the number of accumulated data bits stored in mass memory. This is a data compression problem in which the data set is all the stored data (which may come from multiple sources) and the goal is to maximize the total value of all returned data using a specified (but limited) number of bits. Typically, this problem is not simply a many-fold duplication of the more familiar compression problem of maximizing the value (quality) of smaller data sets (e.g., single images or subsets of images). Instead, a basic assumption is that almost everything that matters can vary over the span of the stored data and over the period of its communication: data characteristics, user priorities, data rate, fidelity requirements, scientific value, etc. This paper concentrates on stored databases containing primarily image data, which has historically been the dominant data source for deep-space missions. Practical "global rate allocation and control" strategies are developed which tie together all the data compression operations that might be performed on all subsets of the stored data, such that a fixed number of bits is used overall.
The control structures developed here are not static; they allow continuous adjustment during communication to accommodate variations in compression performance, unexpected changes in data characteristics, autonomous discovery (such as from pattern recognition and feature extraction), and user intervention. In doing so, these strategies attempt to redistribute unused bits to the data subsets where they will do the most good. Many present-day compression algorithms fit directly within this rate control structure with little modification.
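The core allocation problem sketched in the abstract can be illustrated with a toy example. The sketch below is not the paper's algorithm; it assumes each stored data subset has a concave value-versus-bits curve (modeled here, purely for illustration, as a priority weight times a logarithm) and greedily grants bit increments to whichever subset currently offers the greatest marginal value until the fixed downlink budget is exhausted.

```python
# Hypothetical sketch of greedy global bit allocation across stored
# data subsets, under an assumed concave value model v_i(b) =
# priority_i * log(1 + b). Not taken from the paper.
import math


def allocate_bits(priorities, total_bits, step=1):
    """Greedily split `total_bits` among subsets by marginal value."""
    alloc = [0] * len(priorities)
    remaining = total_bits
    while remaining >= step:
        # Marginal value of granting `step` more bits to subset i
        # under the assumed value curve priority_i * log(1 + b).
        def gain(i):
            b = alloc[i]
            return priorities[i] * (math.log1p(b + step) - math.log1p(b))

        best = max(range(len(priorities)), key=gain)
        alloc[best] += step
        remaining -= step
    return alloc
```

Because the allocation is recomputed one increment at a time, bits freed by better-than-expected compression of one subset (or a mid-pass change in user priorities) can simply be fed back into the remaining budget and redistributed, which mirrors the non-static control behavior described above.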

Sep 1st, 1:59 PM

Global Rate Allocation and Control
