#### Abstract

Frequent calibration against noise references is used to reduce the effects of time-varying fluctuations that naturally occur within sensitive radiometer receivers. Over the years, calibration architectures and processing algorithms have become more sophisticated. Predicting the performance of a given calibration architecture often requires expensive hardware prototyping, and optimizing processing algorithms for nonstationary fluctuations is a challenge. Measurement uncertainty is a figure of merit by which to compare performance, although its analytical calculation can be difficult for complex architectures or adaptive algorithms. This presentation describes a numerical simulator for modeling and analysis of nonstationary time series. Applications of the simulator include synthesizing novel radiometer architectures, analyzing adaptive calibration algorithms, evaluating means for specifying and validating receiver/amplifier stability, and developing Noise Assisted Data Analysis (NADA) of time series data. The simulator includes a pseudorandom noise generator that produces time series of multiple reference signals with known noise power. When used within a calibration algorithm, these stationary pseudorandom series produce time-invariant statistics at the algorithm output. Fluctuations within a radiometer are simulated by modulating the pseudorandom reference noise series, and the influence of nonstationary fluctuations can then be quantified as excess uncertainty over the time-invariant statistics. For radiometer studies, the simulator has distinct advantages over building hardware prototypes: architectures can be rapidly synthesized, and adaptive algorithms can be tested against a variety of nonstationary forcing signals that model internal fluctuations of the receiver. The simulator also serves as a test bed for Ensemble Detection and Analysis (EDA).
EDA is a form of NADA whereby ensemble data sets are produced by modulating signals under test with sequences of noise of various powers. Processing of ensemble data sets enables novel filtering algorithms and the extraction of information through statistical analyses that are otherwise not possible with a single time series.
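The excess-uncertainty approach described above can be sketched numerically. The following is a minimal, hypothetical illustration (not the simulator itself, whose internals are not given here): stationary pseudorandom hot/cold reference series with known powers pass through a simple two-point calibration, first with unit receiver gain and then with a modulated (drifting) gain, and the influence of the fluctuation appears as excess standard deviation at the calibration output. All powers, block sizes, and the sinusoidal forcing signal are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
n_blocks, block = 1000, 100          # integration blocks (illustrative)

# Known reference noise powers (illustrative values)
P_hot, P_cold, P_ant = 4.0, 1.0, 2.0

def run(gain_cal, gain_meas):
    """Per-block two-point (hot/cold) calibration after square-law
    detection. gain_cal scales the reference observations and gain_meas
    the antenna observation; a mismatch between them models receiver
    drift between the calibration and measurement phases."""
    out = np.empty(n_blocks)
    for k in range(n_blocks):
        # Detected powers with finite integration (sample variance)
        ph = np.var(gain_cal[k]  * rng.normal(0, np.sqrt(P_hot),  block))
        pc = np.var(gain_cal[k]  * rng.normal(0, np.sqrt(P_cold), block))
        pa = np.var(gain_meas[k] * rng.normal(0, np.sqrt(P_ant),  block))
        # Linear two-point calibration back to the reference power scale
        out[k] = P_cold + (pa - pc) * (P_hot - P_cold) / (ph - pc)
    return out

ones = np.ones(n_blocks)
T_stat = run(ones, ones)             # stationary case: time-invariant statistics

# Nonstationary forcing signal: slow sinusoidal gain drift between
# calibration and measurement (illustrative assumption)
drift = 1.0 + 0.2 * np.sin(2 * np.pi * np.arange(n_blocks) / 50.0)
T_fluc = run(ones, drift)

# Influence of the fluctuation, quantified as excess uncertainty
excess = np.std(T_fluc) - np.std(T_stat)
print(f"stationary std:     {np.std(T_stat):.3f}")
print(f"nonstationary std:  {np.std(T_fluc):.3f}")
print(f"excess uncertainty: {excess:.3f}")
```

In the stationary case the gain factor cancels in the calibration ratio, so the output statistics are time-invariant up to finite-integration noise; the drift between calibration and measurement breaks that cancellation, and the added spread is the excess uncertainty attributed to the nonstationary fluctuation.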

#### A Numerical Simulator for Noise Calibration Studies
