Optimization of Thickness Allocation Between Two Fused Silica Filters

Abstract
A recurring challenge in bandpass thermal radiation measurement applications requiring a high degree of accuracy is the treatment of secondary radiation emitted by the filter itself. Considered here is the important case of the short-wavelength channels implemented in CERES (the Clouds and the Earth's Radiant Energy System), RBI (the Radiation Budget Instrument), and most other Earth radiation budget instruments. In these applications, broadband radiation incident between 0.2 and 100 μm traverses a low-pass filter whose cut-off wavelength is around 5 μm. The filter material of choice is fused silica, which filters by absorption. Ideally, most of the heat absorbed beyond 5 μm is conducted to the filter mount so that the filter temperature remains essentially constant during a scan cycle. In practice, however, the filter temperature must vary to some extent, resulting in time-varying emission from both faces. Because of the relatively low temperature of the filter, around 300 K, its peak emission occurs at around 10 μm. Radiation emitted from the face nearest the detector inevitably produces a phase-delayed, time-varying noise component that adds to, and is indistinguishable from, the radiation passed by the filter. A common solution to this problem is to introduce a second filter, in series with the primary filter, that intercepts and absorbs the emitted noise component, as shown in the figure.
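The quoted 10 μm peak is consistent with Wien's displacement law (a standard result, stated here only for reference):

    λ_max · T ≈ 2898 μm·K,  so at T = 300 K,  λ_max ≈ 2898 μm·K / 300 K ≈ 9.7 μm,

which lies well beyond the 5 μm cut-off of the filter.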
Because space in the optical stack-up is often limited, it is desirable to distribute the total available space optimally between the two filters; i.e., to determine the optimum value of the ratio t2/(t1 + t2), where t1 and t2 are the thicknesses of the primary and secondary filters, respectively. A process for accomplishing this, based on a hybrid thermal-diffusion/ray-trace model, is reported.
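A minimal sketch of such an allocation search follows. The noise model is purely illustrative: the function noise_metric, its decay constant, and the total thickness T_TOTAL are assumptions for demonstration, not values from the source. In the reported process, this quantity would instead be produced by the hybrid thermal-diffusion/ray-trace model.

```python
# Illustrative one-dimensional search for the thickness ratio
# r = t2 / (t1 + t2) that minimizes a detector-noise metric.
import numpy as np

T_TOTAL = 2.0  # total available filter thickness, mm (assumed value)

def noise_metric(ratio, t_total=T_TOTAL):
    """Stand-in for the phase-delayed emission noise reaching the
    detector as a function of r = t2 / (t1 + t2). In the actual
    process this value would come from the hybrid
    thermal-diffusion/ray-trace model; the form below is invented
    solely to exhibit a trade-off with an interior minimum."""
    t2 = ratio * t_total           # secondary (blocking) filter thickness
    # A thin secondary filter absorbs little of the primary filter's
    # ~10 um emission (leakage term), while a thick secondary filter
    # contributes more of its own time-varying emission (emission term).
    leak_through_secondary = np.exp(-3.0 * t2)
    emission_from_secondary = 0.2 * t2 / t_total
    return leak_through_secondary + emission_from_secondary

# Grid search over candidate ratios; one model evaluation per candidate.
ratios = np.linspace(0.01, 0.99, 99)
noise = np.array([noise_metric(r) for r in ratios])
best = ratios[np.argmin(noise)]
print(f"optimum t2/(t1 + t2) ~ {best:.2f} (illustrative model only)")
```

A simple grid search suffices here because the design variable is the single scalar ratio t2/(t1 + t2); the only cost is one thermal/ray-trace model evaluation per candidate ratio.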