Session

Weekend Session 1: Advanced Technologies - Research & Academia I

Location

Utah State University, Logan, UT

Abstract

Multi-view stereo photogrammetric techniques are conventionally used to generate Global Digital Elevation Models (GDEMs) of planetary and lunar surfaces. However, these methods rely on conventional feature detectors and are therefore prone to inaccuracies caused by changes in lighting conditions, including diffuse reflection and harsh shading. This has limited their ability to accurately reconstruct shadowed regions in orbital imagery, such as heavily shaded urban areas and the permanently shadowed regions (PSRs) on the lunar surface, which are critical targets for NASA’s Artemis program. Neural Radiance Fields (NeRFs) offer a novel solution to these limitations by breaking away from the traditional photogrammetric assumption of rigid, opaque surfaces: NeRFs can reconstruct 3D objects with variably transmissive properties and reflective surfaces. In this summary analysis, we examine the robustness of NeRFs in generating high-fidelity 3D terrain models from heavily shaded orbital imagery acquired by satellites in low Earth orbit (LEO) and emphasize their applicability to the lunar environment. We showcase emerging NeRF-derived methods that overcome the limitations of traditional photogrammetric methods and provide a promising solution for reconstructing complex scenes under challenging lighting conditions.

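For readers unfamiliar with the mechanism the abstract contrasts with rigid-surface photogrammetry, the sketch below illustrates the volume-rendering quadrature at the core of NeRFs: a field returns a density and color at each sample along a camera ray, and the pixel color is the transmittance-weighted sum of those samples. This is a minimal, self-contained toy example, not the presenters' implementation; the names toy_radiance_field and render_ray are hypothetical, and a trained NeRF MLP would replace the toy field function.

# Minimal sketch of NeRF-style volume rendering along a single camera ray.
# Hypothetical illustration only; a trained NeRF MLP would replace toy_radiance_field.
import numpy as np

def toy_radiance_field(points):
    """Stand-in for a trained NeRF MLP: per-point density and RGB color for points of shape (N, 3)."""
    # Toy scene: density peaks near a sphere of radius 1 centered at the origin.
    dist = np.linalg.norm(points, axis=-1)
    density = np.exp(-4.0 * (dist - 1.0) ** 2)                      # shape (N,)
    color = np.stack([0.5 + 0.5 * np.tanh(points[:, 0]),
                      np.full(len(points), 0.4),
                      np.full(len(points), 0.6)], axis=-1)          # shape (N, 3)
    return density, color

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=128):
    """Composite color along one ray: C = sum_i T_i * (1 - exp(-sigma_i * delta_i)) * c_i."""
    t = np.linspace(near, far, n_samples)                           # sample depths along the ray
    points = origin[None, :] + t[:, None] * direction[None, :]
    density, color = toy_radiance_field(points)

    delta = np.diff(t, append=t[-1] + (t[-1] - t[-2]))              # spacing between samples
    alpha = 1.0 - np.exp(-density * delta)                          # opacity of each segment
    # Transmittance T_i: probability the ray reaches sample i without being absorbed.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1] + 1e-10]))
    weights = trans * alpha
    return (weights[:, None] * color).sum(axis=0)                   # accumulated RGB

if __name__ == "__main__":
    rgb = render_ray(origin=np.array([0.0, 0.0, -3.0]),
                     direction=np.array([0.0, 0.0, 1.0]))
    print("rendered ray color:", rgb)

Because each sample contributes in proportion to its transmittance rather than being treated as a single opaque hit, the same machinery can represent partially transmissive or reflective material, which is the property the abstract highlights relative to traditional photogrammetric assumptions.
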
Aug 5th, 9:00 AM

A Summary of Neural Radiance Fields for Shadow Removal and Relighting of Satellite Imagery
