Design and Development of a Neural Surface Rendering Model for Lunar Satellite Photogrammetry

Session

Weekday Poster Session 1, Aug 6th, 9:00 AM

Location

Utah State University, Logan, UT

Abstract

Neural volumetric scene representations encode the color and density of points in 3D space by optimizing an underlying continuous volumetric scene function. These methods focus on synthesizing novel views of objects and ground-level scenes from a set of sparse input views. However, they underperform on photogrammetric tasks of geometric reconstruction from multi-view satellite imagery in environments with diffuse reflection and harsh shading. This limits their ability to accurately generate Digital Elevation Models (DEMs) of the lunar permanently shadowed regions (PSRs) from multi-view satellite imagery. By incorporating techniques such as a learned distance function of the surface geometry and an explicit illumination model, we present a new neural rendering scheme for surface reconstruction tailored specifically to the lunar environment.
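For context, the scene function such methods optimize is typically rendered with the standard volume rendering integral along a camera ray r(t) = o + t d; this is the generic NeRF-style formulation, not notation taken from the poster itself:

C(\mathbf{r}) = \int_{t_n}^{t_f} T(t)\,\sigma(\mathbf{r}(t))\,\mathbf{c}(\mathbf{r}(t),\mathbf{d})\,dt,
\qquad T(t) = \exp\!\Big(-\int_{t_n}^{t} \sigma(\mathbf{r}(s))\,ds\Big)

where \sigma is volume density, \mathbf{c} is emitted color, and T(t) is the accumulated transmittance along the ray.

The two ingredients the abstract names can also be sketched concretely. Below is a minimal, hypothetical Python illustration, not the authors' implementation: (a) converting a learned signed distance function into volume density, in the style of SDF-based renderers such as NeuS, and (b) an explicit Lambertian sun model of the kind suited to harsh lunar shading. Here sdf_sphere stands in for a learned SDF network, and all parameter values are illustrative assumptions.

import numpy as np

def sdf_sphere(p, radius=1.0):
    """Stand-in for a learned SDF network: signed distance to a sphere."""
    return np.linalg.norm(p, axis=-1) - radius

def sdf_to_density(sdf_vals, s=50.0):
    """Logistic density used in SDF-based volume rendering (NeuS-style).
    Density peaks at the zero level set, i.e. on the surface."""
    sig = 1.0 / (1.0 + np.exp(-s * sdf_vals))
    return s * sig * (1.0 - sig)

def numerical_normal(sdf, p, eps=1e-4):
    """Surface normal via central finite differences of the SDF."""
    grads = []
    for i in range(3):
        d = np.zeros(3)
        d[i] = eps
        grads.append((sdf(p + d) - sdf(p - d)) / (2 * eps))
    n = np.stack(grads, axis=-1)
    return n / np.linalg.norm(n, axis=-1, keepdims=True)

def lambertian(albedo, normal, sun_dir):
    """Explicit illumination: direct sunlight only, with hard falloff
    to zero where the surface faces away from the sun (no ambient term)."""
    return albedo * np.clip(normal @ sun_dir, 0.0, None)

# Example: shade a point on the stand-in sphere's surface.
p = np.array([0.0, 0.0, 1.0])           # point on the unit sphere
n = numerical_normal(sdf_sphere, p)     # approximately (0, 0, 1)
sun = np.array([0.0, 0.0, 1.0])         # sun directly overhead
print(sdf_to_density(sdf_sphere(p)))    # density is maximal at the surface
print(lambertian(0.12, n, sun))         # illustrative low, regolith-like albedo

Under such a shading model, radiance depends only on albedo, the local surface normal, and the known sun direction, which separates geometry from illumination more cleanly than a view-dependent color network in low-albedo, hard-shadow scenes.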

SSC24-P1-19.pdf (5115 kB)

 