Fusion of high resolution multi-spectral imagery for surface soil moisture estimation using learning machines

Location

Eccles Conference Center

Event Website

http://water.usu.edu

Start Date

4-2-2014 4:30 PM

End Date

4-2-2014 4:45 PM

Description

Many crop production management decisions made by agricultural growers, production managers, and crop advisors can be informed by data from high-resolution aerial images. This is because the spectral reflectance of vegetation provides an indication of plant health as influenced by soil fertility, soil moisture availability, plant canopy, plant disease, and pests. However, estimation of soil moisture values from remotely sensed data is rarely accomplished. Unmanned aerial vehicle technology and high-resolution multispectral imagery have proven valuable for precise management of agricultural lands (precision agriculture). Activities such as vegetation canopy mapping, derivation of vegetation indices, crop and soil temperature estimation, crop nitrogen estimation, and others have been demonstrated to be feasible, affordable, and precise. AggieAir™ is a small, autonomous unmanned aircraft developed by the Utah Water Research Laboratory at Utah State University that carries multispectral cameras to acquire aerial imagery in the red, green, blue, near-infrared, and thermal spectra. This study reports on the development of two models, based on artificial neural network (ANN) and relevance vector machine (RVM) learning machines, that translate AggieAir imagery into acceptable estimates of surface soil moisture for a large field irrigated by a center-pivot sprinkler system. The performance of these models appears to be good as measured by a variety of statistical parameters.
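
For illustration only, the following is a minimal sketch, not the authors' implementation, of how a learning-machine regressor of this kind could map per-pixel band reflectances (red, green, blue, near-infrared) plus a thermal value to surface soil moisture, with accuracy summarized by statistics such as RMSE and R². The data are synthetic placeholders, the ANN stands in via scikit-learn's MLPRegressor, and an RVM would simply replace the regressor (scikit-learn has no built-in RVM); all variable names, shapes, and settings are assumptions.

```python
# Sketch of an ANN-based soil moisture regressor; data and settings are illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)

# Placeholder features standing in for co-located AggieAir pixel values:
# columns are [red, green, blue, nir, thermal]; targets are synthetic
# volumetric surface soil moisture values, not measured data.
X = rng.uniform(0.0, 1.0, size=(500, 5))
y = 0.05 + 0.3 * X[:, 3] - 0.2 * X[:, 4] + rng.normal(0.0, 0.01, 500)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

# Scale inputs, then fit a small feed-forward neural network regressor.
scaler = StandardScaler().fit(X_train)
model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
model.fit(scaler.transform(X_train), y_train)

# Evaluate with common statistical parameters (RMSE and R^2).
pred = model.predict(scaler.transform(X_test))
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
print("R^2 :", r2_score(y_test, pred))
```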

https://digitalcommons.usu.edu/runoff/2014/2014Abstracts/37