Advanced airborne hyperspectral remote sensing to support forest management


CALL: 2009

DOMAIN: SR - Environmental and Earth Sciences


LAST NAME: Schlerf





START: 2010-01-01



Submitted Abstract

Forest management should encompass the many functions related to forest resources. It requires detailed data to execute current operations, to build up records of past activities, and to predict the long-term impacts of management decisions. In support of this, the HYPERFOREST proposal aims at providing foresters with detailed, spatially explicit information on forest vitality, species composition, canopy closure, etc., based on airborne hyperspectral remote sensing data.

Due to the complex nature of hyperspectral remote sensing data sets, a complete imagery pre-processing chain must be set up to perform standard corrections for the radiometric, geometric and atmospheric effects that might corrupt the data. Moreover, bidirectional effects caused by the heterogeneous character of terrestrial targets influence the captured (airborne) hyperspectral signal. Forests are such heterogeneous surfaces and may have pronounced vegetation structure, which will also affect the accuracy of hyperspectral-derived thematic products useful in forest management practices. What is more, forest management plans cannot be effective without proper knowledge of forest vegetation structure.

Hence, this project targets (i) the development of an advanced airborne hyperspectral imagery pre-processing chain (e.g. for APEX) that considers vegetation structure effects, and hence bidirectional effects, on the captured signal, (ii) the delivery of a robust methodology to extract forest thematic products from this pre-processed imagery, and (iii) intensive interaction with end-users, whose feedback will facilitate the supply of tuned, more end-user-oriented forest thematic products.
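The staged corrections described above (radiometric, geometric/atmospheric, then bidirectional effects) can be pictured as a simple pipeline. The sketch below is illustrative only: every function body is a simplified stand-in (invented gains, path radiance and a toy view-angle kernel), not the operational APEX processing chain.

```python
import numpy as np

# Illustrative sketch of a staged hyperspectral pre-processing chain.
# All coefficients below are invented placeholders, not APEX calibration values.

def radiometric_correction(dn, gain=0.002, offset=0.0):
    """Convert raw digital numbers to at-sensor radiance (linear model)."""
    return gain * dn + offset

def atmospheric_correction(radiance, path_radiance=0.5, transmittance=0.8,
                           solar_irradiance=150.0):
    """Simplified conversion of at-sensor radiance to surface reflectance."""
    return np.pi * (radiance - path_radiance) / (transmittance * solar_irradiance)

def brdf_normalization(reflectance, view_zenith_deg, k_geo=0.05):
    """Toy kernel-style adjustment for view-angle (bidirectional) effects."""
    return reflectance * (1.0 - k_geo * np.sin(np.radians(view_zenith_deg)))

def preprocess(dn_cube, view_zenith_deg):
    radiance = radiometric_correction(dn_cube)
    reflectance = atmospheric_correction(radiance)
    return brdf_normalization(reflectance, view_zenith_deg)

# A 2-pixel, 3-band toy image of raw digital numbers.
cube = np.array([[4000.0, 6000.0, 8000.0],
                 [4200.0, 5800.0, 7900.0]])
corrected = preprocess(cube, view_zenith_deg=15.0)
print(corrected.shape)  # (2, 3)
```

The point of the sketch is the ordering: bidirectional (view-angle) normalization is applied last, on reflectance, which is where the project proposes to inject vegetation-structure information.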
First of all, this requires the determination of forest structure parameters (for instance crown density, vertical LAI distribution, etc.) at the forest test sites (three plot locations in Flanders: Wijnendalebos, Aelmoeseneiebos and Kersselaerspleyn, as indicated on the map of form 7), derived from full dendrometric inventories and from fine-spatial-scale terrestrial and coarser-scale airborne LiDAR measuring campaigns. In order to identify the structure parameters that contribute most to the hyperspectral signal, radiative transfer models will be used. Reference forest canopy spectral data will be collected using field spectroradiometers in the Aelmoeseneiebos (where a measuring tower is available). Canopy leaf picking and leaf biochemical analysis (chlorophyll, dry matter and water content) will be conducted, since these are crucial inputs to the radiative transfer models.

The analysis of the effects of vegetation structure on hyperspectral signatures will be accomplished using a bottom-up (frog's-eye view, with the terrestrial LiDAR) and a top-down (bird's-eye view, with the airborne LiDAR and the terrestrial LiDAR mounted on the measuring tower) approach. The bottom-up approach starts by feeding gradually coarser vegetation structure data (from high to low detail) – the structure that most affects the hyperspectral signal – into the radiative transfer models. From this analysis, the minimum required level of canopy structure information that can also be obtained from airborne LiDAR data is assessed at a spatially explicit scale. Once the most contributing structure parameters are available from airborne LiDAR data, combined with their quantified effect on hyperspectral signals, a procedure can be developed to build an advanced hyperspectral imagery pre-processing chain (for APEX data) that considers the impact of vegetation structure and its bidirectional effects on the captured signal.
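The bottom-up degradation experiment can be illustrated with a toy canopy model: simulate reflectance once from a detailed vertical LAI profile and once from its coarse single-value summary, and measure how much signal the lost structural detail accounts for. The Beer–Lambert-style model and the leaf/soil spectra below are invented for the sketch, not the project's actual radiative transfer models.

```python
import numpy as np

# Toy canopy reflectance model (Beer-Lambert gap fraction) used to illustrate
# the bottom-up sensitivity analysis. All spectra and the extinction
# coefficient k are invented for this sketch.

def canopy_reflectance(lai, leaf_refl, soil_refl, k=0.5):
    """Per-band reflectance as a gap-fraction mix of soil and leaf signals."""
    gap = np.exp(-k * lai)                      # canopy gap fraction
    return gap * soil_refl + (1.0 - gap) * leaf_refl

leaf = np.array([0.05, 0.45, 0.40])             # toy red / NIR / SWIR bands
soil = np.array([0.15, 0.20, 0.25])

# Fine structural detail: heterogeneous LAI values across sub-canopy patches.
lai_patches = np.array([0.5, 1.0, 3.0, 5.5])

# Fine simulation: model each patch, then average the simulated spectra.
fine = np.mean([canopy_reflectance(l, leaf, soil) for l in lai_patches], axis=0)

# Coarse simulation: model once from the aggregated (mean) LAI.
coarse = canopy_reflectance(lai_patches.mean(), leaf, soil)

# Because the model is nonlinear in LAI, coarsening the structure input
# changes the simulated signal; this difference is what the degradation
# experiment quantifies.
print(np.abs(fine - coarse).max())
```

Repeating this comparison across coarsening levels indicates the minimum structural detail that must survive, which is the level the airborne LiDAR products then need to deliver.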
This procedure will be based on a comparison of the original APEX signals with simulated ones from radiative transfer models, with airborne-derived vegetation structure parameters as inputs. Finally, a methodology based on deep belief neural networks will be developed to produce forest parameters from the pre-processed imagery.
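The retrieval step amounts to learning the inverse of a forward model: train a network on (spectrum, parameter) pairs generated by radiative transfer simulation, then apply it to observed spectra. The sketch below substitutes a plain multilayer perceptron for the deep belief networks the project proposes, and a toy three-band forward model for the real radiative transfer code; both substitutions are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)

def forward_model(lai):
    """Toy 3-band canopy reflectance as a function of LAI (invented)."""
    gap = np.exp(-0.5 * np.asarray(lai))        # gap fraction per sample
    leaf = np.array([0.05, 0.45, 0.40])
    soil = np.array([0.15, 0.20, 0.25])
    return gap[:, None] * soil + (1.0 - gap[:, None]) * leaf

# Training set: simulated spectra with a little sensor noise added.
lai_train = rng.uniform(0.2, 6.0, size=500)
X = forward_model(lai_train) + rng.normal(0.0, 0.005, size=(500, 3))
y = lai_train

# Small MLP standing in for the proposed deep belief network.
net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=3000, random_state=0)
net.fit(X, y)

# Invert an unseen spectrum simulated for LAI = 3.0.
pred = net.predict(forward_model([3.0]))
print(float(pred[0]))  # retrieved LAI for the unseen spectrum
```

In the project itself, the training pairs would come from the radiative transfer models driven by LiDAR-derived structure parameters, and the inputs would be the pre-processed APEX reflectance spectra.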
