This is a Preprint and has not been peer reviewed. This is version 3 of this Preprint.

Advancing vegetation monitoring with virtual laser scanning of dynamic scenes (VLS-4D): Opportunities, implementations and future perspectives
Abstract
- Virtual Laser Scanning (VLS) is an established and valuable research tool in forestry and ecology, widely used to simulate labelled LiDAR point cloud data for sensitivity analysis, model training and method testing. In VLS, vegetation has traditionally been modelled as static, neglecting the influence of vegetation dynamics on LiDAR point cloud representations and limiting applications to mono-temporal analyses.
- In this review, we formalise VLS-4D, a framework that extends traditional VLS by using dynamic (i.e., 4D: 3D + time) input scenes. This advancement has opened new avenues for research on vegetation monitoring. We outline key concepts for representing dynamic scenes in LiDAR simulations, review technical implementations, and present innovative VLS-4D applications.
- We find that current simulation frameworks suitable for vegetation applications do not yet fully support dynamic scenes. While LiDAR time series of vegetation growth can be generated from static scene snapshots, simulating the effects of vegetation movement during a scan remains a challenge (both cases are illustrated in the sketch after this abstract). We group the reviewed applications of VLS-4D into three key methodological areas: i) investigating LiDAR data acquisition and vegetation movement effects, ii) testing and validating new methods for change detection and analysis, and iii) generating labelled training data for machine and deep learning.
- We recommend that future efforts focus on extending the functionality of current LiDAR simulators and increasing the availability of open-source tools for modelling dynamic vegetation to enable more realistic simulations. Used as a complement, not a replacement, to real data, VLS-4D has the potential to significantly advance LiDAR-based vegetation monitoring by improving our understanding of point cloud representations, enabling reliable algorithm validation, and providing high-quality training data for deep learning.
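As a minimal, purely illustrative sketch of the two scene-dynamics cases named above (not the API of HELIOS++ or any other simulator reviewed in the paper), the toy Python example below "scans" a one-dimensional canopy transect: a growth time series is built from static scene snapshots frozen per epoch, while a swaying scene is re-evaluated at each pulse timestamp so that motion during the scan leaves a trace in the point cloud. All function names, the growth and sway models, and the parameter values are assumptions made for illustration only.

```python
import numpy as np

RNG = np.random.default_rng(42)

def canopy_height(x, growth_years, sway=0.0):
    """Toy canopy surface: height increases with age and sways laterally (assumed model)."""
    base = 5.0 + 1.2 * growth_years                  # assumed linear growth
    return base + 0.5 * np.sin(0.8 * (x + sway))     # spatial structure + sway offset

def scan(x_positions, scene_fn, scan_duration=1.0, ranging_noise=0.02):
    """Fire one pulse per beam position; the scene may change between pulses."""
    t = np.linspace(0.0, scan_duration, x_positions.size)        # pulse timestamps
    z = np.array([scene_fn(xi, ti) for xi, ti in zip(x_positions, t)])
    return np.column_stack([x_positions, z + RNG.normal(0.0, ranging_noise, z.size)])

x = np.linspace(0.0, 20.0, 200)   # beam footprints along a transect

# (a) Growth time series from static snapshots: the scene is frozen per epoch,
#     so the within-scan time argument is ignored.
epochs = [0, 2, 4]   # years
growth_series = [
    scan(x, lambda xi, ti, yr=yr: canopy_height(xi, growth_years=yr))
    for yr in epochs
]

# (b) Movement during a scan: the canopy sways while pulses are fired, so early
#     and late pulses see a slightly different scene than a static snapshot would.
swaying_scan = scan(
    x, lambda xi, ti: canopy_height(xi, growth_years=0, sway=0.3 * np.sin(6.0 * ti))
)

static_scan = growth_series[0]
print("Mean |dz| between static and swaying scan: "
      f"{np.mean(np.abs(swaying_scan[:, 1] - static_scan[:, 1])):.3f} m")
```

The underlying design choice is that the scene is a function of both space and time: freezing the time argument per epoch reproduces the snapshot-based time-series workflow, whereas coupling it to the pulse timestamps reproduces within-scan dynamics.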
DOI
https://doi.org/10.31223/X51Q5V
Subjects
Computer Sciences, Earth Sciences, Environmental Sciences, Geographic Information Sciences, Geography, Numerical Analysis and Scientific Computing, Physical and Environmental Geography, Remote Sensing
Keywords
Virtual LiDAR, LiDAR simulation, 3D animation, change analysis, multi-temporal point clouds, synthetic training data
Dates
Published: 2024-10-07 14:25
Last Updated: 2025-03-27 18:44
License
CC BY Attribution 4.0 International
Additional Metadata
Data Availability (reason data are not yet available):
The input data for the LiDAR simulations and the resulting point clouds used to create the figures will be made available via the institutional repository for Open Research Data from Heidelberg University upon submission.