LI Jiangeng, ZHANG Yin, LIU Yingchun. Forest Height Estimation Method Based on Kernel Gradient Boosting Decision Tree[J]. Journal of Beijing University of Technology, 2021, 47(10): 1113-1121. DOI: 10.11936/bjutxb2019100013

    Forest Height Estimation Method Based on Kernel Gradient Boosting Decision Tree

    • To address the large data disturbance and the high variance in the spatial distribution of tree heights present in large-footprint light detection and ranging (LiDAR) waveforms, this paper introduces a kernel function called the kernel gradient boosting decision tree (KeGBDT). KeGBDT computes the weight of each connection function from the output value of the corresponding leaf node in the decision tree, and uses the weighted sum of the connection functions as the expression of the kernel function, thereby avoiding the error caused by the uneven distribution of observations within leaf nodes. In the experiments, waveform features extracted from Geoscience Laser Altimeter System (GLAS) data were used as the forest height estimation dataset. KeGBDT was compared with kernel random forests (KeRF), the linear kernel, the Gaussian kernel, and other commonly used kernel functions; in addition, ridge regression and support vector regression based on KeGBDT were compared with other regression algorithms, such as linear regression, GBDT, and random forests, on the forest height estimation task. Results show that regression based on KeGBDT outperforms the commonly used kernel functions and regression algorithms in both R-squared and root mean square error, and that KeGBDT effectively reduces the bias of the forest height estimation model.
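    The idea of a tree-ensemble-induced kernel can be sketched as follows. This is a minimal illustration, not the paper's exact method: it uses scikit-learn's `GradientBoostingRegressor` and, as a simplification, an unweighted same-leaf indicator as the connection function (KeRF-style), whereas KeGBDT weights each connection function by the leaf-node output values. The function name `gbdt_kernel` is an assumed identifier for illustration.

    ```python
    # Hedged sketch of a GBDT-induced kernel (simplified; the paper's KeGBDT
    # additionally weights connection functions by leaf-node output values).
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    def gbdt_kernel(model, X_a, X_b):
        """Kernel value = fraction of trees in which two samples share a leaf."""
        # apply() returns the leaf index of each sample in every boosting stage
        leaves_a = model.apply(X_a).reshape(len(X_a), -1)
        leaves_b = model.apply(X_b).reshape(len(X_b), -1)
        n_trees = leaves_a.shape[1]
        K = np.zeros((len(X_a), len(X_b)))
        for t in range(n_trees):
            # connection function: 1 if both samples fall in the same leaf of tree t
            K += (leaves_a[:, t][:, None] == leaves_b[None, :, t])
        return K / n_trees
    ```

    A kernel matrix built this way can be passed to kernel ridge regression or SVR via scikit-learn's `kernel='precomputed'` option, mirroring the KeGBDT-based regressors evaluated in the paper.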
