Forest Height Estimation Method Based on Kernel Gradient Boosting Decision Tree

Abstract: To address the large waveform disturbance and the uneven spatial distribution of tree heights in large-footprint light detection and ranging (LiDAR) data, this paper draws on the Boosting ensemble idea and proposes an improved kernel function, the kernel gradient boosting decision tree (KeGBDT). KeGBDT computes the weights of the connection functions from the output values of the gradient boosting decision tree (GBDT) leaf nodes and uses the weighted sum of the connection functions as the expression of the kernel, thereby avoiding the error caused by the uneven distribution of observations within leaf nodes. In the experiments, waveform features extracted from spaceborne Geoscience Laser Altimeter System (GLAS) data were used as the forest height estimation dataset. On this dataset, KeGBDT was compared with kernel random forests (KeRF), the linear kernel, the Gaussian kernel, and other commonly used kernel functions within ridge regression and support vector regression (SVR). In addition, the KeGBDT-based ridge regression and SVR models were compared with regression algorithms such as linear regression, GBDT, and random forests. The results show that the KeGBDT-based regression algorithms outperform the common kernel functions and regression algorithms in both the coefficient of determination (R²) and the root-mean-square error (RMSE), effectively reducing the regression error of forest height estimation models.
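To make the kernel construction concrete, the following is a minimal Python sketch of a KeGBDT-style kernel built on scikit-learn's GradientBoostingRegressor. The connection function is the standard same-leaf indicator used by kernel forests such as KeRF; the specific weighting below (per-tree normalised absolute leaf outputs) is an illustrative assumption, since the abstract states that the weights come from the leaf-node output values but does not give their exact formula. X_train, y_train, and X_test in the usage example are placeholders for a waveform-feature dataset.

    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.kernel_ridge import KernelRidge

    def kegbdt_kernel(model, X_a, X_b):
        """KeGBDT-style kernel: connection functions weighted by leaf outputs.

        The weighting scheme (normalised absolute output of the shared leaf)
        is an assumption for illustration, not the paper's exact formula.
        """
        # Leaf index of every sample in every tree; reshape handles the
        # (n_samples, n_estimators) and (n_samples, n_estimators, 1) cases.
        leaves_a = model.apply(X_a).reshape(len(X_a), -1)
        leaves_b = model.apply(X_b).reshape(len(X_b), -1)
        n_trees = leaves_a.shape[1]
        K = np.zeros((len(X_a), len(X_b)))
        for m in range(n_trees):
            tree = model.estimators_[m, 0].tree_
            values = tree.value.ravel()  # output value of every tree node
            # Connection function: 1 when both samples share a leaf in tree m.
            same_leaf = leaves_a[:, m][:, None] == leaves_b[:, m][None, :]
            # Assumed weight: absolute output of the leaf containing the
            # sample, normalised so each tree contributes on the same scale.
            w = np.abs(values[leaves_a[:, m]])[:, None]
            K += same_leaf * (w / (np.abs(values).max() + 1e-12))
        return K / n_trees

    # Hypothetical usage with kernel ridge regression:
    gbdt = GradientBoostingRegressor(n_estimators=100).fit(X_train, y_train)
    ridge = KernelRidge(kernel="precomputed").fit(
        kegbdt_kernel(gbdt, X_train, X_train), y_train)
    heights = ridge.predict(kegbdt_kernel(gbdt, X_test, X_train))

The same precomputed Gram matrix can be passed to sklearn.svm.SVR(kernel="precomputed") to reproduce the SVR comparison described in the abstract.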

       
