A large-sample theory for infinitesimal gradient boosting

Archive ouverte

Dombry, Clement | Duchamps, Jean-Jil

Published by CCSD

36 pages. Infinitesimal gradient boosting is defined as the vanishing-learning-rate limit of the popular tree-based gradient boosting algorithm from machine learning (Dombry and Duchamps, 2021). It is characterized as the solution of a nonlinear ordinary differential equation in an infinite-dimensional function space, where the infinitesimal boosting operator driving the dynamics depends on the training sample. We consider the asymptotic behavior of the model in the large-sample limit and prove its convergence to a deterministic process. This infinite-population limit is again characterized by a differential equation, which now depends on the population distribution. We explore some properties of this population limit: we prove that the dynamics makes the test error decrease, and we consider its long-time behavior.
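As a reading aid, the characterization described in the abstract can be sketched schematically; the notation below ($\hat F^{(n)}_t$, $\hat{\mathcal T}_n$, $\mathcal T$, the zero initial condition) is assumed for illustration and is not taken from the paper. The boosted model evolves in function space according to a sample-dependent ODE,
\[
\frac{d}{dt}\hat F^{(n)}_t = \hat{\mathcal T}_n\big(\hat F^{(n)}_t\big), \qquad \hat F^{(n)}_0 = 0,
\]
and in the large-sample limit $n \to \infty$ the dynamics is governed by a deterministic equation of the same form,
\[
\frac{d}{dt} F_t = \mathcal T(F_t), \qquad F_0 = 0,
\]
where $\hat{\mathcal T}_n$ denotes the infinitesimal boosting operator built from a training sample of size $n$ and $\mathcal T$ its population counterpart, determined by the population distribution.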


Suggestions

By the same author

Opening the species box: what parsimonious microscopic models of speciation have to say about macroevolution

Archive ouverte | Couvert, Élisa | CCSD

International audience. In the last two decades, lineage-based models of diversification, where species are viewed as particles that can divide (speciate) or die (become extinct) at rates depending on some evolving ...

Mutations on a random binary tree with measured boundary

Archive ouverte | Duchamps, Jean-Jil | CCSD

International audience

Trees within trees II: Nested Fragmentations

Archive ouverte | Duchamps, Jean-Jil | CCSD

37 pages, 6 figures. Similarly to (Blancas et al. 2018), where nested coalescent processes are studied, we generalize the definition of partition-valued homogeneous Markov fragmentation processes to the setting o...
