LightGBM

LightGBM, short for Light Gradient-Boosting Machine, is a free and open-source distributed gradient-boosting framework for machine learning, originally developed by Microsoft. It is based on decision tree algorithms and used for ranking, classification and other machine learning tasks. The development focus is on performance and scalability.


Overview
The LightGBM framework supports different algorithms, including GBT, GBDT, GBRT, GBM, MART and RF. LightGBM has many of XGBoost's advantages, including sparse optimization, parallel training, multiple loss functions, regularization, bagging, and early stopping. A major difference between the two lies in the construction of trees. LightGBM does not grow a tree level-wise, row by row, as most other implementations do. Instead it grows trees leaf-wise: it chooses the leaf with the maximum delta loss to grow. In addition, LightGBM does not use the widely used pre-sorted decision tree learning algorithm, which searches for the best split point on sorted feature values, as XGBoost and other implementations do. Instead, LightGBM implements a highly optimized histogram-based decision tree learning algorithm, which yields great advantages in both efficiency and memory consumption. The LightGBM algorithm also utilizes two novel techniques, Gradient-Based One-Side Sampling (GOSS) and Exclusive Feature Bundling (EFB), which allow the algorithm to run faster while maintaining a high level of accuracy.
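
As a rough illustration, the following minimal sketch trains a model through LightGBM's Python API. The dataset is synthetic and the parameter values are only illustrative, but `num_leaves` bounds the leaf-wise growth described above and `max_bin` sets the number of histogram bins used for split finding.

```python
import lightgbm as lgb
import numpy as np

# Synthetic binary-classification data (sizes are illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

train_set = lgb.Dataset(X, label=y)

params = {
    "objective": "binary",
    # Leaf-wise growth is bounded by the number of leaves rather than depth.
    "num_leaves": 31,
    # Histogram-based splitting buckets feature values into discrete bins.
    "max_bin": 255,
    "learning_rate": 0.1,
}

booster = lgb.train(params, train_set, num_boost_round=100)
print(booster.predict(X[:5]))  # predicted probabilities for the first rows
```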

LightGBM works on Linux, Windows, and macOS and supports C++, Python, R, and C#. The source code is licensed under the MIT License and available on GitHub.


Gradient-based one-side sampling
When using gradient descent, one thinks about the space of possible configurations of the model as a valley, in which the lowest part of the valley is the model that most closely fits the data. In this metaphor, one walks in different directions to learn how much lower the valley becomes.

Typically, in gradient descent, one uses the whole set of data to calculate the valley's slopes. However, this commonly used method assumes that every data point is equally informative.
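
To make the metaphor concrete, here is a minimal, self-contained sketch of plain gradient descent on a one-dimensional quadratic "valley"; the function and step size are invented for illustration.

```python
# Gradient descent on a simple quadratic valley: f(w) = (w - 3)^2.
# The gradient f'(w) = 2 * (w - 3) points uphill, so we step the other way.
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0    # starting position in the valley
lr = 0.1   # step size
for _ in range(100):
    w -= lr * grad(w)

print(w)   # converges toward the minimum at w = 3
```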

By contrast, Gradient-Based One-Side Sampling (GOSS), a method first developed for gradient-boosted decision trees, does not rely on the assumption that all data points are equally informative. Instead, it treats data points with smaller gradients (shallower slopes) as less informative and randomly drops most of them. This is intended to filter out data that may have been influenced by noise, allowing the model to capture the underlying relationships in the data more accurately.
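
The sketch below shows the sampling step of GOSS as described in the original LightGBM paper: keep the top fraction of instances by absolute gradient, randomly sample from the rest, and up-weight the sampled small-gradient instances so gain estimates stay approximately unbiased. The function name `goss_sample` and the default fractions are illustrative, not part of the library's API.

```python
import numpy as np

def goss_sample(gradients, a=0.2, b=0.1, rng=None):
    """Illustrative sketch of Gradient-Based One-Side Sampling (GOSS).

    Keeps the top `a` fraction of instances by absolute gradient,
    randomly samples a `b` fraction of the remainder, and up-weights
    the sampled small-gradient instances by (1 - a) / b.
    """
    rng = rng or np.random.default_rng()
    n = len(gradients)
    order = np.argsort(-np.abs(gradients))   # largest |gradient| first
    top_k = int(a * n)
    rand_k = int(b * n)

    top_idx = order[:top_k]                  # large-gradient rows: always kept
    sampled_idx = rng.choice(order[top_k:], size=rand_k, replace=False)

    idx = np.concatenate([top_idx, sampled_idx])
    weights = np.ones(len(idx))
    weights[top_k:] *= (1.0 - a) / b         # compensate for the dropped rows
    return idx, weights

# Usage: select rows and weights for building the next tree.
grads = np.random.default_rng(0).normal(size=1000)
rows, row_weights = goss_sample(grads)
```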


Exclusive feature bundling
Exclusive feature bundling (EFB) is a near-lossless method of reducing the number of effective features. In a sparse feature space, many features are nearly exclusive, meaning they rarely take nonzero values simultaneously; one-hot encoded features are a perfect example. EFB bundles these features together, reducing dimensionality to improve efficiency while maintaining a high level of accuracy. The merger of a group of exclusive features into a single feature is called an exclusive feature bundle.
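
A simplified sketch of the bundling idea follows. The helper `bundle_exclusive` is hypothetical and assumes perfectly exclusive, already-binned integer features; the real EFB uses a greedy graph-based grouping that also tolerates a small fraction of conflicts. Each feature's nonzero bins are shifted by a cumulative offset so values from different features occupy disjoint ranges of one bundled feature.

```python
import numpy as np

def bundle_exclusive(features, bin_counts):
    """Merge mutually exclusive (binned) features into one bundle.

    `features` is a 2-D integer array whose columns never take nonzero
    bin values at the same time; `bin_counts` gives the number of bins
    per column. Offsetting keeps every original value recoverable.
    """
    bundled = np.zeros(features.shape[0], dtype=np.int64)
    offset = 0
    for col, n_bins in zip(features.T, bin_counts):
        nonzero = col != 0
        bundled[nonzero] = col[nonzero] + offset
        offset += n_bins
    return bundled

# Example: three one-hot columns that never fire together.
X = np.array([[1, 0, 0],
              [0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])
print(bundle_exclusive(X, bin_counts=[1, 1, 1]))  # -> [1 2 3 0]
```

Because the offsets assign each original feature its own value range, a split on the bundled feature can express any split on the original features, which is why the reduction is near-lossless.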

