Every day, data scientists and machine learning practitioners try to improve their algorithms for better accuracy and better results; some succeed and some fail. In this article we will discuss one of the most successful machine learning algorithms, LightGBM.
What is LightGBM?
‘Fast, distributed, high-performance gradient boosting’: put all of these together and you get a robust framework called LightGBM. It is based on decision tree algorithms and is used for classification, ranking and many other machine learning tasks.
It works slightly differently from other tree-based algorithms: while most algorithms grow trees level by level (horizontally), LightGBM grows a tree leaf by leaf, always choosing to split the leaf with the maximum delta loss.
It belongs to the class of models called gradient boosters, and the reason it has ‘Light’ as a prefix is its high speed: LightGBM is capable of handling large datasets while utilizing comparatively little memory.
LightGBM is often compared with XGBoost, and at times it gets really tough to choose between the two. On many public datasets, LightGBM has proved its worth by processing large datasets faster than the latter, while its predictions are almost the same as XGBoost's.
Some great advantages of using LightGBM:
- Faster training speed and higher efficiency
- Lower memory usage
- Better prediction accuracy
- Support for GPU and parallel learning
- Good handling of large-scale data
As we accumulate more data, it becomes important to have fast and efficient processing methods. This is the main reason LightGBM has started to gain popularity. Take note, though: LightGBM is not advised for small datasets, as it can easily overfit them.
LightGBM is easy to implement with some knowledge of its basic parameters. Below is how to install LightGBM on your PC (using conda):
- `conda install -c conda-forge lightgbm`
Below is a simple Python example using LightGBM:
Hope you liked it. Follow me for more.
If you found this article interesting, why not review the other articles in our archive?