Analysis and implementation of regression trees
- Theoretical analysis of regression trees and tree-based estimation methods such as CART, Random Forest (bagging) and Gradient Boosting;
- Implementation of regression trees in R and comparison of the performance obtained with different libraries (see the sketch after this list):
  - Regression trees / bagging: rpart, ipred, caret;
  - Bagging - Random Forest: randomForest, ranger, h2o;
  - Boosting - Gradient Boosting: gbm, xgboost.
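The actual analysis scripts live in this repository; purely as a rough illustration of the kind of comparison described above, the sketch below fits one model from each family on the Boston housing data from the MASS package. The dataset, hyperparameters, and RMSE helper are assumptions made for this example, not necessarily the project's setup.

```r
# Minimal sketch: one model per family, compared by test-set RMSE.
# Assumes the Boston housing data (MASS) purely for illustration.
library(MASS)          # Boston dataset
library(rpart)         # CART regression tree
library(randomForest)  # bagging / random forest
library(gbm)           # gradient boosting

set.seed(123)
train_idx <- sample(nrow(Boston), floor(0.8 * nrow(Boston)))
train <- Boston[train_idx, ]
test  <- Boston[-train_idx, ]

# Single CART regression tree
tree_fit <- rpart(medv ~ ., data = train, method = "anova")

# Random forest (bagged, decorrelated trees)
rf_fit <- randomForest(medv ~ ., data = train, ntree = 500)

# Gradient boosting machine
gbm_fit <- gbm(medv ~ ., data = train, distribution = "gaussian",
               n.trees = 1000, interaction.depth = 3, shrinkage = 0.01)

# Compare test-set RMSE of the three models
rmse <- function(pred) sqrt(mean((pred - test$medv)^2))
rmse(predict(tree_fit, test))
rmse(predict(rf_fit, test))
rmse(predict(gbm_fit, test, n.trees = 1000))
```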
- R
- RStudio
- Clone the repo
git clone https://github.com/ClaudioPoli/Regression-trees.git
- That's it!
To run the project, open the .Rproj file in RStudio and run the scripts.
Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Don't forget to give the project a star! Thanks again!
- Fork the Project
- Create your Feature Branch (git checkout -b feature/AmazingFeature)
- Commit your Changes (git commit -m 'Add some AmazingFeature')
- Push to the Branch (git push origin feature/AmazingFeature)
- Open a Pull Request
Distributed under the MIT License. See LICENSE.txt for more information.