Releases: Techtonique/genbooster
- v0.6.7
- 0.5.0: Stable release for Conda-forge
- v0.3.0: Initial release
Genbooster
A fast gradient boosting and bagging implementation (RandomBagClassifier, similar to RandomForestClassifier) using Rust and Python. Any base learner can be employed. The base learners' input features are engineered with a randomized artificial neural network layer.
For more details, see also https://www.researchgate.net/publication/386212136_Scalable_Gradient_Boosting_using_Randomized_Neural_Networks.
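To illustrate the feature-engineering step described above, here is a minimal sketch of a randomized (untrained) hidden layer: input features are passed through a random weight matrix and a nonlinearity, and the transformed features are concatenated with the originals. This is an illustration of the general idea only; the function name, activation, and choice to keep the original columns are assumptions, not genbooster's internal implementation.

```python
import numpy as np

def random_layer(X, n_hidden=5, seed=42):
    """Hypothetical sketch: engineer features with one randomized hidden
    layer, h = g(X @ W), where W is drawn at random (never trained) and
    g is a nonlinearity (ReLU here). Original features are kept."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    return np.hstack([X, np.maximum(X @ W, 0.0)])

X = np.arange(12.0).reshape(4, 3)   # 4 samples, 3 features
print(random_layer(X).shape)        # (4, 3 + 5) = (4, 8)
```

A base learner is then fit on these enriched features, which lets even a linear model capture some nonlinear structure.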
1 - Installation
pip install genbooster

2 - Usage
2.1 - Boosting
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import ExtraTreeRegressor
from sklearn.model_selection import train_test_split
from genbooster.genboosterclassifier import BoosterClassifier
from genbooster.randombagclassifier import RandomBagClassifier
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
clf = BoosterClassifier(base_estimator=ExtraTreeRegressor())
clf.fit(X_train, y_train)
preds = clf.predict(X_test)
print(np.mean(preds == y_test))

2.2 - Bagging (RandomBagClassifier, similar to RandomForestClassifier)
clf = RandomBagClassifier(base_estimator=ExtraTreeRegressor())
clf.fit(X_train, y_train)
preds = clf.predict(X_test)
print(np.mean(preds == y_test))
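For intuition about the boosting scheme behind a classifier like BoosterClassifier, here is a minimal, self-contained sketch of gradient boosting for classification with regressor base learners: labels are one-hot encoded and, at each round, one regressor per class is fit to the current residuals. This is a generic textbook sketch, not genbooster's implementation; in particular it omits the randomized-layer feature transform and any Rust-level details.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import ExtraTreeRegressor

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

n_classes = len(np.unique(y_train))
Y = np.eye(n_classes)[y_train]              # one-hot targets
learning_rate, n_estimators = 0.1, 50
F_train = np.zeros_like(Y)                  # ensemble prediction on train
F_test = np.zeros((X_test.shape[0], n_classes))

for _ in range(n_estimators):
    residuals = Y - F_train                 # what the ensemble still misses
    for k in range(n_classes):
        est = ExtraTreeRegressor(random_state=0)
        est.fit(X_train, residuals[:, k])   # one regressor per class, per round
        F_train[:, k] += learning_rate * est.predict(X_train)
        F_test[:, k] += learning_rate * est.predict(X_test)

preds = F_test.argmax(axis=1)               # predicted class = largest score
print(np.mean(preds == y_test))
```

The bagging variant differs mainly in how the ensemble is built: instead of fitting each learner to the previous residuals sequentially, RandomForest-style bagging fits independent learners on resampled data and averages them.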