
Releases: Techtonique/genbooster

v0.6.7

10 Feb 17:36


Stable release for Conda-forge

0.5.0

28 Jan 07:43


Stable release for Conda-forge

v0.3.0

24 Jan 11:41


Initial release

Genbooster

A fast gradient boosting and bagging implementation (RandomBagClassifier, similar to RandomForestClassifier) using Rust and Python. Any base learner can be employed. The base learners' input features are engineered using a randomized artificial neural network layer.

For more details, see also https://www.researchgate.net/publication/386212136_Scalable_Gradient_Boosting_using_Randomized_Neural_Networks.
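To illustrate the core idea, here is a minimal sketch of a randomized neural-network layer used for feature engineering: the inputs are projected through a fixed random hidden layer and passed through a nonlinearity. The function name, activation, and defaults below are illustrative assumptions, not genbooster's internals.

```python
import numpy as np

def randomized_layer(X, n_hidden=5, seed=42):
    """Hypothetical sketch of randomized-NN feature engineering:
    a fixed (untrained) random projection followed by a nonlinearity."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))
    # Nonlinear activation of the random projection; the engineered
    # features are concatenated with the original inputs.
    H = np.tanh(X @ W)
    return np.hstack([X, H])

X = np.arange(12, dtype=float).reshape(4, 3)
Z = randomized_layer(X)
print(Z.shape)  # (4, 8): 3 original + 5 engineered features
```

Because the projection is random and fixed, no backpropagation is needed, which is what keeps the feature engineering cheap.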


1 - Installation

pip install genbooster
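The release notes above mention a Conda-forge release; if the package is published under the same name there (an assumption), it can presumably also be installed with:

```shell
# Assumes the conda-forge package shares the PyPI name
conda install -c conda-forge genbooster
```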

2 - Usage

2.1 - Boosting

import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import ExtraTreeRegressor
from sklearn.model_selection import train_test_split
from genbooster.genboosterclassifier import BoosterClassifier
from genbooster.randombagclassifier import RandomBagClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
clf = BoosterClassifier(base_estimator=ExtraTreeRegressor())
clf.fit(X_train, y_train)
preds = clf.predict(X_test)
print(np.mean(preds == y_test))
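For intuition about what BoosterClassifier does under the hood, here is a self-contained sketch of gradient boosting on randomized features, using only NumPy and scikit-learn. It is illustrative of the technique described in the paper linked above, not genbooster's actual implementation; the learning rate, number of rounds, and hidden-layer width are arbitrary choices.

```python
import numpy as np
from sklearn.tree import ExtraTreeRegressor

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

learning_rate, n_rounds = 0.1, 50
pred = np.zeros_like(y)
learners = []
for i in range(n_rounds):
    # Each round draws a fixed random hidden layer to engineer features...
    W = rng.standard_normal((X.shape[1], 8))
    H = np.tanh(X @ W)
    # ...then fits the base regressor to the current residuals
    tree = ExtraTreeRegressor(random_state=i).fit(H, y - pred)
    pred += learning_rate * tree.predict(H)
    learners.append((W, tree))  # keep the projection for prediction time

mse = np.mean((y - pred) ** 2)
print(mse)
```

Note that each stored learner carries its random projection W, since new data must be passed through the same fixed layer before calling the tree's predict.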

2.2 - Bagging (RandomBagClassifier, similar to RandomForestClassifier)

clf = RandomBagClassifier(base_estimator=ExtraTreeRegressor())
clf.fit(X_train, y_train)
preds = clf.predict(X_test)
print(np.mean(preds == y_test))