Releases: Techtonique/nnetsauce
Make JAX optional -- fix a few bugs
v0.51.2: reformat and bump version
Make JAX optional
The aim of this release is to make loading JAX optional. To have JAX loaded, install with:

```bash
pip install nnetsauce[jax]
```
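Making a heavy dependency like JAX optional typically means importing it lazily and falling back gracefully when it is absent. Below is a minimal sketch of such a pattern; it is illustrative only and is not nnetsauce's actual code (all names in it are hypothetical):

```python
# Sketch of an optional-dependency pattern (hypothetical; not nnetsauce's code).
# The accelerated backend (JAX) is used only if installed; otherwise we fall
# back to NumPy, whose array API is largely compatible.
try:
    import jax.numpy as xp  # optional accelerated backend
    HAS_JAX = True
except ImportError:
    import numpy as xp  # NumPy fallback
    HAS_JAX = False

def squared_norm(v):
    """Backend-agnostic squared Euclidean norm."""
    v = xp.asarray(v)
    return float(xp.dot(v, v))

print(squared_norm([3.0, 4.0]))  # 25.0 with either backend
```

With this pattern, `pip install nnetsauce` stays lightweight, while `pip install nnetsauce[jax]` enables the accelerated path.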
v0.49.1
v0.44.1
Update after hiatus
v0.39.0
- Add quantile regression to class MTS (2 ways); see examples/mts_timeseries_quantile.py
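For reference, quantile regression minimizes the pinball (quantile) loss instead of the squared error. The following is a small conceptual NumPy sketch of that loss, not the MTS implementation; the function name `pinball_loss` is just for illustration:

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss: under-prediction is weighted by q,
    over-prediction by (1 - q)."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    diff = y_true - y_pred
    return float(np.mean(np.maximum(q * diff, (q - 1.0) * diff)))

# The constant minimizing the expected pinball loss is the q-quantile:
rng = np.random.default_rng(42)
y = rng.normal(size=10_000)
grid = np.linspace(-3, 3, 601)
losses = [pinball_loss(y, np.full_like(y, c), 0.9) for c in grid]
best = grid[int(np.argmin(losses))]
print(best)  # close to the empirical 90th percentile of the sample
```

This is why fitting under the pinball loss at several values of q yields conditional quantiles, and hence prediction intervals, for a time series.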
v0.37.0
New: CustomBackPropRegressor (see https://docs.techtonique.net/nnetsauce/nnetsauce.html#CustomBackPropRegressor) and ElasticNet2Regressor (see https://docs.techtonique.net/nnetsauce/nnetsauce.html#ElasticNet2Regressor)
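The `type_grad="finitediff"` option used below suggests gradients approximated numerically rather than via autodiff. Here is a generic central-difference sketch of that idea; it is a conceptual illustration and the library's internals may differ:

```python
import numpy as np

def finite_diff_grad(f, x, h=1e-6):
    """Central finite-difference approximation of the gradient of f at x:
    grad_i ≈ (f(x + h*e_i) - f(x - h*e_i)) / (2h)."""
    x = np.asarray(x, dtype=float)
    grad = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        grad[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return grad

# Example: gradient of f(x) = x0^2 + 3*x1 at (2, 5) is (4, 3)
g = finite_diff_grad(lambda x: x[0] ** 2 + 3.0 * x[1], np.array([2.0, 5.0]))
print(g)  # approximately [4., 3.]
```

Finite differences need no analytic gradient from the base model, which is what lets arbitrary scikit-learn estimators plug into a backpropagation-style loop.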
CustomBackPropRegressor
```python
import nnetsauce as ns
import numpy as np
from sklearn.metrics import root_mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.datasets import load_diabetes, fetch_california_housing
from sklearn.linear_model import Ridge
from time import time

load_datasets = [load_diabetes(), fetch_california_housing()]
datasets_names = ["diabetes", "housing"]

for i, data in enumerate(load_datasets):
    X = data.data
    y = data.target
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=13
    )

    # type_grad="finitediff": gradients via finite differences
    regr = ns.CustomBackPropRegressor(base_model=Ridge(),
                                      type_grad="finitediff")
    start = time()
    regr.fit(X_train, y_train)
    preds = regr.predict(X_test)
    print("Elapsed: ", time() - start)
    print(f"RMSE for {datasets_names[i]}: {root_mean_squared_error(y_test, preds)}")

    # type_loss="quantile": quantile loss
    regr = ns.CustomBackPropRegressor(base_model=Ridge(),
                                      type_grad="finitediff",
                                      type_loss="quantile")
    start = time()
    regr.fit(X_train, y_train)
    preds = regr.predict(X_test)
    print("Elapsed: ", time() - start)
    print(f"RMSE for {datasets_names[i]}: {root_mean_squared_error(y_test, preds)}")
```

ElasticNet2Regressor
```python
import nnetsauce as ns
import numpy as np
from sklearn.metrics import root_mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.datasets import load_diabetes, fetch_california_housing
from time import time

load_datasets = [load_diabetes(), fetch_california_housing()]
datasets_names = ["diabetes", "housing"]

for i, data in enumerate(load_datasets):
    X = data.data
    y = data.target
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=13
    )

    # solver="lbfgs"
    regr = ns.ElasticNet2Regressor(solver="lbfgs")
    start = time()
    regr.fit(X_train, y_train)
    preds = regr.predict(X_test)
    print("Elapsed: ", time() - start)
    print(f"RMSE for {datasets_names[i]}: {root_mean_squared_error(y_test, preds)}")

    # solver="cd"
    regr = ns.ElasticNet2Regressor(solver="cd")
    start = time()
    regr.fit(X_train, y_train)
    preds = regr.predict(X_test)
    print("Elapsed: ", time() - start)
    print(f"RMSE for {datasets_names[i]}: {root_mean_squared_error(y_test, preds)}")
```

v0.36.0
Fix _estimator_type = "classifier"
v0.35.3
Fix PredictionInterval -- fit first
v0.35.2
- Machine learning with ARCH effects for time series forecasting
- AIC, AICc, and BIC for classes CustomRegressor and PredictionInterval
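For context, "ARCH effects" means the conditional variance of a series depends on past squared shocks: the series itself looks uncorrelated, but its squares do not. A minimal ARCH(1) simulation sketch under illustrative parameter assumptions (this is not the release's implementation):

```python
import numpy as np

def simulate_arch1(n, omega=0.2, alpha=0.3, seed=0):
    """Simulate an ARCH(1) process: sigma_t^2 = omega + alpha * eps_{t-1}^2,
    eps_t = sigma_t * z_t with z_t standard normal (parameters illustrative)."""
    rng = np.random.default_rng(seed)
    eps = np.zeros(n)
    sigma2 = np.full(n, omega / (1.0 - alpha))  # unconditional variance start
    for t in range(1, n):
        sigma2[t] = omega + alpha * eps[t - 1] ** 2
        eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()
    return eps

x = simulate_arch1(20_000)
# Hallmark of ARCH: lag-1 autocorrelation of the series is near zero,
# while the squared series shows clear positive autocorrelation.
acf1_x = np.corrcoef(x[:-1], x[1:])[0, 1]
acf1_x2 = np.corrcoef(x[:-1] ** 2, x[1:] ** 2)[0, 1]
print(acf1_x, acf1_x2)
```

Capturing this variance clustering is what allows a forecaster to produce prediction intervals that widen during volatile periods.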