
Commit b3daea2

update CI workflows: streamline testing process and enhance build efficiency
1 parent 4126981 commit b3daea2

File tree

1 file changed: +135, -132 lines changed

README.md

Lines changed: 135 additions & 132 deletions
@@ -6,144 +6,147 @@
[![Downloads](https://pepy.tech/badge/lazypredict)](https://pepy.tech/project/lazypredict)
[![CodeFactor](https://www.codefactor.io/repository/github/shankarpandala/lazypredict/badge)](https://www.codefactor.io/repository/github/shankarpandala/lazypredict)

Lazy Predict helps build a lot of basic models without much code and helps understand which models work better without any parameter tuning.

- Free software: MIT license
- Documentation: <https://lazypredict.readthedocs.io>

## Installation

To install Lazy Predict:

```bash
pip install lazypredict
```

## Usage

To use Lazy Predict in a project:

```python
import lazypredict
```

## Classification

Example:

```python
from lazypredict.Supervised import LazyClassifier
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X = data.data
y = data.target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=123)

clf = LazyClassifier(verbose=0, ignore_warnings=True, custom_metric=None)
models, predictions = clf.fit(X_train, X_test, y_train, y_test)

print(models)
```
| Model                          |   Accuracy |   Balanced Accuracy |   ROC AUC |   F1 Score |   Time Taken |
|:-------------------------------|-----------:|--------------------:|----------:|-----------:|-------------:|
| LinearSVC | 0.989474 | 0.987544 | 0.987544 | 0.989462 | 0.0150008 |
| SGDClassifier | 0.989474 | 0.987544 | 0.987544 | 0.989462 | 0.0109992 |
| MLPClassifier | 0.985965 | 0.986904 | 0.986904 | 0.985994 | 0.426 |
| Perceptron | 0.985965 | 0.984797 | 0.984797 | 0.985965 | 0.0120046 |
| LogisticRegression | 0.985965 | 0.98269 | 0.98269 | 0.985934 | 0.0200036 |
| LogisticRegressionCV | 0.985965 | 0.98269 | 0.98269 | 0.985934 | 0.262997 |
| SVC | 0.982456 | 0.979942 | 0.979942 | 0.982437 | 0.0140011 |
| CalibratedClassifierCV | 0.982456 | 0.975728 | 0.975728 | 0.982357 | 0.0350015 |
| PassiveAggressiveClassifier | 0.975439 | 0.974448 | 0.974448 | 0.975464 | 0.0130005 |
| LabelPropagation | 0.975439 | 0.974448 | 0.974448 | 0.975464 | 0.0429988 |
| LabelSpreading | 0.975439 | 0.974448 | 0.974448 | 0.975464 | 0.0310006 |
| RandomForestClassifier | 0.97193 | 0.969594 | 0.969594 | 0.97193 | 0.033 |
| GradientBoostingClassifier | 0.97193 | 0.967486 | 0.967486 | 0.971869 | 0.166998 |
| QuadraticDiscriminantAnalysis | 0.964912 | 0.966206 | 0.966206 | 0.965052 | 0.0119994 |
| HistGradientBoostingClassifier | 0.968421 | 0.964739 | 0.964739 | 0.968387 | 0.682003 |
| RidgeClassifierCV | 0.97193 | 0.963272 | 0.963272 | 0.971736 | 0.0130029 |
| RidgeClassifier | 0.968421 | 0.960525 | 0.960525 | 0.968242 | 0.0119977 |
| AdaBoostClassifier | 0.961404 | 0.959245 | 0.959245 | 0.961444 | 0.204998 |
| ExtraTreesClassifier | 0.961404 | 0.957138 | 0.957138 | 0.961362 | 0.0270066 |
| KNeighborsClassifier | 0.961404 | 0.95503 | 0.95503 | 0.961276 | 0.0560005 |
| BaggingClassifier | 0.947368 | 0.954577 | 0.954577 | 0.947882 | 0.0559971 |
| BernoulliNB | 0.950877 | 0.951003 | 0.951003 | 0.951072 | 0.0169988 |
| LinearDiscriminantAnalysis | 0.961404 | 0.950816 | 0.950816 | 0.961089 | 0.0199995 |
| GaussianNB | 0.954386 | 0.949536 | 0.949536 | 0.954337 | 0.0139935 |
| NuSVC | 0.954386 | 0.943215 | 0.943215 | 0.954014 | 0.019989 |
| DecisionTreeClassifier | 0.936842 | 0.933693 | 0.933693 | 0.936971 | 0.0170023 |
| NearestCentroid | 0.947368 | 0.933506 | 0.933506 | 0.946801 | 0.0160074 |
| ExtraTreeClassifier | 0.922807 | 0.912168 | 0.912168 | 0.922462 | 0.0109999 |
| CheckingClassifier | 0.361404 | 0.5 | 0.5 | 0.191879 | 0.0170043 |
| DummyClassifier | 0.512281 | 0.489598 | 0.489598 | 0.518924 | 0.0119965 |
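The leaderboard above is produced by fitting many scikit-learn classifiers in a loop and scoring each one on the held-out split. A minimal sketch of that idea using scikit-learn alone (the three-model list and the scoring choices here are illustrative assumptions, not lazypredict's actual internals):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, balanced_accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.5, random_state=123)

# A small, illustrative subset of the estimators lazypredict tries.
candidates = [LogisticRegression(max_iter=5000),
              DecisionTreeClassifier(random_state=0),
              GaussianNB()]

results = []
for model in candidates:
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    results.append((type(model).__name__,
                    accuracy_score(y_test, pred),
                    balanced_accuracy_score(y_test, pred)))

# Sort best-first by accuracy -- the same leaderboard shape as `models` above.
for name, acc, bal_acc in sorted(results, key=lambda r: -r[1]):
    print(f"{name:<24} {acc:.3f} {bal_acc:.3f}")
```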
## Regression

Example:

```python
from lazypredict.Supervised import LazyRegressor
from sklearn import datasets
from sklearn.utils import shuffle
import numpy as np

boston = datasets.load_boston()
X, y = shuffle(boston.data, boston.target, random_state=13)
X = X.astype(np.float32)

offset = int(X.shape[0] * 0.9)

X_train, y_train = X[:offset], y[:offset]
X_test, y_test = X[offset:], y[offset:]

reg = LazyRegressor(verbose=0, ignore_warnings=False, custom_metric=None)
models, predictions = reg.fit(X_train, X_test, y_train, y_test)

print(models)
```
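Note that `load_boston` was deprecated in scikit-learn 1.0 and removed in 1.2, so the snippet above fails on current scikit-learn. A self-contained variant of the same train/test preparation, using the bundled `load_diabetes` dataset and a single stand-in regressor (both substitutions are mine, not part of this commit):

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score
from sklearn.utils import shuffle

# load_diabetes ships with scikit-learn, unlike the removed Boston dataset.
X, y = load_diabetes(return_X_y=True)
X, y = shuffle(X, y, random_state=13)
X = X.astype(np.float32)

offset = int(X.shape[0] * 0.9)  # 90/10 train/test split, as in the README

X_train, y_train = X[:offset], y[:offset]
X_test, y_test = X[offset:], y[offset:]

reg = Ridge().fit(X_train, y_train)
print("R-Squared:", r2_score(y_test, reg.predict(X_test)))
```

With lazypredict installed, only the data-loading lines change; the `LazyRegressor` call stays the same.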
| Model                         |   Adjusted R-Squared |   R-Squared |   RMSE |   Time Taken |
|:------------------------------|---------------------:|------------:|-------:|-------------:|
| SVR | 0.83 | 0.88 | 2.62 | 0.01 |
| BaggingRegressor | 0.83 | 0.88 | 2.63 | 0.03 |
| NuSVR | 0.82 | 0.86 | 2.76 | 0.03 |
| RandomForestRegressor | 0.81 | 0.86 | 2.78 | 0.21 |
| XGBRegressor | 0.81 | 0.86 | 2.79 | 0.06 |
| GradientBoostingRegressor | 0.81 | 0.86 | 2.84 | 0.11 |
| ExtraTreesRegressor | 0.79 | 0.84 | 2.98 | 0.12 |
| AdaBoostRegressor | 0.78 | 0.83 | 3.04 | 0.07 |
| HistGradientBoostingRegressor | 0.77 | 0.83 | 3.06 | 0.17 |
| PoissonRegressor | 0.77 | 0.83 | 3.11 | 0.01 |
| LGBMRegressor | 0.77 | 0.83 | 3.11 | 0.07 |
| KNeighborsRegressor | 0.77 | 0.83 | 3.12 | 0.01 |
| DecisionTreeRegressor | 0.65 | 0.74 | 3.79 | 0.01 |
| MLPRegressor | 0.65 | 0.74 | 3.80 | 1.63 |
| HuberRegressor | 0.64 | 0.74 | 3.84 | 0.01 |
| GammaRegressor | 0.64 | 0.73 | 3.88 | 0.01 |
| LinearSVR | 0.62 | 0.72 | 3.96 | 0.01 |
| RidgeCV | 0.62 | 0.72 | 3.97 | 0.01 |
| BayesianRidge | 0.62 | 0.72 | 3.97 | 0.01 |
| Ridge | 0.62 | 0.72 | 3.97 | 0.01 |
| TransformedTargetRegressor | 0.62 | 0.72 | 3.97 | 0.01 |
| LinearRegression | 0.62 | 0.72 | 3.97 | 0.01 |
| ElasticNetCV | 0.62 | 0.72 | 3.98 | 0.04 |
| LassoCV | 0.62 | 0.72 | 3.98 | 0.06 |
| LassoLarsIC | 0.62 | 0.72 | 3.98 | 0.01 |
| LassoLarsCV | 0.62 | 0.72 | 3.98 | 0.02 |
| Lars | 0.61 | 0.72 | 3.99 | 0.01 |
| LarsCV | 0.61 | 0.71 | 4.02 | 0.04 |
| SGDRegressor | 0.60 | 0.70 | 4.07 | 0.01 |
| TweedieRegressor | 0.59 | 0.70 | 4.12 | 0.01 |
| GeneralizedLinearRegressor | 0.59 | 0.70 | 4.12 | 0.01 |
| ElasticNet | 0.58 | 0.69 | 4.16 | 0.01 |
| Lasso | 0.54 | 0.66 | 4.35 | 0.02 |
| RANSACRegressor | 0.53 | 0.65 | 4.41 | 0.04 |
| OrthogonalMatchingPursuitCV | 0.45 | 0.59 | 4.78 | 0.02 |
| PassiveAggressiveRegressor | 0.37 | 0.54 | 5.09 | 0.01 |
| GaussianProcessRegressor | 0.23 | 0.43 | 5.65 | 0.03 |
| OrthogonalMatchingPursuit | 0.16 | 0.38 | 5.89 | 0.01 |
| ExtraTreeRegressor | 0.08 | 0.32 | 6.17 | 0.01 |
| DummyRegressor | -0.38 | -0.02 | 7.56 | 0.01 |
| LassoLars | -0.38 | -0.02 | 7.56 | 0.01 |
| KernelRidge | -11.50 | -8.25 | 22.74 | 0.01 |
