
Commit e1e2729

Merge pull request #326 from shankarpandala/dev

Version 0.2.8 released

2 parents d59b069 + 4b65cd9, commit e1e2729

20 files changed: +720, −396 lines changed
Lines changed: 54 additions & 0 deletions
@@ -0,0 +1,54 @@
+name: "CodeQL"
+
+on:
+  push:
+    branches: [dev, ]
+  pull_request:
+    # The branches below must be a subset of the branches above
+    branches: [dev]
+  schedule:
+    - cron: '0 3 * * 1'
+
+jobs:
+  analyze:
+    name: Analyze
+    runs-on: ubuntu-latest
+
+    steps:
+    - name: Checkout repository
+      uses: actions/checkout@v2
+      with:
+        # We must fetch at least the immediate parents so that if this is
+        # a pull request then we can checkout the head.
+        fetch-depth: 2
+
+    # If this run was triggered by a pull request event, then checkout
+    # the head of the pull request instead of the merge commit.
+    - run: git checkout HEAD^2
+      if: ${{ github.event_name == 'pull_request' }}
+
+    # Initializes the CodeQL tools for scanning.
+    - name: Initialize CodeQL
+      uses: github/codeql-action/init@v1
+      # Override language selection by uncommenting this and choosing your languages
+      # with:
+      #   languages: go, javascript, csharp, python, cpp, java
+
+    # Autobuild attempts to build any compiled languages (C/C++, C#, or Java).
+    # If this step fails, then you should remove it and run the build manually (see below)
+    - name: Autobuild
+      uses: github/codeql-action/autobuild@v1
+
+    # ℹ️ Command-line programs to run using the OS shell.
+    # 📚 https://git.io/JvXDl
+
+    # ✏️ If the Autobuild fails above, remove it and uncomment the following three lines
+    #    and modify them (or add more) to build your code if your project
+    #    uses a compiled language
+
+    #- run: |
+    #   make bootstrap
+    #   make release
+
+    - name: Perform CodeQL Analysis
+      uses: github/codeql-action/analyze@v1

.travis.yml

Lines changed: 2 additions & 2 deletions
@@ -2,12 +2,12 @@ os:
 - linux
 - osx
 - windows
-
+
 jobs:
   allow_failures:
   - os: osx
   - os: windows
-
+
 language: python
 python:
 - 3.8

.vscode/settings.json

Lines changed: 0 additions & 3 deletions
This file was deleted.

AUTHORS.rst

Lines changed: 1 addition & 1 deletion
@@ -10,4 +10,4 @@ Development Lead
 Contributors
 ------------

-None yet. Why not be the first?
+* Breno Batista da Silva <[email protected]>

CONTRIBUTING.rst

Lines changed: 1 addition & 0 deletions
@@ -69,6 +69,7 @@ Ready to contribute? Here's how to set up `lazypredict` for local development.
     $ mkvirtualenv lazypredict
     $ cd lazypredict/
     $ python setup.py develop
+    $ pip install -r requirements_dev.txt

 4. Create a branch for local development::


HISTORY.rst

Lines changed: 12 additions & 0 deletions
@@ -2,6 +2,18 @@
 History
 =======

+0.2.8 (2021-02-06)
+------------------
+
+* Removed StackingRegressor and CheckingClassifier.
+* Added provided_models method.
+* Added adjusted r-squared metric.
+* Added cardinality check to split categorical columns into low and high cardinality features.
+* Added different transformation pipeline for low and high cardinality features.
+* Included all number dtypes as inputs.
+* Fixed dependencies.
+* Improved documentation.
+
 0.2.7 (2020-07-09)
 ------------------

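Two of the 0.2.8 entries above describe cardinality handling: categorical columns are split into low- and high-cardinality groups, and each group is sent through its own transformation pipeline. The sketch below is a minimal illustration of that idea using scikit-learn; the threshold of 11 unique values, the encoder choices, and the helper name build_preprocessor are assumptions made here for demonstration, not code taken from this commit. ::

    # Illustrative sketch only: the threshold and encoders are assumptions,
    # not lazypredict's actual implementation. Requires scikit-learn >= 0.24
    # for OrdinalEncoder(handle_unknown="use_encoded_value").
    import pandas as pd
    from sklearn.compose import ColumnTransformer
    from sklearn.impute import SimpleImputer
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import OneHotEncoder, OrdinalEncoder, StandardScaler

    def build_preprocessor(X: pd.DataFrame, cardinality_threshold: int = 11) -> ColumnTransformer:
        numeric_cols = X.select_dtypes(include="number").columns.tolist()
        categorical_cols = X.select_dtypes(include=["object", "category"]).columns
        # Split categorical columns by how many distinct values they contain.
        low_card = [c for c in categorical_cols if X[c].nunique() < cardinality_threshold]
        high_card = [c for c in categorical_cols if X[c].nunique() >= cardinality_threshold]

        numeric_pipe = Pipeline([
            ("impute", SimpleImputer(strategy="mean")),
            ("scale", StandardScaler()),
        ])
        # Low-cardinality columns are cheap to one-hot encode.
        low_card_pipe = Pipeline([
            ("impute", SimpleImputer(strategy="most_frequent")),
            ("onehot", OneHotEncoder(handle_unknown="ignore")),
        ])
        # High-cardinality columns get a compact ordinal encoding instead,
        # avoiding an explosion of one-hot columns.
        high_card_pipe = Pipeline([
            ("impute", SimpleImputer(strategy="most_frequent")),
            ("ordinal", OrdinalEncoder(handle_unknown="use_encoded_value", unknown_value=-1)),
        ])

        return ColumnTransformer([
            ("numeric", numeric_pipe, numeric_cols),
            ("cat_low", low_card_pipe, low_card),
            ("cat_high", high_card_pipe, high_card),
        ])

The released code may pick the split point and encoders differently; this only shows where such a preprocessor would sit, i.e. it is fitted on the training frame (build_preprocessor(X_train).fit_transform(X_train)) before the individual estimators are trained.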

LICENSE

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 MIT License

-Copyright (c) 2019, Shankar Rao Pandala
+Copyright (c) 2020, Shankar Rao Pandala

 Permission is hereby granted, free of charge, to any person obtaining a copy
 of this software and associated documentation files (the "Software"), to deal

MANIFEST.in

Lines changed: 2 additions & 0 deletions
@@ -3,6 +3,8 @@ include CONTRIBUTING.rst
 include HISTORY.rst
 include LICENSE
 include README.rst
+include requirements.txt
+include requirements_dev.txt

 recursive-include tests *
 recursive-exclude * __pycache__

README.rst

Lines changed: 71 additions & 44 deletions
@@ -17,13 +17,24 @@ Lazy Predict
    :target: https://pepy.tech/project/lazypredict
    :alt: Downloads

+.. image:: https://www.codefactor.io/repository/github/shankarpandala/lazypredict/badge
+   :target: https://www.codefactor.io/repository/github/shankarpandala/lazypredict
+   :alt: CodeFactor

-Lazy Predict help build a lot of basic models without much code and helps understand which models works better without any parameter tuning
+Lazy Predict helps build a lot of basic models without much code and helps understand which models works better without any parameter tuning.


 * Free software: MIT license
 * Documentation: https://lazypredict.readthedocs.io.

+============
+Installation
+============
+
+To install Lazy Predict::
+
+    pip install lazypredict
+
 =====
 Usage
 =====
@@ -41,13 +52,17 @@ Example ::
     from lazypredict.Supervised import LazyClassifier
     from sklearn.datasets import load_breast_cancer
     from sklearn.model_selection import train_test_split
+
     data = load_breast_cancer()
     X = data.data
     y= data.target
+
     X_train, X_test, y_train, y_test = train_test_split(X, y,test_size=.5,random_state =123)
+
     clf = LazyClassifier(verbose=0,ignore_warnings=True, custom_metric=None)
     models,predictions = clf.fit(X_train, X_test, y_train, y_test)
-    models
+
+    print(models)


 | Model | Accuracy | Balanced Accuracy | ROC AUC | F1 Score | Time Taken |
@@ -93,54 +108,66 @@ Example ::
     from sklearn import datasets
     from sklearn.utils import shuffle
     import numpy as np
+
     boston = datasets.load_boston()
     X, y = shuffle(boston.data, boston.target, random_state=13)
     X = X.astype(np.float32)
+
     offset = int(X.shape[0] * 0.9)
+
     X_train, y_train = X[:offset], y[:offset]
     X_test, y_test = X[offset:], y[offset:]
-    reg = LazyRegressor(verbose=0,ignore_warnings=False, custom_metric=None )
-    models,predictions = reg.fit(X_train, X_test, y_train, y_test)
-
-
-| Model | R-Squared | RMSE | Time Taken |
-|:------------------------------|------------:|---------:|-------------:|
-| SVR | 0.877199 | 2.62054 | 0.0330021 |
-| RandomForestRegressor | 0.874429 | 2.64993 | 0.0659981 |
-| ExtraTreesRegressor | 0.867566 | 2.72138 | 0.0570002 |
-| AdaBoostRegressor | 0.865851 | 2.73895 | 0.144999 |
-| NuSVR | 0.863712 | 2.7607 | 0.0340044 |
-| GradientBoostingRegressor | 0.858693 | 2.81107 | 0.13 |
-| KNeighborsRegressor | 0.826307 | 3.1166 | 0.0179954 |
-| HistGradientBoostingRegressor | 0.810479 | 3.25551 | 0.820995 |
-| BaggingRegressor | 0.800056 | 3.34383 | 0.0579946 |
-| MLPRegressor | 0.750536 | 3.73503 | 0.725997 |
-| HuberRegressor | 0.736973 | 3.83522 | 0.0370018 |
-| LinearSVR | 0.71914 | 3.9631 | 0.0179989 |
-| RidgeCV | 0.718402 | 3.9683 | 0.018003 |
-| BayesianRidge | 0.718102 | 3.97041 | 0.0159984 |
-| Ridge | 0.71765 | 3.9736 | 0.0149941 |
-| LinearRegression | 0.71753 | 3.97444 | 0.0190051 |
-| TransformedTargetRegressor | 0.71753 | 3.97444 | 0.012001 |
-| LassoCV | 0.717337 | 3.9758 | 0.0960066 |
-| ElasticNetCV | 0.717104 | 3.97744 | 0.0860076 |
-| LassoLarsCV | 0.717045 | 3.97786 | 0.0490005 |
-| LassoLarsIC | 0.716636 | 3.98073 | 0.0210001 |
-| LarsCV | 0.715031 | 3.99199 | 0.0450008 |
-| Lars | 0.715031 | 3.99199 | 0.0269964 |
-| SGDRegressor | 0.714362 | 3.99667 | 0.0210009 |
-| RANSACRegressor | 0.707849 | 4.04198 | 0.111998 |
-| ElasticNet | 0.690408 | 4.16088 | 0.0190012 |
-| Lasso | 0.662141 | 4.34668 | 0.0180018 |
-| OrthogonalMatchingPursuitCV | 0.591632 | 4.77877 | 0.0180008 |
-| ExtraTreeRegressor | 0.583314 | 4.82719 | 0.0129974 |
-| PassiveAggressiveRegressor | 0.556668 | 4.97914 | 0.0150032 |
-| GaussianProcessRegressor | 0.428298 | 5.65425 | 0.0580051 |
-| OrthogonalMatchingPursuit | 0.379295 | 5.89159 | 0.0180039 |
-| DecisionTreeRegressor | 0.318767 | 6.17217 | 0.0230272 |
-| DummyRegressor | -0.0215752 | 7.55832 | 0.0140116 |
-| LassoLars | -0.0215752 | 7.55832 | 0.0180008 |
-| KernelRidge | -8.24669 | 22.7396 | 0.0309792 |
+
+    reg = LazyRegressor(verbose=0, ignore_warnings=False, custom_metric=None)
+    models, predictions = reg.fit(X_train, X_test, y_train, y_test)
+
+    print(models)
+
+
+| Model | Adjusted R-Squared | R-Squared | RMSE | Time Taken |
+|:------------------------------|-------------------:|----------:|------:|-----------:|
+| SVR | 0.83 | 0.88 | 2.62 | 0.01 |
+| BaggingRegressor | 0.83 | 0.88 | 2.63 | 0.03 |
+| NuSVR | 0.82 | 0.86 | 2.76 | 0.03 |
+| RandomForestRegressor | 0.81 | 0.86 | 2.78 | 0.21 |
+| XGBRegressor | 0.81 | 0.86 | 2.79 | 0.06 |
+| GradientBoostingRegressor | 0.81 | 0.86 | 2.84 | 0.11 |
+| ExtraTreesRegressor | 0.79 | 0.84 | 2.98 | 0.12 |
+| AdaBoostRegressor | 0.78 | 0.83 | 3.04 | 0.07 |
+| HistGradientBoostingRegressor | 0.77 | 0.83 | 3.06 | 0.17 |
+| PoissonRegressor | 0.77 | 0.83 | 3.11 | 0.01 |
+| LGBMRegressor | 0.77 | 0.83 | 3.11 | 0.07 |
+| KNeighborsRegressor | 0.77 | 0.83 | 3.12 | 0.01 |
+| DecisionTreeRegressor | 0.65 | 0.74 | 3.79 | 0.01 |
+| MLPRegressor | 0.65 | 0.74 | 3.80 | 1.63 |
+| HuberRegressor | 0.64 | 0.74 | 3.84 | 0.01 |
+| GammaRegressor | 0.64 | 0.73 | 3.88 | 0.01 |
+| LinearSVR | 0.62 | 0.72 | 3.96 | 0.01 |
+| RidgeCV | 0.62 | 0.72 | 3.97 | 0.01 |
+| BayesianRidge | 0.62 | 0.72 | 3.97 | 0.01 |
+| Ridge | 0.62 | 0.72 | 3.97 | 0.01 |
+| TransformedTargetRegressor | 0.62 | 0.72 | 3.97 | 0.01 |
+| LinearRegression | 0.62 | 0.72 | 3.97 | 0.01 |
+| ElasticNetCV | 0.62 | 0.72 | 3.98 | 0.04 |
+| LassoCV | 0.62 | 0.72 | 3.98 | 0.06 |
+| LassoLarsIC | 0.62 | 0.72 | 3.98 | 0.01 |
+| LassoLarsCV | 0.62 | 0.72 | 3.98 | 0.02 |
+| Lars | 0.61 | 0.72 | 3.99 | 0.01 |
+| LarsCV | 0.61 | 0.71 | 4.02 | 0.04 |
+| SGDRegressor | 0.60 | 0.70 | 4.07 | 0.01 |
+| TweedieRegressor | 0.59 | 0.70 | 4.12 | 0.01 |
+| GeneralizedLinearRegressor | 0.59 | 0.70 | 4.12 | 0.01 |
+| ElasticNet | 0.58 | 0.69 | 4.16 | 0.01 |
+| Lasso | 0.54 | 0.66 | 4.35 | 0.02 |
+| RANSACRegressor | 0.53 | 0.65 | 4.41 | 0.04 |
+| OrthogonalMatchingPursuitCV | 0.45 | 0.59 | 4.78 | 0.02 |
+| PassiveAggressiveRegressor | 0.37 | 0.54 | 5.09 | 0.01 |
+| GaussianProcessRegressor | 0.23 | 0.43 | 5.65 | 0.03 |
+| OrthogonalMatchingPursuit | 0.16 | 0.38 | 5.89 | 0.01 |
+| ExtraTreeRegressor | 0.08 | 0.32 | 6.17 | 0.01 |
+| DummyRegressor | -0.38 | -0.02 | 7.56 | 0.01 |
+| LassoLars | -0.38 | -0.02 | 7.56 | 0.01 |
+| KernelRidge | -11.50 | -8.25 | 22.74 | 0.01 |


 .. warning::
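For reference, the new "Adjusted R-Squared" column in the table above (also listed in the 0.2.8 changelog) appears to be the standard adjustment of R² for the number of predictors; with n test samples and p features the conventional formula is::

    \bar{R}^{2} = 1 - (1 - R^{2}) \cdot \frac{n - 1}{n - p - 1}

This is consistent with the values shown: assuming the Boston example's 51 test rows and 13 features, KernelRidge's R² of -8.25 gives 1 - 9.25 * 50/37 = -11.5, matching the table.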

build-conda-package.sh

Lines changed: 42 additions & 0 deletions
@@ -0,0 +1,42 @@
+#!/bin/bash
+
+# change the package name to the existing PyPi package you would like to build and adjust the Python versions
+pkg='lazypredict'
+array=( 3.7 3.8 )
+
+echo "Building conda package ..."
+cd ~
+conda skeleton pypi $pkg
+cd $pkg
+wget https://raw.githubusercontent.com/AnacondaRecipes/conda-feedstock/master/recipe/build.sh
+wget https://raw.githubusercontent.com/AnacondaRecipes/conda-feedstock/master/recipe/bld.bat
+cd ~
+
+# building conda packages
+for i in "${array[@]}"
+do
+    conda-build --python $i $pkg
+done
+
+# convert package to other platforms
+cd ~
+platforms=( osx-64 linux-32 linux-64 win-32 win-64 )
+find $HOME/conda-bld/linux-64/ -name *.tar.bz2 | while read file
+do
+    echo $file
+    #conda convert --platform all $file -o $HOME/conda-bld/
+    for platform in "${platforms[@]}"
+    do
+        conda convert --platform $platform $file -o $HOME/conda-bld/
+    done
+
+done
+
+# upload packages to conda
+find $HOME/conda-bld/ -name *.tar.bz2 | while read file
+do
+    echo $file
+    anaconda upload $file
+done
+
+echo "Building conda package done!"
