
Commit 75fccc3

Merge pull request #13 from alteryx/quickstart-installation_update

Documentation Updates and Relevant Links added

2 parents bd7993b + 2b81b77, commit 75fccc3

4 files changed: +94 / -201 lines

README.md

Lines changed: 24 additions & 2 deletions
@@ -3,8 +3,30 @@
 CheckMates is an Alteryx Open Source library which catches and warns of problems with your data and problem setup before modeling.
 
 ## Installation
-
+```bash
+python -m pip install checkmates
+```
 ## Start
+#### Load and validate example data
+```python
+from checkmates import (
+    IDColumnsDataCheck
+)
+import pandas as pd
+
+id_data_check_name = IDColumnsDataCheck.name
+X_dict = {
+    "col_1": [1, 1, 2, 3],
+    "col_2": [2, 3, 4, 5],
+    "col_3_id": [0, 1, 2, 3],
+    "Id": [3, 1, 2, 0],
+    "col_5": [0, 0, 1, 2],
+    "col_6": [0.1, 0.2, 0.3, 0.4],
+}
+X = pd.DataFrame.from_dict(X_dict)
+id_cols_check = IDColumnsDataCheck(id_threshold=0.95)
+print(id_cols_check.validate(X))
+```
 
 ## Next Steps

@@ -14,7 +36,7 @@ Read more about CheckMates on our [documentation page](#):
 
 The CheckMates community is happy to provide support to users of CheckMates. Project support can be found in four places depending on the type of question:
 1. For usage questions, use [Stack Overflow](#) with the `CheckMates` tag.
-2. For bugs, issues, or feature requests start a [Github issue](#).
+2. For bugs, issues, or feature requests start a [Github issue](https://github.com/alteryx/CheckMates/issues).
 3. For discussion regarding development on the core library, use [Slack](#).
 4. For everything else, the core developers can be reached by email at [email protected]
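
For context on what the quickstart's `IDColumnsDataCheck` is looking for, here is a rough, hypothetical sketch of an ID-column heuristic (name hints plus near-unique values, with an `id_threshold`-style cutoff). It is not CheckMates' implementation, and the real check's return format may differ.

```python
# Hypothetical illustration only -- NOT CheckMates' implementation of IDColumnsDataCheck.
# It sketches the kind of heuristic an ID-column check can apply: flag columns whose
# name looks like an identifier or whose values are (nearly) all unique.
import pandas as pd


def likely_id_columns(df, id_threshold=0.95):
    """Return column names that look like row identifiers (illustrative heuristic)."""
    flagged = []
    for col in df.columns:
        name_hint = col.lower() == "id" or col.lower().endswith("_id")
        uniqueness = df[col].nunique() / len(df)  # fraction of distinct values
        if name_hint or uniqueness >= id_threshold:
            flagged.append(col)
    return flagged


X = pd.DataFrame({
    "col_1": [1, 1, 2, 3],
    "col_2": [2, 3, 4, 5],
    "col_3_id": [0, 1, 2, 3],
    "Id": [3, 1, 2, 0],
    "col_5": [0, 0, 1, 2],
    "col_6": [0.1, 0.2, 0.3, 0.4],
})
print(likely_id_columns(X))  # ['col_2', 'col_3_id', 'Id', 'col_6'] for this toy heuristic
```

Running the sketch on the quickstart frame flags `col_2`, `col_3_id`, `Id`, and `col_6`, since each is either named like an identifier or has all-unique values at the 0.95 cutoff.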

contributing.md

Lines changed: 9 additions & 96 deletions
@@ -1,8 +1,8 @@
 ## Contributing to the Codebase
 
-#### 0. Look at Open Issues
-We currently utilize GitHub Issues as our project management tool for datachecks. Please do the following:
-* Look at our [open issues](#)
+#### 0. Look at Open Issues
+We currently utilize GitHub Issues as our project management tool for checkmates. Please do the following:
+* Look at our [open issues](https://github.com/alteryx/CheckMates/issues)
 * Find an unclaimed issue by looking for an empty `Assignees` field.
 * If this is your first time contributing, issues labeled ``good first issue`` are a good place to start.
 * If your issue is labeled `needs design` or `spike` it is recommended you provide a design document for your feature
@@ -11,19 +11,18 @@ We currently utilize GitHub Issues as our project management tool for datachecks
 
 
 #### 1. Clone repo
-The code is hosted on GitHub, so you will need to use Git to clone the project and make changes to the codebase. Once you have obtained a copy of the code, you should create a development environment that is separate from your existing Python environment so that you can make and test changes without compromising your own work environment. Additionally, you must make sure that the version of Python you use is at least 3.8. Using `conda` you can use `conda create -n datachecks python=3.8` and `conda activate datachecks` before the following steps.
+The code is hosted on GitHub, so you will need to use Git to clone the project and make changes to the codebase. Once you have obtained a copy of the code, you should create a development environment that is separate from your existing Python environment so that you can make and test changes without compromising your own work environment. Additionally, you must make sure that the version of Python you use is at least 3.8.
 * clone with `git clone [https://github.com/alteryx/CheckMates.git]`
 * install in edit mode with:
 ```bash
 # move into the repo
-cd datachecks
+cd checkmates
 # installs the repo in edit mode, meaning changes to any files will be picked up in python. also installs all dependencies.
 make installdeps-dev
 ```
 
 <!--- Note that if you're on Mac, there are a few extra steps you'll want to keep track of.
-* In order to run on Mac, [LightGBM requires the OpenMP library to be installed](https://datachecks.alteryx.com/en/stable/install.html#Mac), which can be done with HomeBrew by running `brew install libomp`
-* We've seen some installs get the following warning when importing datachecks: "UserWarning: Could not import the lzma module. Your installed Python is incomplete. Attempting to use lzma compression will result in a RuntimeError". [A known workaround](https://stackoverflow.com/a/61531555/841003) is to run `brew reinstall readline xz` before installing the python version you're using via pyenv. If you've already installed a python version in pyenv, consider deleting it and reinstalling. v3.8.2 is known to work. --->
+* We've seen some installs get the following warning when importing checkmates: "UserWarning: Could not import the lzma module. Your installed Python is incomplete. Attempting to use lzma compression will result in a RuntimeError". [A known workaround](https://stackoverflow.com/a/61531555/841003) is to run `brew reinstall readline xz` before installing the python version you're using via pyenv. If you've already installed a python version in pyenv, consider deleting it and reinstalling. v3.9.7 is known to work. --->
 
 #### 2. Implement your Pull Request
 
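
Pulling the clone-and-install steps together (including the `conda` environment creation described by the sentence removed above), a typical local setup might look like the sketch below; the environment name and Python minor version are illustrative, and the clone directory takes the repository's name.

```bash
# Illustrative setup sequence, assuming conda is installed; any Python >= 3.8 works per the docs.
conda create -n checkmates python=3.9
conda activate checkmates

git clone https://github.com/alteryx/CheckMates.git
cd CheckMates   # the cloned directory is named after the repository

# install in edit mode and pull in all development dependencies
make installdeps-dev
```
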
@@ -78,12 +77,12 @@ Note that if you're building docs locally, the warning suppression code at `docs
 
 * We use GitHub Actions to run our PR checkin tests. On creation of the PR and for every change you make to your PR, you'll need a maintainer to click "Approve and run" on your PR. This is a change [GitHub made in April 2021](https://github.blog/2021-04-22-github-actions-update-helping-maintainers-combat-bad-actors/).
 
-* We ask that all contributors sign our contributor license agreement (CLA) the first time they contribute to datachecks. The CLA assistant will place a message on your PR; follow the instructions there to sign the CLA.
+* We ask that all contributors sign our contributor license agreement (CLA) the first time they contribute to checkmates. The CLA assistant will place a message on your PR; follow the instructions there to sign the CLA.
 
 Add a description of your PR to the subsection that most closely matches your contribution:
-* Enhancements: new features or additions to DataChecks.
+* Enhancements: new features or additions to CheckMates.
 * Fixes: things like bugfixes or adding more descriptive error messages.
-* Changes: modifications to an existing part of DataChecks.
+* Changes: modifications to an existing part of CheckMates.
 * Documentation Changes
 * Testing Changes
 
@@ -96,92 +95,6 @@ If your work includes a [breaking change](https://en.wiktionary.org/wiki/breakin
 * Description of your breaking change
 ```
 
-### 4. Updating our conda package
-
-We maintain a conda package [package](#) to give users more options of how to install datachecks.
-Conda packages are created from recipes, which are yaml config files that list a package's dependencies and tests. Here is
-datachecks's latest published [recipe](#).
-GitHub repositories containing conda recipes are called `feedstocks`.
-
-If you opened a PR to datachecks that modifies the packages in `dependencies` within `pyproject.toml`, or if the latest dependency bot
-updates the latest version of one of our packages, you will see a CI job called `build_conda_pkg`. This section describes
-what `build_conda_pkg` does and what to do if you see it fails in your pr.
-
-#### What is build_conda_pkg?
-`build_conda_pkg` clones the PR branch and builds the conda package from that branch. Since the conda build process runs our
-entire suite of unit tests, `build_conda_pkg` checks that our conda package actually supports the proposed change of the PR.
-We added this check to eliminate surprises. Since the conda package is released after we release to PyPi, it's possible that
-we released a dependency version that is not compatible with our conda recipe. It would be a pain to try to debug this at
-release-time since the PyPi release includes many possible PRs that could have introduced that change.
-
-#### How does `build_conda_pkg` work?
-`build_conda_pkg` will clone the `master` branch of the feedstock as well as you datachecks PR branch. It will
-then replace the recipe in the `master` branch of the feedstock with the current
-latest [recipe](#) in datachecks.
-It will also modify the [source](#)
-field of the local copy of the recipe and point it at the local datachecks clone of your PR branch.
-This has the effect of building our conda package against your PR branch!
-
-#### Why does `build_conda_pkg` use a recipe in datachecks as opposed to the recipe in the feedstock `master` branch?
-One important fact to know about conda is that any change to the `master` branch of a feedstock will
-result in a new version of the conda package being published to the world!
-
-With this in mind, let's say your PR requires modifying our dependencies.
-If we made a change to `master`, an updated version of datachecks's latest conda package would
-be released. This means people who installed the latest version of datachecks prior to this PR would get different dependency versions
-than those who installed datachecks after the PR got merged on GitHub. This is not desirable, especially because the PR would not get shipped
-to PyPi until the next release happens. So there would also be a discrepancy between the PyPi and conda versions.
-
-By using a recipe stored in the datachecks repo, we can keep track of the changes that need to be made for the next release without
-having to publish a new conda package. Since the recipe is also "unique" to your PR, you are free to make whatever changes you
-need to make without disturbing other PRs. This would not be the case if `build_conda_pkg` ran from the `master` branch of the
-feedstock.
-
-#### What to do if you see `build_conda_pkg` is red on your PR?
-It depends on the kind of PR:
-
-**Case 1: You're adding a completely new dependency**
-
-In this case, `build_conda_pkg` is failing simply because a dependency is missing. Adding the dependency to the recipe should
-make the check green. To add the dependency, modify the recipe located at `.github/meta.yaml`.
-
-If you see that adding the dependency causes the build to fail, possibly because of conflicting versions, then iterate until
-the build passes. The team will verify if your changes make sense during PR review.
-
-**Case 2: The latest dependency bot created a PR**
-If the latest dependency bot PR fails `build_conda_pkg`, it means our code doesn't support the latest version
-of one of our dependencies. This means that we either have to cap the max allowed version in our requirements file
-or update our code to support that version. If we opt for the former, then just like in Case 1, make the corresponding change
-to the recipe located at `.github/meta.yaml`
-
-#### What about the `check_versions` CI check?
-This check verifies that the allowed versions listed in `pyproject.toml` match those listed in
-the conda recipe so that the PyPi requirements and conda requirements don't get out of sync.
-
-## Code Style Guide
-
-* Keep things simple. Any complexity must be justified in order to pass code review.
-* Be aware that while we love fancy python magic, there's usually a simpler solution which is easier to understand!
-* Make PRs as small as possible! Consider breaking your large changes into separate PRs. This will make code review easier, quicker, less bug-prone and more effective.
-* In the name of every branch you create, include the associated issue number if applicable.
-* If new changes are added to the branch you're basing your changes off of, consider using `git rebase -i base_branch` rather than merging the base branch, to keep history clean.
-* Always include a docstring for public methods and classes. Consider including docstrings for private methods too. We use the [Google docstring convention](https://google.github.io/styleguide/pyguide.html#38-comments-and-docstrings), and use the [`sphinx.ext.napoleon`](https://www.sphinx-doc.org/en/master/usage/extensions/napoleon.html) extension to parse our docstrings.
-* Although not explicitly enforced by the Google convention, keep the following stylistic conventions for docstrings in mind:
-    - First letter of each argument description should be capitalized.
-    - Docstring sentences should end in periods. This includes descriptions for each argument.
-    - Types should be written in lower-case. For example, use "bool" instead of "Bool".
-    - Always add the default value in the description of the argument, if applicable. For example, "Defaults to 1."
-* Use [PascalCase (upper camel case)](https://en.wikipedia.org/wiki/Camel_case#Variations_and_synonyms) for class names, and [snake_case](https://en.wikipedia.org/wiki/Snake_case) for method and class member names.
-* To distinguish private methods and class attributes from public ones, those which are private should be prefixed with an underscore
-* Any code which doesn't need to be public should be private. Use `@staticmethod` and `@classmethod` where applicable, to indicate no side effects.
-* Only call public methods in unit tests.
-* All code must have unit test coverage. Use mocking and monkey-patching when necessary.
-* Keep unit tests as fast as possible. In particular, avoid calling `fit`. Mocking can help with this.
-* When you're working with code which uses a random number generator, make sure your unit tests set a random seed.
-* Use `np.testing.assert_almost_equal` when comparing floating-point numbers, to avoid numerical precision issues, particularly cross-platform.
-* Use `os.path` tools to keep file paths cross-platform.
-* Our rule of thumb is to favor traditional inheritance over a mixin pattern.
-
 ## GitHub Issue Guide
 
 * Make the title as short and descriptive as possible.
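
The style guide removed above points contributors to the Google docstring convention with a few extra rules (capitalized argument descriptions, trailing periods, lowercase types, stated defaults). A minimal sketch of a docstring that follows those bullets, using a hypothetical helper function, could look like this:

```python
def scale_values(values, factor=1.0):
    """Scale a list of numeric values by a constant factor.

    Args:
        values (list): Values to scale. Each entry should be numeric.
        factor (float): Multiplier applied to each value. Defaults to 1.0.

    Returns:
        list: The scaled values.
    """
    return [value * factor for value in values]
```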

docs/source/release_notes.rst

Lines changed: 48 additions & 49 deletions
@@ -1,39 +1,40 @@
 Release Notes
 -------------
-.. **Future Releases**
-.. * Enhancements
-.. * Fixes
-.. * Changes
-.. * Documentation Changes
-.. * Testing Changes
+**Future Releases**
+    * Enhancements
+    * Fixes
+    * Changes
+    * Documentation Changes
+        * Updated readme.md, contrubuting.md, and releases.md to reflect CheckMates package installation, quickstart, and useful links :pr:`13`
+    * Testing Changes
 
 **v0.1.0 July 28, 2023**
-    * Enhancements
-        * updated pyproject to v0.1.0 for first release and added project urls :pr:`8`
-        * added pdm.lock and .python-version to .gitignore :pr:`8`
-        * Added repo specific token for workflows :pr:`2`
-        * PDM Packaging ready for deployment :pr:`2`
-        * Added testing workflow for pytest :pr:`2`
-        * Transfer over base `Data Checks` and `IDColumnData Checks` from the `EvalML` repo :pr:`1`
-        * Added in github workflows that are relevant to `DataChecks`, from `EvalML` repository, and modified to fit `DataChecks` wherever possible :pr:`1`
-        * Implemented linters and have them successfully running :pr:`1`
-    * Fixes
-        * Cleanup files and add release workflow :pr:`6`
-        * Fixed pytest failures :pr:`1`
-        * Workflows are now up and running properly :pr:`1`
-    * Changes
-        * Irrelevant workflows removed (`minimum_dependency_checker`) :pr:`2`
-        * Removed all `EvalML` dependencies and unnecessary functions/comments from `utils`, `tests`, `exceptions`, and `datachecks` :pr:`1`
-        * Updated comments to reflect `DataChecks` repository :pr:`1`
-        * Restructured file directory to categorize data checks between `datacheck_meta` and `checks` :pr:`1`
-        * Restructured pdm packaging to only be relevant to `DataChecks`, now to be renamed to `CheckMate` :pr:`1`
-    * Documentation Changes
-        * Documentation refactored to now fit `CheckMates` :pr:`11`
-        * Documentation refactored to now fit `Checkers` :pr:`4`
-        * Documentation refactored to now fit `CheckMate` :pr:`2`
-    * Testing Changes
-        * Automated testing within github actions :pr:`2`
-        * Removed integration testing due to irrelevance with `datacheck_meta` and `checks` :pr:`1`
+    * Enhancements
+        * updated pyproject to v0.1.0 for first release and added project urls :pr:`8`
+        * added pdm.lock and .python-version to .gitignore :pr:`8`
+        * Added repo specific token for workflows :pr:`2`
+        * PDM Packaging ready for deployment :pr:`2`
+        * Added testing workflow for pytest :pr:`2`
+        * Transfer over base `Data Checks` and `IDColumnData Checks` from the `EvalML` repo :pr:`1`
+        * Added in github workflows that are relevant to `DataChecks`, from `EvalML` repository, and modified to fit `DataChecks` wherever possible :pr:`1`
+        * Implemented linters and have them successfully running :pr:`1`
+    * Fixes
+        * Cleanup files and add release workflow :pr:`6`
+        * Fixed pytest failures :pr:`1`
+        * Workflows are now up and running properly :pr:`1`
+    * Changes
+        * Irrelevant workflows removed (`minimum_dependency_checker`) :pr:`2`
+        * Removed all `EvalML` dependencies and unnecessary functions/comments from `utils`, `tests`, `exceptions`, and `datachecks` :pr:`1`
+        * Updated comments to reflect `DataChecks` repository :pr:`1`
+        * Restructured file directory to categorize data checks between `datacheck_meta` and `checks` :pr:`1`
+        * Restructured pdm packaging to only be relevant to `DataChecks`, now to be renamed to `CheckMate` :pr:`1`
+    * Documentation Changes
+        * Documentation refactored to now fit `CheckMates` :pr:`11`
+        * Documentation refactored to now fit `Checkers` :pr:`4`
+        * Documentation refactored to now fit `CheckMate` :pr:`2`
+    * Testing Changes
+        * Automated testing within github actions :pr:`2`
+        * Removed integration testing due to irrelevance with `datacheck_meta` and `checks` :pr:`1`
 
 **v0.0.2 July 26, 2023**
     * Enhancements
@@ -49,22 +50,20 @@ Release Notes
         * Automated testing within github actions :pr:`2`
 
 **v0.0.1 July 18, 2023**
-
-    * Enhancements
-        * Transfer over base `Data Checks` and `IDColumnData Checks` from the `EvalML` repo :pr:`1`
-        * Added in github workflows that are relevant to `DataChecks`, from `EvalML` repository, and modified to fit `DataChecks` wherever possible :pr:`1`
-        * Implemented linters and have them successfully running :pr:`1`
-    * Fixes
-        * Fixed pytest failures :pr:`1`
-        * Workflows are now up and running properly :pr:`1`
-    * Changes
-        * Removed all `EvalML` dependencies and unnecessary functions/comments from `utils`, `tests`, `exceptions`, and `datachecks` :pr:`1`
-        * Updated comments to reflect `DataChecks` repository :pr:`1`
-        * Restructured file directory to categorize data checks between `datacheck_meta` and `checks` :pr:`1`
-        * Restructured pdm packaging to only be relevant to `DataChecks`, now to be renamed to `CheckMate` :pr:`1`
-    * Testing Changes
-        * Removed integration testing due to irrelevance with `datacheck_meta` and `checks` :pr:`1`
+    * Enhancements
+        * Transfer over base `Data Checks` and `IDColumnData Checks` from the `EvalML` repo :pr:`1`
+        * Added in github workflows that are relevant to `DataChecks`, from `EvalML` repository, and modified to fit `DataChecks` wherever possible :pr:`1`
+        * Implemented linters and have them successfully running :pr:`1`
+    * Fixes
+        * Fixed pytest failures :pr:`1`
+        * Workflows are now up and running properly :pr:`1`
+    * Changes
+        * Removed all `EvalML` dependencies and unnecessary functions/comments from `utils`, `tests`, `exceptions`, and `datachecks` :pr:`1`
+        * Updated comments to reflect `DataChecks` repository :pr:`1`
+        * Restructured file directory to categorize data checks between `datacheck_meta` and `checks` :pr:`1`
+        * Restructured pdm packaging to only be relevant to `DataChecks`, now to be renamed to `CheckMate` :pr:`1`
+    * Testing Changes
+        * Removed integration testing due to irrelevance with `datacheck_meta` and `checks` :pr:`1`
 
 **v0.0.0 July 3, 2023**
-
-    * *GitHub Repo Created*
+    * *GitHub Repo Created*
