# D. Unit testing
If you're working on the tools, you'll want to run the unit tests. The unit tests don't currently cover everything; right now, they're focused primarily on library routines and the like. We'll be enhancing them as time goes on.
We're using pytest for unit tests. If a particular package has unit tests, it'll have a `test` subdirectory. For example, there's a `bdc/test` directory, indicating that there are some unit tests for `bdc`.
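This convention can be illustrated with a short sketch. Note that this uses a throwaway directory tree rather than the real repo, and `gendbc` is just a hypothetical package name used for contrast:

```shell
# Demonstrate the convention: a package has unit tests exactly when it
# contains a "test" subdirectory. Uses a throwaway tree, not the repo;
# "gendbc" is a hypothetical package name with no tests.
tree=$(mktemp -d)
mkdir -p "$tree/bdc/test" "$tree/gendbc"

for pkg in bdc gendbc; do
    if [ -d "$tree/$pkg/test" ]; then
        echo "$pkg: has unit tests"
    else
        echo "$pkg: no unit tests"
    fi
done
```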
You can use the `./run-tests.sh` script at the top of the repo to run the unit tests. By default, it:

- Creates a local Python virtual environment, under `.venv` at the top of the repo, if it doesn't already exist. (It uses your currently active Python, so make sure Python 3.7 is in your path.)
- Installs the build tools (via `python setup.py install`), to ensure that the tools and their dependencies are available.
- Locates all the `test` subdirectories and runs `pytest` in each one.
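The last step can be sketched roughly as follows. This is an illustrative reimplementation, not the actual contents of `./run-tests.sh`: the `find`-based discovery, the throwaway tree, and the package names `gendbc` and `master_parse` are all assumptions made for the example.

```shell
# Rough sketch of "locate all test subdirectories and run pytest in
# each one", demonstrated against a throwaway tree. "gendbc" and
# "master_parse" are hypothetical package names.
repo=$(mktemp -d)
mkdir -p "$repo/bdc/test" "$repo/gendbc" "$repo/master_parse/test"

found=""
for dir in $(find "$repo" -type d -name test | sort); do
    rel=${dir#"$repo"/}
    found="$found $rel"
    # In a real run, pytest would be invoked here, e.g.:
    # (cd "$dir/.." && python -m pytest test)
done
found=${found# }
echo "$found"   # lists each discovered test directory
```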
When a PR is opened or merged, the unit tests are automatically run, using Travis CI. To see the build history, visit https://travis-ci.org/databricks-edu/build-tooling.
Travis is configured using the `.travis.yml` file at the top of the repo. You can find Travis CI documentation at https://docs.travis-ci.com. You might find the Python project docs to be of particular interest.
The Travis CI builds (kicked off when a PR is opened) also use `./run-tests.sh`. However, since Travis CI supplies its own virtual environment and automatically installs the build tools, the `.travis.yml` configuration file passes `travis` as the only argument to `./run-tests.sh`, which causes the script to bypass those two steps.
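The bypass might look something like the following. This is a guess at the shape of the logic, not the script's actual code; the variable name and echoed messages are invented for illustration:

```shell
# Sketch of how a single "travis" argument could gate the two setup
# steps. Variable name and messages are invented for illustration.
setup_needed=true
if [ "$1" = "travis" ]; then
    # Travis CI supplies its own virtualenv and installs the build
    # tools, so both setup steps can be skipped.
    setup_needed=false
fi

if [ "$setup_needed" = "true" ]; then
    echo "would create .venv and run 'python setup.py install'"
else
    echo "skipping setup: CI environment detected"
fi
```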
# NOTICE
- This software is copyright © 2017-2021 Databricks, Inc., and is released under the Apache License, version 2.0. See LICENSE.txt in the main repository for details.
- Databricks cannot support this software for you. We use it internally, and we have released it as open source, for use by those who are interested in building similar kinds of Databricks notebook-based curriculum. But this software does not constitute an official Databricks product, and it is subject to change without notice.