
Commit 4999a4f

Added testing fixtures guidelines to CONTRIBUTING.md (#1138)
## Changes
- Add some details about writing testing fixtures to CONTRIBUTING.md
- Minor formatting changes
1 parent 8c59632 commit 4999a4f

File tree: 2 files changed (+35 −15 lines)

CONTRIBUTING.md

Lines changed: 33 additions & 14 deletions
@@ -146,7 +146,7 @@ databricks users create --active --display-name "test-user-1" --user-name "first
 databricks users create --active --display-name "test-user-2" --user-name "[email protected]"
 ```
 
-Before running integration tests on Azure Cloud, you must login (and clear any TOKEN authenticaton):
+Before running integration tests on Azure Cloud, you must log in (and clear any TOKEN authentication):
 
 ```shell
 az login
@@ -159,6 +159,7 @@ Use the following command to run the integration tests:
 make integration
 ```
 
+### Fixtures
 We'd like to encourage you to leverage the extensive set of [pytest fixtures](https://docs.pytest.org/en/latest/explanation/fixtures.html#about-fixtures).
 These fixtures follow a consistent naming pattern, starting with "make_". These functions can be called multiple
 times to _create and clean up objects as needed_ for your tests. Reusing these fixtures helps maintain clean and consistent
@@ -175,7 +176,25 @@ def test_secret_scope_acl(make_secret_scope, make_secret_scope_acl, make_group):
     make_secret_scope_acl(scope=scope_name, principal=make_group().display_name, permission=AclPermission.WRITE)
 ```
 
-Each integration test _must be debuggable within the free [IntelliJ IDEA (Community Edition)](https://www.jetbrains.com/idea/download)
+If the fixture requires no arguments and no special cleanup, you can simplify it from
+```python
+@pytest.fixture
+def make_thing(...):
+    def inner():
+        ...
+        return x
+    return inner
+```
+to:
+```python
+@pytest.fixture
+def thing(...):
+    ...
+    return x
+```
+
+### Debugging
+Each integration test _must be debuggable_ within the free [IntelliJ IDEA (Community Edition)](https://www.jetbrains.com/idea/download)
 with the [Python plugin (Community Edition)](https://plugins.jetbrains.com/plugin/7322-python-community-edition). If it works within
 IntelliJ CE, then it would work in PyCharm. Debugging capabilities are essential for troubleshooting and diagnosing issues during
 development. Please make sure that your test setup allows for easy debugging by following best practices.
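The create-and-clean-up contract behind these `make_` fixtures can be sketched in plain Python. The generator below is a hypothetical illustration (`make_scratch_file` is not one of the project's real fixtures), and in an actual test module it would additionally carry the `@pytest.fixture` decorator:

```python
from pathlib import Path
import tempfile

# Hypothetical "make_"-style fixture factory; in a real test module this
# generator function would be decorated with @pytest.fixture.
def make_scratch_file():
    created: list[Path] = []

    def inner(text: str = "") -> Path:
        # create an object on demand and remember it for teardown
        handle = tempfile.NamedTemporaryFile(delete=False, suffix=".txt")
        path = Path(handle.name)
        handle.close()
        path.write_text(text)
        created.append(path)
        return path

    yield inner  # the test receives the factory and may call it many times
    # teardown: remove everything the test created
    for path in created:
        path.unlink(missing_ok=True)
```

Because teardown runs after the `yield`, every object handed out by the inner factory is cleaned up even when the test body fails.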
@@ -235,21 +254,21 @@ Here are the example steps to submit your first contribution:
 
 1. [Make a Fork](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/fork-a-repo) from the repo
 2. `git clone`
-3. `git checkout main` (or `gcm` if you're using [ohmyzsh](https://ohmyz.sh/)).
-4. `git pull` (or `gl` if you're using [ohmyzsh](https://ohmyz.sh/)).
-5. `git checkout -b FEATURENAME` (or `gcb FEATURENAME` if you're using [ohmyzsh](https://ohmyz.sh/)).
-6. .. do the work
+3. `git checkout main` (or `gcm` if you're using [oh-my-zsh](https://ohmyz.sh/)).
+4. `git pull` (or `gl` if you're using [oh-my-zsh](https://ohmyz.sh/)).
+5. `git checkout -b FEATURENAME` (or `gcb FEATURENAME` if you're using [oh-my-zsh](https://ohmyz.sh/)).
+6. ... do the work
 7. `make fmt`
-9. .. fix if any
-10. `make test`
-11. .. fix if any
-12. `git commit -a`. Make sure to enter a meaningful commit message title.
-13. `git push origin FEATURENAME`
-14. Go to GitHub UI and create PR. Alternatively, `gh pr create` (if you have [GitHub CLI](https://cli.github.com/) installed).
+8. ... fix if any
+9. `make test`
+10. ... fix if any
+11. `git commit -a`. Make sure to enter a meaningful commit message title.
+12. `git push origin FEATURENAME`
+13. Go to GitHub UI and create PR. Alternatively, `gh pr create` (if you have [GitHub CLI](https://cli.github.com/) installed).
 Use a meaningful pull request title because it'll appear in the release notes. Use `Resolves #NUMBER` in pull
 request description to [automatically link it](https://docs.github.com/en/get-started/writing-on-github/working-with-advanced-formatting/using-keywords-in-issues-and-pull-requests#linking-a-pull-request-to-an-issue)
 to an existing issue.
-15. announce PR for the review
+14. Announce PR for the review.
 
 ## Troubleshooting
 
@@ -328,6 +347,6 @@ $ python3.10 -m pip install hatch
 $ make dev
 $ make test
 ```
-Note: Before performing a clean install deactivate the virtual environment and follow the commands given above.
+Note: Before performing a clean installation, deactivate the virtual environment and follow the commands given above.
 
 Note: The initial `hatch env show` is just to list the environments managed by Hatch and is not needed.

README.md

Lines changed: 2 additions & 1 deletion
@@ -70,7 +70,7 @@ See [contributing instructions](CONTRIBUTING.md) to help improve this project.
 - Databricks Workspace Administrator privileges for the user, that runs the installation. Running UCX as a Service Principal is not supported.
 - Account level Identity Setup. See instructions for [AWS](https://docs.databricks.com/en/administration-guide/users-groups/best-practices.html), [Azure](https://learn.microsoft.com/en-us/azure/databricks/administration-guide/users-groups/best-practices), and [GCP](https://docs.gcp.databricks.com/administration-guide/users-groups/best-practices.html).
 - Unity Catalog Metastore Created (per region). See instructions for [AWS](https://docs.databricks.com/en/data-governance/unity-catalog/create-metastore.html), [Azure](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/create-metastore), and [GCP](https://docs.gcp.databricks.com/data-governance/unity-catalog/create-metastore.html).
-- If your Databricks Workspace relies on an external Hive Metastore (such as AWS Glue), make sure to read the [this guide](docs/external_hms_glue.md).
+- If your Databricks Workspace relies on an external Hive Metastore (such as AWS Glue), make sure to read [this guide](docs/external_hms_glue.md).
 - Databricks Workspace has to have network access to [pypi.org](https://pypi.org) to download `databricks-sdk` and `pyyaml` packages.
 - A PRO or Serverless SQL Warehouse to render the [report](docs/assessment.md) for the [assessment workflow](#assessment-workflow).
 
@@ -422,6 +422,7 @@ access the configuration file from the command line. Here's the description of c
 * `spark_conf`: An optional dictionary of Spark configuration properties.
 * `override_clusters`: An optional dictionary mapping job cluster names to existing cluster IDs.
 * `policy_id`: An optional string representing the ID of the cluster policy.
+* `is_terraform_used`: A boolean value indicating whether some workspace resources are managed by Terraform.
 * `include_databases`: An optional list of strings representing the names of databases to include for migration.
 
 [[back to top](#databricks-labs-ucx)]
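The configuration keys described above can be modeled as a small dataclass. This is only an illustrative sketch: the class name `WorkspaceConfig` as written here and its default values are assumptions, not the project's actual configuration loader.

```python
from dataclasses import dataclass, field
from typing import Optional

# Hypothetical model of the documented config keys; the key names follow
# the docs, but the class and its defaults are assumptions.
@dataclass
class WorkspaceConfig:
    spark_conf: dict[str, str] = field(default_factory=dict)
    override_clusters: dict[str, str] = field(default_factory=dict)
    policy_id: Optional[str] = None
    is_terraform_used: bool = False
    include_databases: list[str] = field(default_factory=list)

# example: a workspace partially managed by Terraform, migrating one database
cfg = WorkspaceConfig(is_terraform_used=True, include_databases=["sales"])
```

Note that mutable defaults use `field(default_factory=...)` so each instance gets its own dictionary or list.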
