Before running integration tests on Azure Cloud, you must log in (and clear any TOKEN authentication):
```shell
az login
```

Use the following command to run the integration tests:

```shell
make integration
```
### Fixtures
We'd like to encourage you to leverage the extensive set of [pytest fixtures](https://docs.pytest.org/en/latest/explanation/fixtures.html#about-fixtures).
These fixtures follow a consistent naming pattern, starting with "make_". These functions can be called multiple
times to _create and clean up objects as needed_ for your tests. Reusing these fixtures helps maintain clean and consistent test code.
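A minimal sketch of what such a `make_`-style factory fixture with automatic cleanup might look like (the `make_group` name and the dict-based stand-ins are invented for illustration; the project's real fixtures call actual APIs):

```python
import pytest


def make_group_fixture():
    """Yields a factory function; everything it created is cleaned up after the test."""
    created = []

    def inner(name):
        group = {"name": name, "deleted": False}  # stand-in for a real create call
        created.append(group)
        return group

    yield inner                      # the test receives the factory
    for group in reversed(created):  # teardown: newest objects first
        group["deleted"] = True      # stand-in for a real delete call


# Registered so tests can simply declare a `make_group` argument.
make_group = pytest.fixture(make_group_fixture)
```

A test taking `make_group` can then call it several times; teardown runs once, after the test, over everything the factory produced.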
If the fixture requires no arguments and no special cleanup, you can simplify the fixture from:

```python
@pytest.fixture
def make_thing(...):
    def inner():
        ...
        return x
    return inner
```

to:

```python
@pytest.fixture
def thing(...):
    ...
    return x
```
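As a concrete (and entirely hypothetical) side-by-side of the two styles, using a trivial `Thing` class as the created object:

```python
import pytest


class Thing:
    """Hypothetical resource, used only for illustration."""
    def __init__(self, name="default"):
        self.name = name


@pytest.fixture
def make_thing():
    # Factory style: the test chooses how many objects to create, and with what names.
    def inner(name):
        return Thing(name)
    return inner


@pytest.fixture
def thing():
    # Simplified style: no arguments, no cleanup, exactly one object per test.
    return Thing()


def test_factory_style(make_thing):
    assert make_thing("a").name == "a"


def test_simplified_style(thing):
    assert thing.name == "default"
```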
### Debugging
Each integration test _must be debuggable_ within the free [IntelliJ IDEA (Community Edition)](https://www.jetbrains.com/idea/download)
with the [Python plugin (Community Edition)](https://plugins.jetbrains.com/plugin/7322-python-community-edition). If it works within
IntelliJ CE, it will also work in PyCharm. Debugging capabilities are essential for troubleshooting and diagnosing issues during
development. Please make sure that your test setup allows for easy debugging by following best practices.
Here are the example steps to submit your first contribution:

1. [Make a Fork](https://docs.github.com/en/pull-requests/collaborating-with-pull-requests/working-with-forks/fork-a-repo) from the repo
2. `git clone`
3. `git checkout main` (or `gcm` if you're using [oh-my-zsh](https://ohmyz.sh/)).
4. `git pull` (or `gl` if you're using [oh-my-zsh](https://ohmyz.sh/)).
5. `git checkout -b FEATURENAME` (or `gcb FEATURENAME` if you're using [oh-my-zsh](https://ohmyz.sh/)).
6. ... do the work
7. `make fmt`
8. ... fix if any
9. `make test`
10. ... fix if any
11. `git commit -a`. Make sure to enter a meaningful commit message title.
12. `git push origin FEATURENAME`
13. Go to GitHub UI and create PR. Alternatively, `gh pr create` (if you have [GitHub CLI](https://cli.github.com/) installed).
Use a meaningful pull request title because it'll appear in the release notes. Use `Resolves #NUMBER` in pull
request description to [automatically link it](https://docs.github.com/en/get-started/writing-on-github/working-with-advanced-formatting/using-keywords-in-issues-and-pull-requests#linking-a-pull-request-to-an-issue)
README.md
See [contributing instructions](CONTRIBUTING.md) to help improve this project.
- Databricks Workspace Administrator privileges for the user that runs the installation. Running UCX as a Service Principal is not supported.
- Account level Identity Setup. See instructions for [AWS](https://docs.databricks.com/en/administration-guide/users-groups/best-practices.html), [Azure](https://learn.microsoft.com/en-us/azure/databricks/administration-guide/users-groups/best-practices), and [GCP](https://docs.gcp.databricks.com/administration-guide/users-groups/best-practices.html).
- Unity Catalog Metastore Created (per region). See instructions for [AWS](https://docs.databricks.com/en/data-governance/unity-catalog/create-metastore.html), [Azure](https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/create-metastore), and [GCP](https://docs.gcp.databricks.com/data-governance/unity-catalog/create-metastore.html).
- If your Databricks Workspace relies on an external Hive Metastore (such as AWS Glue), make sure to read [this guide](docs/external_hms_glue.md).
- Databricks Workspace must have network access to [pypi.org](https://pypi.org) to download the `databricks-sdk` and `pyyaml` packages.
- A PRO or Serverless SQL Warehouse to render the [report](docs/assessment.md) for the [assessment workflow](#assessment-workflow).
… access the configuration file from the command line. Here's the description of configuration properties:
* `spark_conf`: An optional dictionary of Spark configuration properties.
* `override_clusters`: An optional dictionary mapping job cluster names to existing cluster IDs.
* `policy_id`: An optional string representing the ID of the cluster policy.
* `is_terraform_used`: A boolean value indicating whether some workspace resources are managed by Terraform.
* `include_databases`: An optional list of strings representing the names of databases to include for migration.
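Taken together, these properties might appear in the configuration file roughly as follows (a hypothetical excerpt: the keys come from the list above, but every value is invented for illustration):

```yaml
# Hypothetical excerpt -- property names from the list above, values invented.
spark_conf:
  spark.sql.session.timeZone: UTC
override_clusters:
  main: 0123-456789-abcdef12   # job cluster name -> existing cluster ID
policy_id: "000000000000ABCD"
is_terraform_used: false
include_databases:
  - finance
  - marketing
```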