
Commit e30d1ad

Fix codec error in md (#2234)
## Changes

Replaced curly quotes with straight quotes.

### Linked issues

Curly quotes cause an encoding error on Windows machines.

![image](https://github.com/user-attachments/assets/cae7adfd-45c7-48c1-8038-c3ba2e42f136)

Co-authored-by: Cor <[email protected]>
Parent commit: 087611a
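The failure described above can be reproduced without Databricks at all. A minimal standalone sketch, assuming the README is UTF-8 and is read back with the legacy Windows-default `cp1252` codec (the usual source of this class of error):

```python
# The right curly quote (U+201D) becomes the UTF-8 byte sequence e2 80 9d.
# Byte 0x9d has no mapping in cp1252, the legacy default codec on many
# Windows setups, so decoding a UTF-8 file with it raises UnicodeDecodeError.
curly = "Does the given workspace block Internet access?\u201d".encode("utf-8")
straight = 'Does the given workspace block Internet access?"'.encode("utf-8")

try:
    curly.decode("cp1252")
    curly_failed = False
except UnicodeDecodeError:
    curly_failed = True

# Plain ASCII quotes are identical in both codecs, so they round-trip fine.
straight_ok = straight.decode("cp1252") == straight.decode("utf-8")

print(curly_failed)  # True: the curly quote trips the legacy codec
print(straight_ok)   # True: straight quotes decode the same either way
```

This is why replacing the typographic quotes with straight ASCII quotes is sufficient to fix the issue.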

5 files changed (+19, −19 lines)


README.md

Lines changed: 13 additions & 13 deletions
@@ -1369,7 +1369,7 @@ NO_MATCHING_DISTRIBUTION_ERROR.
 
 **Solution:** Version 0.24.0 of UCX supports workspace with no internet
 access. Please upgrade UCX and rerun the installation. Reply *yes* to
-the question Does the given workspace block Internet access? asked
+the question "Does the given workspace block Internet access?" asked
 during installation. It will then upload all necessary dependencies to
 the workspace. Also, please note that UCX uses both UC and non-UC
 enabled clusters. If you have different proxy settings for each, then
@@ -1423,13 +1423,13 @@ Admin privileges required for commands:
 
 **Python:** UCX needs Python version 3.10 or later.
 
-**Solution:** Check the current version using python --version. If the
+**Solution:** Check the current version using `python --version`. If the
 version is lower than 3.10, upgrade the local Python version to 3.10 or
 higher.
 
 **Databricks CLI:** Databricks CLI v0.213 or higher is needed.
 
-**Solution:** Check the current version with databricks --version. For
+**Solution:** Check the current version with `databricks --version`. For
 lower versions of CLI,
 [<u>update</u>](https://docs.databricks.com/en/dev-tools/cli/install.html#update)
 the Databricks CLI on the local machine.
@@ -1443,8 +1443,8 @@ UCX version.
 **Solution:** Upgrade UCX, and rerun the assessment job before running
 the migration workflows. For some reason, if you want to install a
 specific version of UCX, you can do it using the command
-databricks labs install ucx@\<version\>, for example,
-databricks labs install [email protected].
+`databricks labs install ucx@\<version\>`, for example,
+`databricks labs install [email protected]`.
 
 [[back to top](#databricks-labs-ucx)]
 
@@ -1459,8 +1459,8 @@ profiles, and tokens.
 **Account Level:** Not only workspace, but account level authentication
 is also needed for installing UCX. If you do not have an account
 configured in .databrickscfg, you will get an error message
-.databrickscfg does not contain account profiles; please create one
-first.
+".databrickscfg does not contain account profiles; please create one
+first".
 
 **Solution:** To authenticate with a Databricks account, consider using
 one of the following authentication types: [<u>OAuth machine-to-machine
@@ -1482,17 +1482,17 @@ workspace.
 
 **Solution:** The Databricks CLI provides an option to select the
 [<u>profile</u>](https://docs.databricks.com/en/dev-tools/cli/profiles.html)
-using *--profile \<profile_name\>* or *-p \<profile_name\>*. You can
+using `--profile \<profile_name\>` or `-p \<profile_name\>`. You can
 test that the correct workspace is getting selected by running any
-Databricks CLI command. For example, you can run *databricks clusters
-list -p prod* and check that the Prod clusters are being returned. Once
+Databricks CLI command. For example, you can run `databricks clusters
+list -p prod` and check that the Prod clusters are being returned. Once
 the profile is verified, you can run UCX install for that specific
-profile: *databricks labs install ucx -p prod*.
+profile: `databricks labs install ucx -p prod`.
 
 **Account Level:** Multiple account level profiles are set in the
 .databrickscfg file.
 
-**Solution:** The installation command *databricks labs install ucx*
+**Solution:** The installation command `databricks labs install ucx`
 will provide an option to select one account profile.
 
 [[back to top](#databricks-labs-ucx)]
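The workspace and account profiles mentioned above live in `~/.databrickscfg`, an INI-style file. A hypothetical minimal layout (profile names, hosts, token, and account id are all placeholders):

```ini
[prod]
host  = https://example-workspace.cloud.databricks.com
token = <personal-access-token>

[account]
host       = https://accounts.cloud.databricks.com
account_id = <databricks-account-uuid>
```

Workspace profiles point at a workspace URL; account profiles point at the accounts console host and carry an `account_id`, which is what UCX looks for when it reports that no account profile exists.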
@@ -1529,7 +1529,7 @@ under Data Access Configuration of SQL Warehouse Admin Settings.
 
 ### Verify the Installation
 
-Once the UCX command *databricks labs install ucx* has completed
+Once the UCX command `databricks labs install ucx` has completed
 successfully, the installation can be verified with the following steps:
 
 1. Go to the Databricks Catalog Explorer and check if a new schema for

docs/assessment.md

Lines changed: 1 addition & 1 deletion
@@ -853,7 +853,7 @@ The `.sequenceFile` pattern was found. Use "Assigned" access mode compute or mak
 ### AF313.18 - SparkContext ( `.setJobGroup` )
 
 The `.setJobGroup` pattern was found.
-`spark.addTag()` can attach a tag, and `getTags()` and `interruptTag(tag)` can be used to act upon the presence/absence of a tag. These APIs only work with Spark Connect (Shared Compute Mode) and will not work in Assigned access mode.
+`spark.addTag()` can attach a tag, and `getTags()` and `interruptTag(tag)` can be used to act upon the presence/absence of a tag. These APIs only work with Spark Connect (Shared Compute Mode) and will not work in "Assigned" access mode.
 
 [[back to top](#migration-assessment-report)]
 

src/databricks/labs/ucx/queries/assessment/estimates/00_0_metastore_assignment.md

Lines changed: 1 addition & 1 deletion
@@ -18,4 +18,4 @@ If you haven't created a metastore yet, follow the docs below to attach your wor
 If any incompatible submit runs has been detected, follow the steps highlighted below:
 
 1. Find out the incompatible jobs in your local orchestrator based on the object_id identified by UCX
-2. Change the job configuration to include the following in the ClusterInfo: data_security_mode”: “NONE”
+2. Change the job configuration to include the following in the ClusterInfo: "data_security_mode": "NONE"
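The `data_security_mode` setting in step 2 sits inside the job's cluster specification. A hypothetical minimal `new_cluster` fragment sketching where the field goes (the version and node type values are placeholders, not recommendations):

```json
{
  "new_cluster": {
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "<node-type>",
    "num_workers": 1,
    "data_security_mode": "NONE"
  }
}
```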

src/databricks/labs/ucx/source_code/path_lookup.py

Lines changed: 3 additions & 3 deletions
@@ -35,7 +35,7 @@ def change_directory(self, new_working_directory: Path) -> PathLookup:
     def resolve(self, path: Path) -> Path | None:
         try:
             if path.is_absolute() and path.exists():
-                # eliminate “..” components
+                # eliminate ".." components
                 return path.resolve()
         except PermissionError:
             logger.warning(f"Permission denied to access {path}")
@@ -55,7 +55,7 @@ def _resolve_in_library_root(self, library_root: Path, path: Path) -> Path | Non
                 return None
             absolute_path = library_root / path
             if absolute_path.exists():
-                return absolute_path.resolve()  # eliminate “..” components
+                return absolute_path.resolve()  # eliminate ".." components
             return self._resolve_egg_in_library_root(library_root, path)
 
     def _resolve_egg_in_library_root(self, library: Path, path: Path) -> Path | None:
@@ -64,7 +64,7 @@ def _resolve_egg_in_library_root(self, library: Path, path: Path) -> Path | None
                 continue
             absolute_path = child / path
             if absolute_path.exists():
-                return absolute_path.resolve()  # eliminate “..” components
+                return absolute_path.resolve()  # eliminate ".." components
         return None
 
     @staticmethod
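The `eliminate ".." components` comments in this file refer to `pathlib.Path.resolve()`, which collapses parent-directory references (and symlinks) into a canonical absolute path. A small standalone check using only the standard library and a throwaway temp directory (the `a`/`b` subdirectory names are arbitrary):

```python
import tempfile
from pathlib import Path

# Build a tiny directory tree. Resolve the temp dir itself first, since
# the OS temp path may be a symlink (e.g. /tmp on macOS).
base = Path(tempfile.mkdtemp()).resolve()
(base / "a").mkdir()
(base / "b").mkdir()

# A path containing ".." is collapsed to its canonical form by resolve().
candidate = base / "a" / ".." / "b"
resolved = candidate.resolve()

print(resolved == base / "b")     # True: the ".." component is eliminated
print(".." in resolved.parts)     # False: no parent references remain
```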

tests/unit/source_code/samples/01_HL7Streaming.scala

Lines changed: 1 addition & 1 deletion
@@ -216,6 +216,6 @@ for (s <- spark.streams.active) {
 
 // MAGIC %md
 // MAGIC ## Disclaimers
-// MAGIC Databricks Inc. (Databricks) does not dispense medical, diagnosis, or treatment advice. This Solution Accelerator (tool) is for informational purposes only and may not be used as a substitute for professional medical advice, treatment, or diagnosis. This tool may not be used within Databricks to process Protected Health Information (PHI) as defined in the Health Insurance Portability and Accountability Act of 1996, unless you have executed with Databricks a contract that allows for processing PHI, an accompanying Business Associate Agreement (BAA), and are running this notebook within a HIPAA Account. Please note that if you run this notebook within Azure Databricks, your contract with Microsoft applies.
+// MAGIC Databricks Inc. ("Databricks") does not dispense medical, diagnosis, or treatment advice. This Solution Accelerator ("tool") is for informational purposes only and may not be used as a substitute for professional medical advice, treatment, or diagnosis. This tool may not be used within Databricks to process Protected Health Information ("PHI") as defined in the Health Insurance Portability and Accountability Act of 1996, unless you have executed with Databricks a contract that allows for processing PHI, an accompanying Business Associate Agreement (BAA), and are running this notebook within a HIPAA Account. Please note that if you run this notebook within Azure Databricks, your contract with Microsoft applies.
 
 // MAGIC All names have been synthetically generated, and do not map back to any actual persons or locations
