
Commit c531c3f

Release v0.8.2 (#139)
* Make hatch a prerequisite ([#137](#137)). Hatch (pinned at version 1.9.4) is now a prerequisite in the GitHub workflow for the project's main branch, because `pip install hatch` occasionally failed depending on the local environment. The workflow now defines the hatch version as an environment variable and adds a dedicated step that installs that exact version, which makes the build and test process more reliable. The `pip install hatch` line has been removed from the Makefile, so hatch must be installed manually before the Makefile is run; this mirrors the approach taken for ucx, where prerequisites are expected to be installed up front. To contribute to this project, install hatch with `pip install hatch`, clone the GitHub repository, and run `make dev` to set up the development environment and install the necessary dependencies.
* Support files with unicode BOM ([#138](#138)). The `WorkspacePath` class now handles files with a Unicode Byte Order Mark (BOM) during upload and download in Databricks Workspace, and gains a `read_text` method for reading text from files more conveniently. When a downloaded file starts with a BOM, the BOM is detected and used for decoding, overriding the encoding that the system's locale would otherwise prefer. A new test verifies that files with different BOM types are encoded and decoded with the appropriate codec. Databricks notebooks with a BOM could not be tested because the platform modifies uploaded data, but the change improves compatibility with a broader range of file formats and makes handling of BOM-prefixed files more accurate. A minimal illustrative sketch of the BOM detection follows below.
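To make the BOM handling concrete, here is a minimal, self-contained sketch of the idea described in [#138](#138): a leading BOM, when present, picks the decoder, and otherwise the locale's preferred encoding is used. The helper `read_text_with_bom` and its BOM table are illustrative assumptions, not the library's code; in the library itself this behaviour is exposed through `WorkspacePath.read_text`.

```python
import codecs
import locale

# Known BOMs mapped to the codec for the bytes that follow them.  The longer
# UTF-32 signatures must be checked before UTF-16, because the UTF-16 LE BOM
# is a prefix of the UTF-32 LE one.
_BOMS = (
    (codecs.BOM_UTF32_LE, "utf-32-le"),
    (codecs.BOM_UTF32_BE, "utf-32-be"),
    (codecs.BOM_UTF8, "utf-8"),
    (codecs.BOM_UTF16_LE, "utf-16-le"),
    (codecs.BOM_UTF16_BE, "utf-16-be"),
)


def read_text_with_bom(data: bytes, preferred: str | None = None) -> str:
    """Decode bytes, letting a leading BOM override the preferred encoding.

    Hypothetical helper for illustration only; not the library's implementation.
    """
    for bom, encoding in _BOMS:
        if data.startswith(bom):
            return data[len(bom):].decode(encoding)
    return data.decode(preferred or locale.getpreferredencoding(False))


# A UTF-16-LE payload with a BOM decodes correctly even though the caller's
# preferred encoding is UTF-8.
raw = codecs.BOM_UTF16_LE + "hello".encode("utf-16-le")
assert read_text_with_bom(raw, preferred="utf-8") == "hello"
```

With the real API, the same result would presumably come from calling `read_text()` on a `WorkspacePath` that points at the uploaded file.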
1 parent 53b9463 commit c531c3f

File tree

2 files changed: +7 -1 lines changed


CHANGELOG.md

Lines changed: 6 additions & 0 deletions
@@ -1,5 +1,11 @@
# Version changelog

+## 0.8.2
+
+* Make hatch a prerequisite ([#137](https://github.com/databrickslabs/blueprint/issues/137)). Hatch (pinned at version 1.9.4) is now a prerequisite in the GitHub workflow for the project's main branch, because `pip install hatch` occasionally failed depending on the local environment. The workflow now defines the hatch version as an environment variable and adds a dedicated step that installs that exact version, which makes the build and test process more reliable. The `pip install hatch` line has been removed from the Makefile, so hatch must be installed manually before the Makefile is run; this mirrors the approach taken for ucx, where prerequisites are expected to be installed up front. To contribute to this project, install hatch with `pip install hatch`, clone the GitHub repository, and run `make dev` to set up the development environment and install the necessary dependencies.
+* Support files with unicode BOM ([#138](https://github.com/databrickslabs/blueprint/issues/138)). The `WorkspacePath` class now handles files with a Unicode Byte Order Mark (BOM) during upload and download in Databricks Workspace, and gains a `read_text` method for reading text from files more conveniently. When a downloaded file starts with a BOM, the BOM is detected and used for decoding, overriding the encoding that the system's locale would otherwise prefer. A new test verifies that files with different BOM types are encoded and decoded with the appropriate codec. Databricks notebooks with a BOM could not be tested because the platform modifies uploaded data, but the change improves compatibility with a broader range of file formats and makes handling of BOM-prefixed files more accurate.
+
+
## 0.8.1

* Fixed py3.10 compatibility for `_parts` in pathlike ([#135](https://github.com/databrickslabs/blueprint/issues/135)). This update addresses a Python 3.10 compatibility issue in the `_parts` property of the path-like implementation. Previously there was also a `_cparts` property that returned the same value as `_parts`; it has been removed in favour of referencing `_parts` directly. `_parts` is now also used for reverse equality comparison and in the `joinpath` and `__truediv__` methods. This improves the library's compatibility with Python 3.10 and beyond, ensuring continued functionality and stability for engineers working with the latest Python versions.
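As a side note on the `_parts` entry above, the following is a minimal, hypothetical sketch of the pattern it describes: a path-like class whose components live behind a single `_parts` property that `joinpath`, `__truediv__`, and equality all reuse, with no separate `_cparts` copy to keep in sync. The class and its methods are illustrative assumptions, not the library's `WorkspacePath` implementation.

```python
class MiniPath:
    """Illustrative path-like class; not the library's actual code."""

    def __init__(self, *parts: str) -> None:
        # Split and flatten the given fragments into individual components.
        self._components = [p for part in parts for p in part.split("/") if p]

    @property
    def _parts(self) -> tuple[str, ...]:
        # Single source of truth for the path components; there is no
        # separate case-normalised `_cparts` copy.
        return tuple(self._components)

    def joinpath(self, *other: str) -> "MiniPath":
        return MiniPath(*self._parts, *other)

    def __truediv__(self, other: str) -> "MiniPath":
        # The `/` operator is a thin wrapper over joinpath, so both code
        # paths go through the same `_parts` property.
        return self.joinpath(other)

    def __eq__(self, other: object) -> bool:
        # Equality compares components only, which behaves the same on
        # Python 3.10 and newer because only `_parts` is involved.
        return isinstance(other, MiniPath) and self._parts == other._parts

    def __str__(self) -> str:
        return "/" + "/".join(self._parts)


p = MiniPath("Users", "someone") / "notebooks/demo.py"
assert str(p) == "/Users/someone/notebooks/demo.py"
```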
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-__version__ = "0.8.1"
+__version__ = "0.8.2"
