Commit f73a13d ("update changelog"), parent caf3168

1 file changed: DESCRIPTION.md (+40, −1 lines)
# Release Notes

- v3.1.0 (July 31, 2023)

  - Added a feature that lets you add connection definitions to the `connections.toml` configuration file. A connection definition is a collection of connection parameters; for example, you could define a connection named `prod`.
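    A minimal sketch of what such a definition could look like (the table name and parameter names here are illustrative assumptions, mirroring the `config.toml` example in the v3.1.0a1 notes below):

    ```toml
    [prod]
    account = "my_account"
    user = "my_user"
    password = "my_password"
    ```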

  - Improved retry logic for Okta authentication to refresh the token if authentication gets throttled.
  - Note that this release does not include the changes introduced in the previous 3.1.0a1 release; those will be released at a later time.

- v3.1.0a1 (July 24, 2023)

  - Version 3.1.0a1 is our first effort to build snowflake-connector-python on the Apache Arrow nanoarrow project, which reduces the package size and removes the hard dependency on a specific version of pyarrow.
  - This version also includes the following features and bug fixes from the unreleased v3.0.5:
    - Added a feature that lets you add connection definitions to the `config.toml` configuration file. A connection definition is a collection of connection parameters. The connection configuration name must begin with **connections**, as in the following definition of the parameters for the `prod` connection:
      ```toml
      [connections.prod]
      account = "my_account"
      user = "my_user"
      password = "my_password"
      ```
      By default, we look for the `config.toml` file in the location specified by the `SNOWFLAKE_HOME` environment variable (default: `~/.snowflake`). If this folder does not exist, the Python connector looks for the file in the `platformdirs` location, as follows:

      - On Linux: `~/.config/snowflake/`, but follows XDG settings
      - On macOS: `~/Library/Application Support/snowflake/`
      - On Windows: `%USERPROFILE%\AppData\Local\snowflake\`

      You can determine which file is used by running the following command:

      ```
      python -c "from snowflake.connector.constants import CONFIG_FILE; print(str(CONFIG_FILE))"
      ```
    - Bumped the cryptography dependency from `>=3.1.0,<41.0.0` to `>=3.1.0,<42.0.0`.
    - Improved OCSP response caching to remove temporary cache files on Windows.
    - Improved OCSP response caching to reduce the number of disk writes.
    - Added a `server_session_keep_alive` parameter to `SnowflakeConnection` that skips session deletion when the client connection closes.
    - Tightened our pinning of platformdirs to prevent its new releases from breaking us.
    - Fixed a bug where SFPlatformDirs would incorrectly append application_name/version to its path.
    - Added a retry reason for queries that are retried by the client.
    - Fixed a bug where `write_pandas` fails when the user does not have the privilege to create a stage or file format in the target schema, but has the right privilege for the current schema.
    - Removed Python 3.7 support.
    - Worked around a segfault that sometimes occurred during cache serialization in multi-threaded scenarios.
    - Improved error handling of connection reset errors.
    - Fixed a bug with deleting temporary files when running the PUT command.
    - Allowed passing `type_mapper` to `fetch_pandas_batches()` and `fetch_pandas_all()`.
    - Fixed a bug where `pickle.dump` segfaults during cache serialization in multi-threaded scenarios.
- v3.0.4 (May 23, 2023)

  - Fixed a bug in which `cursor.execute()` could modify the argument `statement_params` dictionary object when executing a multistatement query.
  - Added the `json_result_force_utf8_decoding` connection parameter to force decoding JSON content as UTF-8 when the result format is JSON.
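The `cursor.execute()` fix above addresses a general hazard: a function that mutates a caller-supplied dictionary leaks internal bookkeeping back to the caller. A hedged sketch of the pattern and its fix (`MULTI_STATEMENT_COUNT` and the function names here are illustrative, not the connector's internals):

```python
def execute_buggy(statement_params: dict) -> dict:
    # Buggy pattern: mutates the caller's dictionary in place.
    statement_params["MULTI_STATEMENT_COUNT"] = 0
    return statement_params

def execute_fixed(statement_params: dict) -> dict:
    # Fixed pattern: work on a shallow copy so the caller's dict is untouched.
    params = dict(statement_params)
    params["MULTI_STATEMENT_COUNT"] = 0
    return params

caller_params = {"QUERY_TAG": "demo"}
execute_fixed(caller_params)
print("MULTI_STATEMENT_COUNT" in caller_params)  # False
```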
