Cherrypicks to aio connector part8 #2458

Open

wants to merge 23 commits into base: dev/aio-connector

Commits (23)
e209415
SNOW-2021009: test optimisation (#2388)
sfc-gh-pcyrek Jul 9, 2025
407d131
SNOW-2226057: GH Actions moved to key-pair, old driver bump to 3.1.0 …
sfc-gh-pcyrek Aug 6, 2025
322852d
Apply changes to async tests and workflows
sfc-gh-pczajka Aug 13, 2025
6035eee
SNOW-2027116 Allow for UUID encoding in SnowflakeRestful interface (#…
sfc-gh-lspiegelberg Apr 9, 2025
6d59d70
[ASYNC] apply #2254 to async code
sfc-gh-pczajka Aug 7, 2025
912a19d
SNOW-1955965: Fix expired S3 credentials update (#2258)
sfc-gh-pbulawa Apr 10, 2025
dbb7007
[ASYNC] apply #2258 to async code
sfc-gh-pczajka Aug 7, 2025
153ecd9
NO-SNOW Add PAT to authenticators allowing empty username, remove han…
sfc-gh-mkubik Apr 10, 2025
106fa06
[ASYNC] Apply #2264 to async code
sfc-gh-pczajka Aug 7, 2025
f6b293b
NO-SNOW Fix flaky query timeout test (#2266)
sfc-gh-mkubik Apr 11, 2025
45f634d
SNOW-2040000 change tag to bptp-stable (#2268)
sfc-gh-akolodziejczyk Apr 14, 2025
9e5b250
SNOW-2028051 introduce a new client_fetch_threads connection paramete…
sfc-gh-mmishchenko Apr 14, 2025
2d4b87e
Add default entra app ID for Snowflake (#2267)
sfc-gh-pmansour Apr 15, 2025
b7564b8
[ASYNC] update test after #2267
sfc-gh-pczajka Aug 7, 2025
2c9393d
SNOW-2011595 Masking filter introduced on library levels (#2253)
sfc-gh-fpawlowski Apr 15, 2025
3ef0650
[ASYNC] remove azure filter after #2253
sfc-gh-pczajka Aug 7, 2025
35199cd
NO-SNOW acquiring a lock on local OCSP cache will use a timeout (#2280)
sfc-gh-mmishchenko Apr 17, 2025
bc7162b
Accept both v1 and v2 Entra ID issuer formats for WIF (#2281)
sfc-gh-pmansour Apr 17, 2025
7f56afa
[ASYNC] apply #2281 to async code
sfc-gh-pczajka Aug 7, 2025
4d5ed38
SNOW-2048239 revert zero timeout for oscp cache lock (#2283)
sfc-gh-mmishchenko Apr 18, 2025
404bf8e
[ASYNC] remove flaky marker from OCSP tests after #2283
sfc-gh-pczajka Aug 7, 2025
2849159
SNOW-1993520 update tested_reqs for 3.14.1 release (#2287)
sfc-gh-mmishchenko Apr 21, 2025
eb23308
Review fixes - add secret filter to external libraries + tests
sfc-gh-pczajka Aug 12, 2025
41 changes: 38 additions & 3 deletions .github/workflows/build_test.yml
@@ -21,7 +21,7 @@ on:
description: "Test scenario tags"

concurrency:
# older builds for the same pull request numer or branch should be cancelled
# older builds for the same pull request number or branch should be cancelled
cancel-in-progress: true
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}

@@ -159,6 +159,13 @@ jobs:
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
.github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
- name: Setup private key file
shell: bash
env:
PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
.github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
- name: Download wheel(s)
uses: actions/download-artifact@v4
with:
@@ -173,8 +180,8 @@
run: python -m pip install tox>=4
- name: Run tests
# To run a single test on GHA use the below command:
# run: python -m tox run -e `echo py${PYTHON_VERSION/\./}-single-ci | sed 's/ /,/g'`
run: python -m tox run -e `echo py${PYTHON_VERSION/\./}-{extras,unit,integ,pandas,sso}-ci | sed 's/ /,/g'`
# run: python -m tox run -e `echo py${PYTHON_VERSION/\./}-single-ci | sed 's/ /,/g'`
run: python -m tox run -e `echo py${PYTHON_VERSION/\./}-{extras,unit-parallel,integ-parallel,pandas-parallel,sso}-ci | sed 's/ /,/g'`

env:
PYTHON_VERSION: ${{ matrix.python-version }}
@@ -224,6 +231,13 @@ jobs:
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
.github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
- name: Setup private key file
shell: bash
env:
PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
.github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
- name: Upgrade setuptools, pip and wheel
run: python -m pip install -U setuptools pip wheel
- name: Install tox
@@ -285,6 +299,13 @@ jobs:
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
.github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
- name: Setup private key file
shell: bash
env:
PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
.github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
- name: Download wheel(s)
uses: actions/download-artifact@v4
with:
@@ -332,6 +353,13 @@ jobs:
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
.github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
- name: Setup private key file
shell: bash
env:
PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
.github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
- name: Download wheel(s)
uses: actions/download-artifact@v4
with:
@@ -396,6 +424,13 @@ jobs:
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
.github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
- name: Setup private key file
shell: bash
env:
PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
run: |
gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
.github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
- name: Download wheel(s)
uses: actions/download-artifact@v4
with:
Binary file modified .github/workflows/parameters/public/parameters_aws.py.gpg
Binary file not shown.
Binary file modified .github/workflows/parameters/public/parameters_azure.py.gpg
Binary file not shown.
Binary file modified .github/workflows/parameters/public/parameters_gcp.py.gpg
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
4 changes: 2 additions & 2 deletions Jenkinsfile
@@ -38,10 +38,10 @@ timestamps {
stage('Test') {
try {
def commit_hash = "main" // default which we want to override
def bptp_tag = "bptp-built"
def bptp_tag = "bptp-stable"
def response = authenticatedGithubCall("https://api.github.com/repos/snowflakedb/snowflake/git/ref/tags/${bptp_tag}")
commit_hash = response.object.sha
// Append the bptp-built commit sha to params
// Append the bptp-stable commit sha to params
params += [string(name: 'svn_revision', value: commit_hash)]
} catch(Exception e) {
println("Exception computing commit hash from: ${response}")
8 changes: 7 additions & 1 deletion ci/test_fips.sh
@@ -14,6 +14,10 @@ curl https://repo1.maven.org/maven2/org/wiremock/wiremock-standalone/3.11.0/wire
python3 -m venv fips_env
source fips_env/bin/activate
pip install -U setuptools pip

# Install pytest-xdist for parallel execution
pip install pytest-xdist

pip install "${CONNECTOR_WHL}[pandas,secure-local-storage,development]"

echo "!!! Environment description !!!"
@@ -24,6 +28,8 @@ python -c "from cryptography.hazmat.backends.openssl import backend;print('Cryp
pip freeze

cd $CONNECTOR_DIR
pytest -vvv --cov=snowflake.connector --cov-report=xml:coverage.xml test --ignore=test/integ/aio --ignore=test/unit/aio

# Run tests in parallel using pytest-xdist
pytest -n auto -vvv --cov=snowflake.connector --cov-report=xml:coverage.xml test --ignore=test/integ/aio --ignore=test/unit/aio

deactivate
1 change: 1 addition & 0 deletions ci/test_fips_docker.sh
@@ -31,6 +31,7 @@ docker run --network=host \
-e cloud_provider \
-e PYTEST_ADDOPTS \
-e GITHUB_ACTIONS \
-e JENKINS_HOME=${JENKINS_HOME:-false} \
--mount type=bind,source="${CONNECTOR_DIR}",target=/home/user/snowflake-connector-python \
${CONTAINER_NAME}:1.0 \
/home/user/snowflake-connector-python/ci/test_fips.sh $1
2 changes: 1 addition & 1 deletion ci/test_linux.sh
@@ -40,7 +40,7 @@ else
echo "[Info] Testing with ${PYTHON_VERSION}"
SHORT_VERSION=$(python3.10 -c "print('${PYTHON_VERSION}'.replace('.', ''))")
CONNECTOR_WHL=$(ls $CONNECTOR_DIR/dist/snowflake_connector_python*cp${SHORT_VERSION}*manylinux2014*.whl | sort -r | head -n 1)
TEST_LIST=`echo py${PYTHON_VERSION/\./}-{unit,integ,pandas,sso}-ci | sed 's/ /,/g'`
TEST_LIST=`echo py${PYTHON_VERSION/\./}-{unit-parallel,integ,pandas-parallel,sso}-ci | sed 's/ /,/g'`
TEST_ENVLIST=fix_lint,$TEST_LIST,py${PYTHON_VERSION/\./}-coverage
echo "[Info] Running tox for ${TEST_ENVLIST}"

3 changes: 3 additions & 0 deletions src/snowflake/connector/__init__.py
@@ -16,6 +16,8 @@
import logging
from logging import NullHandler

from snowflake.connector.externals_utils.externals_setup import setup_external_libraries

from .connection import SnowflakeConnection
from .cursor import DictCursor
from .dbapi import (
@@ -48,6 +50,7 @@
from .version import VERSION

logging.getLogger(__name__).addHandler(NullHandler())
setup_external_libraries()


@wraps(SnowflakeConnection.__init__)
3 changes: 0 additions & 3 deletions src/snowflake/connector/aio/_azure_storage_client.py
@@ -15,7 +15,6 @@

import aiohttp

from ..azure_storage_client import AzureCredentialFilter
from ..azure_storage_client import (
SnowflakeAzureRestClient as SnowflakeAzureRestClientSync,
)
@@ -37,8 +36,6 @@

logger = getLogger(__name__)

getLogger("aiohttp").addFilter(AzureCredentialFilter())
Contributor:

Shouldn't we add aiohttp to MODULES_TO_MASK_LOGS_NAMES in externals_setup.py, as well as aioboto3? We do not control their logs and there might be some HTTP-related logs which can accidentally include signatures/keys.

Collaborator (author):

Fixed
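
For context, a minimal sketch of what masking logs from uncontrolled external libraries could look like. The module list follows the reviewer's suggestion above; the filter itself is a simplified assumption, not the connector's actual externals_setup.py implementation (which presumably builds on SecretDetector):

# Hypothetical sketch only -- not the real externals_setup.py.
import logging
import re

# External libraries whose loggers we do not control but which may log signed
# URLs or auth headers (per the review comment above).
MODULES_TO_MASK_LOGS_NAMES = ["aiohttp", "aioboto3", "botocore"]

# Deliberately rough pattern for query-string signatures; real masking is broader.
_SIGNATURE_RE = re.compile(r"((?:sig|Signature|X-Amz-Signature)=)[^&\s\"]+")


class MaskingFilter(logging.Filter):
    # Rewrites the record message in place; a fuller version would also mask
    # record.args and attach to the concrete emitting loggers, since logger-level
    # filters do not apply to records propagated from child loggers.
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = _SIGNATURE_RE.sub(r"\1****", str(record.msg))
        return True  # keep the record, only mask it


def setup_external_library_masking() -> None:
    for name in MODULES_TO_MASK_LOGS_NAMES:
        logging.getLogger(name).addFilter(MaskingFilter())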



class SnowflakeAzureRestClient(
SnowflakeStorageClientAsync, SnowflakeAzureRestClientSync
2 changes: 0 additions & 2 deletions src/snowflake/connector/aio/_connection.py
@@ -306,8 +306,6 @@ async def __open_connection(self):
backoff_generator=self._backoff_generator,
)
elif self._authenticator == PROGRAMMATIC_ACCESS_TOKEN:
if not self._token and self._password:
self._token = self._password
self.auth_class = AuthByPAT(self._token)
elif self._authenticator == USR_PWD_MFA_AUTHENTICATOR:
self._session_parameters[PARAMETER_CLIENT_REQUEST_MFA_TOKEN] = (
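
With the password-to-token fallback removed, a programmatic access token now has to be passed explicitly. A hedged usage sketch (account, user, and token values are placeholders; the authenticator string mirrors the PROGRAMMATIC_ACCESS_TOKEN constant used above):

import snowflake.connector

# Sketch only: the token is handed straight to AuthByPAT; it is no longer
# inferred from the password parameter.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    authenticator="PROGRAMMATIC_ACCESS_TOKEN",
    token="<programmatic-access-token>",
)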
2 changes: 1 addition & 1 deletion src/snowflake/connector/aio/_file_transfer_agent.py
@@ -195,7 +195,7 @@ def transfer_done_cb(
) -> None:
# Note: chunk_id is 0 based while num_of_chunks is count
logger.debug(
f"Chunk {chunk_id}/{done_client.num_of_chunks} of file {done_client.meta.name} reached callback"
f"Chunk(id: {chunk_id}) {chunk_id+1}/{done_client.num_of_chunks} of file {done_client.meta.name} reached callback"
)
if task.exception():
done_client.failed_transfers += 1
13 changes: 9 additions & 4 deletions src/snowflake/connector/aio/_network.py
@@ -71,7 +71,12 @@
)
from ..network import SessionPool as SessionPoolSync
from ..network import SnowflakeRestful as SnowflakeRestfulSync
from ..network import get_http_retryable_error, is_login_request, is_retryable_http_code
from ..network import (
SnowflakeRestfulJsonEncoder,
get_http_retryable_error,
is_login_request,
is_retryable_http_code,
)
from ..secret_detector import SecretDetector
from ..sqlstate import (
SQLSTATE_CONNECTION_NOT_EXISTS,
@@ -236,7 +241,7 @@ async def request(
return await self._post_request(
url,
headers,
json.dumps(body),
json.dumps(body, cls=SnowflakeRestfulJsonEncoder),
token=self.token,
_no_results=_no_results,
timeout=timeout,
@@ -298,7 +303,7 @@ async def _token_request(self, request_type):
ret = await self._post_request(
url,
headers,
json.dumps(body),
json.dumps(body, cls=SnowflakeRestfulJsonEncoder),
token=header_token,
)
if ret.get("success") and ret.get("data", {}).get("sessionToken"):
@@ -396,7 +401,7 @@ async def delete_session(self, retry: bool = False) -> None:
ret = await self._post_request(
url,
headers,
json.dumps(body),
json.dumps(body, cls=SnowflakeRestfulJsonEncoder),
token=self.token,
timeout=5,
no_retry=True,
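
Request bodies are now serialized with SnowflakeRestfulJsonEncoder (the UUID-encoding change cherry-picked as SNOW-2027116 above). A rough stand-in for what such an encoder does; the real class lives in snowflake/connector/network.py and may handle more types:

import json
import uuid


class UUIDAwareJsonEncoder(json.JSONEncoder):
    # Stock json.dumps raises TypeError on uuid.UUID values such as request IDs;
    # this encoder serializes them in their canonical string form.
    def default(self, o):
        if isinstance(o, uuid.UUID):
            return str(o)
        return super().default(o)


body = {"requestId": uuid.uuid4(), "sqlText": "select 1"}
print(json.dumps(body, cls=UUIDAwareJsonEncoder))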
3 changes: 3 additions & 0 deletions src/snowflake/connector/aio/_s3_storage_client.py
@@ -127,6 +127,9 @@ def generate_authenticated_url_and_args_v4() -> tuple[str, dict[str, bytes]]:
amzdate = t.strftime("%Y%m%dT%H%M%SZ")
short_amzdate = amzdate[:8]
x_amz_headers["x-amz-date"] = amzdate
x_amz_headers["x-amz-security-token"] = self.credentials.creds.get(
"AWS_TOKEN", ""
)

(
canonical_request,
7 changes: 7 additions & 0 deletions src/snowflake/connector/aio/_storage_client.py
@@ -193,6 +193,7 @@ async def _send_request_with_retry(
conn = self.meta.sfagent._cursor._connection

while self.retry_count[retry_id] < self.max_retry:
logger.debug(f"retry #{self.retry_count[retry_id]}")
cur_timestamp = self.credentials.timestamp
url, rest_kwargs = get_request_args()
# rest_kwargs["timeout"] = (REQUEST_CONNECTION_TIMEOUT, REQUEST_READ_TIMEOUT)
@@ -208,10 +209,14 @@
)

if await self._has_expired_presigned_url(response):
logger.debug(
"presigned url expired. trying to update presigned url."
)
await self._update_presigned_url()
else:
self.last_err_is_presigned_url = False
if response.status in self.TRANSIENT_HTTP_ERR:
logger.debug(f"transient error: {response.status}")
await asyncio.sleep(
min(
# TODO should SLEEP_UNIT come from the parent
@@ -222,7 +227,9 @@
)
self.retry_count[retry_id] += 1
elif await self._has_expired_token(response):
logger.debug("token is expired. trying to update token")
self.credentials.update(cur_timestamp)
self.retry_count[retry_id] += 1
else:
return response
except self.TRANSIENT_ERRORS as e:
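
The added debug lines mark the three recovery branches of the storage retry loop. A condensed, hypothetical outline of that control flow (attribute and helper names follow the diff; the request call and backoff are simplified placeholders):

import asyncio
import random


async def send_with_retry(client, get_request_args, retry_id, max_retry=5, sleep_unit=1):
    # Hypothetical condensation; not the real _send_request_with_retry.
    while client.retry_count[retry_id] < max_retry:
        cur_timestamp = client.credentials.timestamp
        url, rest_kwargs = get_request_args()
        response = await client._send_request(url, **rest_kwargs)  # placeholder call

        if await client._has_expired_presigned_url(response):
            # Branch 1: presigned URL expired -- fetch a fresh one and retry.
            await client._update_presigned_url()
        elif response.status in client.TRANSIENT_HTTP_ERR:
            # Branch 2: transient HTTP error -- bounded, jittered backoff, then retry.
            await asyncio.sleep(
                min(sleep_unit * 2 ** client.retry_count[retry_id], 16) * random.random()
            )
            client.retry_count[retry_id] += 1
        elif await client._has_expired_token(response):
            # Branch 3: temporary token expired -- refresh credentials, then retry.
            client.credentials.update(cur_timestamp)
            client.retry_count[retry_id] += 1
        else:
            return response
    raise RuntimeError("max retries exceeded")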
5 changes: 4 additions & 1 deletion src/snowflake/connector/aio/_wif_util.py
@@ -202,7 +202,10 @@ async def create_azure_attestation(
issuer, subject = extract_iss_and_sub_without_signature_verification(jwt_str)
if not issuer or not subject:
return None
if not issuer.startswith("https://sts.windows.net/"):
if not (
issuer.startswith("https://sts.windows.net/")
or issuer.startswith("https://login.microsoftonline.com/")
):
# This might happen if we're running on a different platform that responds to the same metadata request signature as Azure.
logger.debug("Unexpected Azure token issuer '%s'", issuer)
return None
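
As a small worked example, the two accepted prefixes correspond to the v1 (sts.windows.net) and v2 (login.microsoftonline.com) Entra ID issuer formats; the tenant IDs below are placeholders:

ACCEPTED_AZURE_ISSUER_PREFIXES = (
    "https://sts.windows.net/",            # v1 issuer format
    "https://login.microsoftonline.com/",  # v2 issuer format
)


def is_azure_issuer(issuer: str) -> bool:
    # Mirrors the prefix check added above; illustration only.
    return issuer.startswith(ACCEPTED_AZURE_ISSUER_PREFIXES)


assert is_azure_issuer("https://sts.windows.net/00000000-0000-0000-0000-000000000000/")
assert is_azure_issuer("https://login.microsoftonline.com/00000000-0000-0000-0000-000000000000/v2.0")
assert not is_azure_issuer("https://accounts.google.com")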
18 changes: 1 addition & 17 deletions src/snowflake/connector/azure_storage_client.py
@@ -9,7 +9,7 @@
import os
import xml.etree.ElementTree as ET
from datetime import datetime, timezone
from logging import Filter, getLogger
from logging import getLogger
from random import choice
from string import hexdigits
from typing import TYPE_CHECKING, Any, NamedTuple
@@ -41,22 +41,6 @@ class AzureLocation(NamedTuple):
MATDESC = "x-ms-meta-matdesc"


class AzureCredentialFilter(Filter):
LEAKY_FMT = '%s://%s:%s "%s %s %s" %s %s'

def filter(self, record):
if record.msg == AzureCredentialFilter.LEAKY_FMT and len(record.args) == 8:
record.args = (
record.args[:4] + (record.args[4].split("?")[0],) + record.args[5:]
)
return True


getLogger("snowflake.connector.vendored.urllib3.connectionpool").addFilter(
AzureCredentialFilter()
)


class SnowflakeAzureRestClient(SnowflakeStorageClient):
def __init__(
self,
16 changes: 14 additions & 2 deletions src/snowflake/connector/connection.py
@@ -120,6 +120,7 @@

DEFAULT_CLIENT_PREFETCH_THREADS = 4
MAX_CLIENT_PREFETCH_THREADS = 10
MAX_CLIENT_FETCH_THREADS = 1024
DEFAULT_BACKOFF_POLICY = exponential_backoff()


@@ -222,6 +223,7 @@ def _get_private_bytes_from_file(
(type(None), int),
), # snowflake
"client_prefetch_threads": (4, int), # snowflake
"client_fetch_threads": (None, (type(None), int)),
"numpy": (False, bool), # snowflake
"ocsp_response_cache_filename": (None, (type(None), str)), # snowflake internal
"converter_class": (DefaultConverterClass(), SnowflakeConverter),
@@ -380,6 +382,7 @@ class SnowflakeConnection:
See the backoff_policies module for details and implementation examples.
client_session_keep_alive_heartbeat_frequency: Heartbeat frequency to keep connection alive in seconds.
client_prefetch_threads: Number of threads to download the result set.
client_fetch_threads: Number of threads to fetch staged query results.
rest: Snowflake REST API object. Internal use only. Maybe removed in a later release.
application: Application name to communicate with Snowflake as. By default, this is "PythonConnector".
errorhandler: Handler used with errors. By default, an exception will be raised on error.
@@ -639,6 +642,16 @@ def client_prefetch_threads(self, value) -> None:
self._client_prefetch_threads = value
self._validate_client_prefetch_threads()

@property
def client_fetch_threads(self) -> int | None:
return self._client_fetch_threads

@client_fetch_threads.setter
def client_fetch_threads(self, value: None | int) -> None:
if value is not None:
value = min(max(1, value), MAX_CLIENT_FETCH_THREADS)
self._client_fetch_threads = value

@property
def rest(self) -> SnowflakeRestful | None:
return self._rest
@@ -1161,8 +1174,6 @@ def __open_connection(self):
backoff_generator=self._backoff_generator,
)
elif self._authenticator == PROGRAMMATIC_ACCESS_TOKEN:
if not self._token and self._password:
self._token = self._password
self.auth_class = AuthByPAT(self._token)
elif self._authenticator == WORKLOAD_IDENTITY_AUTHENTICATOR:
if ENV_VAR_EXPERIMENTAL_AUTHENTICATION not in os.environ:
@@ -1325,6 +1336,7 @@ def __config(self, **kwargs):
OAUTH_AUTHENTICATOR,
NO_AUTH_AUTHENTICATOR,
WORKLOAD_IDENTITY_AUTHENTICATOR,
PROGRAMMATIC_ACCESS_TOKEN,
}

if not (self._master_token and self._session_token):
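
A hedged usage sketch of the new connection parameter: when set, client_fetch_threads (clamped to 1..MAX_CLIENT_FETCH_THREADS) controls how many threads download staged result chunks; when left unset, the cursor falls back to client_prefetch_threads, as shown in the cursor.py change below. Credentials are placeholders:

import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="***",
    client_fetch_threads=16,  # new parameter; None falls back to client_prefetch_threads
)
with conn.cursor() as cur:
    cur.execute("select seq4() from table(generator(rowcount => 1000000))")
    rows = cur.fetchall()  # staged result chunks are fetched with up to 16 threads
conn.close()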
7 changes: 4 additions & 3 deletions src/snowflake/connector/cursor.py
@@ -888,8 +888,8 @@ def execute(
_exec_async: Whether to execute this query asynchronously.
_no_retry: Whether or not to retry on known errors.
_do_reset: Whether or not the result set needs to be reset before executing query.
_put_callback: Function to which GET command should call back to.
_put_azure_callback: Function to which an Azure GET command should call back to.
_put_callback: Function to which PUT command should call back to.
_put_azure_callback: Function to which an Azure PUT command should call back to.
_put_callback_output_stream: The output stream a PUT command's callback should report on.
_get_callback: Function to which GET command should call back to.
_get_azure_callback: Function to which an Azure GET command should call back to.
@@ -1186,7 +1186,8 @@ def _init_result_and_meta(self, data: dict[Any, Any]) -> None:
self._result_set = ResultSet(
self,
result_chunks,
self._connection.client_prefetch_threads,
self._connection.client_fetch_threads
or self._connection.client_prefetch_threads,
)
self._rownumber = -1
self._result_state = ResultState.VALID
Empty file.