Aio connector fix workflows #2475

Open · wants to merge 3 commits into base: dev/aio-connector
39 changes: 37 additions & 2 deletions .github/workflows/build_test.yml
@@ -159,6 +159,13 @@ jobs:
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
          .github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
      - name: Setup private key file
        shell: bash
        env:
          PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
          .github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
      - name: Download wheel(s)
        uses: actions/download-artifact@v4
        with:
@@ -173,8 +180,8 @@ jobs:
        run: python -m pip install tox>=4
      - name: Run tests
        # To run a single test on GHA use the below command:
        # run: python -m tox run -e `echo py${PYTHON_VERSION/\./}-single-ci | sed 's/ /,/g'`
        run: python -m tox run -e `echo py${PYTHON_VERSION/\./}-{extras,unit,integ,pandas,sso}-ci | sed 's/ /,/g'`
        # run: python -m tox run -e `echo py${PYTHON_VERSION/\./}-single-ci | sed 's/ /,/g'`
        run: python -m tox run -e `echo py${PYTHON_VERSION/\./}-{extras,unit-parallel,integ-parallel,pandas-parallel,sso}-ci | sed 's/ /,/g'`

        env:
          PYTHON_VERSION: ${{ matrix.python-version }}
@@ -224,6 +231,13 @@ jobs:
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
          .github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
      - name: Setup private key file
        shell: bash
        env:
          PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
          .github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
      - name: Upgrade setuptools, pip and wheel
        run: python -m pip install -U setuptools pip wheel
      - name: Install tox
@@ -285,6 +299,13 @@ jobs:
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
          .github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
      - name: Setup private key file
        shell: bash
        env:
          PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
          .github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
      - name: Download wheel(s)
        uses: actions/download-artifact@v4
        with:
@@ -332,6 +353,13 @@ jobs:
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
          .github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
      - name: Setup private key file
        shell: bash
        env:
          PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
          .github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
      - name: Download wheel(s)
        uses: actions/download-artifact@v4
        with:
@@ -396,6 +424,13 @@ jobs:
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PARAMETERS_SECRET" \
          .github/workflows/parameters/public/parameters_${{ matrix.cloud-provider }}.py.gpg > test/parameters.py
      - name: Setup private key file
        shell: bash
        env:
          PYTHON_PRIVATE_KEY_SECRET: ${{ secrets.PYTHON_PRIVATE_KEY_SECRET }}
        run: |
          gpg --quiet --batch --yes --decrypt --passphrase="$PYTHON_PRIVATE_KEY_SECRET" \
          .github/workflows/parameters/public/rsa_keys/rsa_key_python_${{ matrix.cloud-provider }}.p8.gpg > test/rsa_key_python_${{ matrix.cloud-provider }}.p8
      - name: Download wheel(s)
        uses: actions/download-artifact@v4
        with:
Binary file modified .github/workflows/parameters/public/parameters_aws.py.gpg
Binary file not shown.
Binary file modified .github/workflows/parameters/public/parameters_azure.py.gpg
Binary file not shown.
Binary file modified .github/workflows/parameters/public/parameters_gcp.py.gpg
Binary file not shown.
Binary file not shown.
Binary file not shown.
Binary file not shown.
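The workflow steps above decrypt a per-cloud rsa_key_python_<cloud-provider>.p8 key into test/, which lets the test jobs authenticate with key-pair (JWT) auth instead of a password. As a rough sketch that is not part of this PR, and assuming the decrypted .p8 file is an unencrypted, PEM-encoded PKCS#8 key, it could be converted into the DER bytes that the connector's private_key parameter accepts; the path and helper name below are illustrative.

# Sketch only: convert a decrypted PKCS#8 key (.p8, PEM, unencrypted) into DER bytes,
# the form the connector's private_key parameter accepts.
# "test/rsa_key_python_aws.p8" is an illustrative path, not taken from this PR.
from cryptography.hazmat.primitives import serialization


def load_private_key_der(path: str = "test/rsa_key_python_aws.p8") -> bytes:
    with open(path, "rb") as key_file:
        private_key = serialization.load_pem_private_key(key_file.read(), password=None)
    return private_key.private_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.NoEncryption(),
    )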
8 changes: 7 additions & 1 deletion ci/test_fips.sh
@@ -14,6 +14,10 @@ curl https://repo1.maven.org/maven2/org/wiremock/wiremock-standalone/3.11.0/wire
python3 -m venv fips_env
source fips_env/bin/activate
pip install -U setuptools pip

# Install pytest-xdist for parallel execution
pip install pytest-xdist

pip install "${CONNECTOR_WHL}[pandas,secure-local-storage,development]"

echo "!!! Environment description !!!"
@@ -24,6 +28,8 @@ python -c "from cryptography.hazmat.backends.openssl import backend;print('Cryp
pip freeze

cd $CONNECTOR_DIR
pytest -vvv --cov=snowflake.connector --cov-report=xml:coverage.xml test --ignore=test/integ/aio --ignore=test/unit/aio

# Run tests in parallel using pytest-xdist
pytest -n auto -vvv --cov=snowflake.connector --cov-report=xml:coverage.xml test --ignore=test/integ/aio --ignore=test/unit/aio

deactivate
1 change: 1 addition & 0 deletions ci/test_fips_docker.sh
@@ -31,6 +31,7 @@ docker run --network=host \
-e cloud_provider \
-e PYTEST_ADDOPTS \
-e GITHUB_ACTIONS \
-e JENKINS_HOME=${JENKINS_HOME:-false} \
--mount type=bind,source="${CONNECTOR_DIR}",target=/home/user/snowflake-connector-python \
${CONTAINER_NAME}:1.0 \
/home/user/snowflake-connector-python/ci/test_fips.sh $1
2 changes: 1 addition & 1 deletion ci/test_linux.sh
@@ -40,7 +40,7 @@ else
echo "[Info] Testing with ${PYTHON_VERSION}"
SHORT_VERSION=$(python3.10 -c "print('${PYTHON_VERSION}'.replace('.', ''))")
CONNECTOR_WHL=$(ls $CONNECTOR_DIR/dist/snowflake_connector_python*cp${SHORT_VERSION}*manylinux2014*.whl | sort -r | head -n 1)
TEST_LIST=`echo py${PYTHON_VERSION/\./}-{unit,integ,pandas,sso}-ci | sed 's/ /,/g'`
TEST_LIST=`echo py${PYTHON_VERSION/\./}-{unit-parallel,integ,pandas-parallel,sso}-ci | sed 's/ /,/g'`
TEST_ENVLIST=fix_lint,$TEST_LIST,py${PYTHON_VERSION/\./}-coverage
echo "[Info] Running tox for ${TEST_ENVLIST}"

4 changes: 2 additions & 2 deletions src/snowflake/connector/ocsp_snowflake.py
@@ -576,7 +576,7 @@ def _download_ocsp_response_cache(ocsp, url, do_retry: bool = True) -> bool:
response.status_code,
sleep_time,
)
time.sleep(sleep_time)
time.sleep(sleep_time)
else:
logger.error(
"Failed to get OCSP response after %s attempt.", max_retry
@@ -1649,7 +1649,7 @@ def _fetch_ocsp_response(
response.status_code,
sleep_time,
)
time.sleep(sleep_time)
time.sleep(sleep_time)
except Exception as ex:
if max_retry > 1:
sleep_time = next(backoff)
15 changes: 15 additions & 0 deletions test/conftest.py
@@ -146,3 +146,18 @@ def pytest_runtest_setup(item) -> None:
        pytest.skip("cannot run this test on public Snowflake deployment")
    elif INTERNAL_SKIP_TAGS.intersection(test_tags) and not running_on_public_ci():
        pytest.skip("cannot run this test on private Snowflake deployment")

    if "auth" in test_tags:
        if os.getenv("RUN_AUTH_TESTS") != "true":
            pytest.skip("Skipping auth test in current environment")


def get_server_parameter_value(connection, parameter_name: str) -> str | None:
    """Get a server parameter value; returns None if the parameter doesn't exist."""
    try:
        with connection.cursor() as cur:
            cur.execute(f"show parameters like '{parameter_name}'")
            ret = cur.fetchone()
            return ret[1] if ret else None
    except Exception:
        return None
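A likely use of the new get_server_parameter_value helper is gating a test on a server-side parameter. A minimal sketch, assuming a conn_cnx-style connection fixture and an illustrative parameter name; the import path is an assumption, not part of this PR.

# Sketch only: skip a test when an illustrative server parameter is missing or disabled.
# The import path, fixture name, and parameter name are assumptions, not part of this PR.
import pytest

from test.conftest import get_server_parameter_value


def test_feature_behind_parameter(conn_cnx):
    with conn_cnx() as conn:
        value = get_server_parameter_value(conn, "ENABLE_SOME_FEATURE")
        if value is None or value.lower() != "true":
            pytest.skip("ENABLE_SOME_FEATURE is not enabled on this deployment")
        # ...exercise the feature-dependent behavior here...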
29 changes: 28 additions & 1 deletion test/helpers.py
@@ -198,7 +198,34 @@ def _arrow_error_stream_chunk_remove_single_byte_test(use_table_iterator):
    decode_bytes = base64.b64decode(b64data)
    exception_result = []
    result_array = []
    for i in range(len(decode_bytes)):

    # Test strategic positions instead of every byte for performance
    # Test header (first 50), middle section, end (last 50), and some random positions
    data_len = len(decode_bytes)
    test_positions = set()

    # Critical positions: beginning (headers/metadata)
    test_positions.update(range(min(50, data_len)))

    # Middle section positions
    mid_start = data_len // 2 - 25
    mid_end = data_len // 2 + 25
    test_positions.update(range(max(0, mid_start), min(data_len, mid_end)))

    # End positions
    test_positions.update(range(max(0, data_len - 50), data_len))

    # Some random positions throughout the data (for broader coverage)
    import random

    random.seed(42)  # Deterministic for reproducible tests
    random_positions = random.sample(range(data_len), min(50, data_len))
    test_positions.update(random_positions)

    # Convert to sorted list for consistent execution
    test_positions = sorted(test_positions)

    for i in test_positions:
        try:
            # removing the i-th char in the bytes
            iterator = create_nanoarrow_pyarrow_iterator(
91 changes: 70 additions & 21 deletions test/integ/aio/conftest.py
@@ -2,9 +2,14 @@
# Copyright (c) 2012-2023 Snowflake Computing Inc. All rights reserved.
#

import os
from contextlib import asynccontextmanager
from test.integ.conftest import get_db_parameters, is_public_testaccount
from typing import AsyncContextManager, Callable, Generator
from test.integ.conftest import (
    _get_private_key_bytes_for_olddriver,
    get_db_parameters,
    is_public_testaccount,
)
from typing import AsyncContextManager, AsyncGenerator, Callable

import pytest

@@ -44,7 +49,7 @@ async def patch_connection(
        self,
        con: SnowflakeConnection,
        propagate: bool = True,
    ) -> Generator[TelemetryCaptureHandlerAsync, None, None]:
    ) -> AsyncGenerator[TelemetryCaptureHandlerAsync, None]:
        original_telemetry = con._telemetry
        new_telemetry = TelemetryCaptureHandlerAsync(
            original_telemetry,
@@ -57,6 +62,9 @@ async def patch_connection(
            con._telemetry = original_telemetry


RUNNING_OLD_DRIVER = os.getenv("TOX_ENV_NAME") == "olddriver"


@pytest.fixture(scope="session")
def capture_sf_telemetry_async() -> TelemetryCaptureFixtureAsync:
    return TelemetryCaptureFixtureAsync()
@@ -71,6 +79,22 @@ async def create_connection(connection_name: str, **kwargs) -> SnowflakeConnection:
    """
    ret = get_db_parameters(connection_name)
    ret.update(kwargs)

    # Handle private key authentication for old driver if applicable
    if RUNNING_OLD_DRIVER and "private_key_file" in ret and "private_key" not in ret:
        private_key_file = ret.get("private_key_file")
        if private_key_file:
            private_key_bytes = _get_private_key_bytes_for_olddriver(private_key_file)
            ret["authenticator"] = "SNOWFLAKE_JWT"
            ret["private_key"] = private_key_bytes
            ret.pop("private_key_file", None)

    # If authenticator is explicitly provided and it's not key-pair based, drop key-pair fields
    authenticator_value = ret.get("authenticator")
    if authenticator_value and authenticator_value.lower() not in {
        "key_pair_authenticator",
        "snowflake_jwt",
    }:
        ret.pop("private_key", None)
        ret.pop("private_key_file", None)

    connection = SnowflakeConnection(**ret)
    await connection.connect()
    return connection
@@ -80,7 +104,7 @@ async def create_connection(connection_name: str, **kwargs) -> SnowflakeConnection:
async def db(
    connection_name: str = "default",
    **kwargs,
) -> Generator[SnowflakeConnection, None, None]:
) -> AsyncGenerator[SnowflakeConnection, None]:
    if not kwargs.get("timezone"):
        kwargs["timezone"] = "UTC"
    if not kwargs.get("converter_class"):
@@ -96,7 +120,7 @@ async def db(
async def negative_db(
    connection_name: str = "default",
    **kwargs,
) -> Generator[SnowflakeConnection, None, None]:
) -> AsyncGenerator[SnowflakeConnection, None]:
    if not kwargs.get("timezone"):
        kwargs["timezone"] = "UTC"
    if not kwargs.get("converter_class"):
@@ -116,7 +140,7 @@ def conn_cnx():


@pytest.fixture()
async def conn_testaccount() -> SnowflakeConnection:
async def conn_testaccount() -> AsyncGenerator[SnowflakeConnection, None]:
    connection = await create_connection("default")
    yield connection
    await connection.close()
@@ -129,18 +153,43 @@ def negative_conn_cnx() -> Callable[..., AsyncContextManager[SnowflakeConnection]]:


@pytest.fixture()
async def aio_connection(db_parameters):
    cnx = SnowflakeConnection(
        user=db_parameters["user"],
        password=db_parameters["password"],
        host=db_parameters["host"],
        port=db_parameters["port"],
        account=db_parameters["account"],
        database=db_parameters["database"],
        schema=db_parameters["schema"],
        warehouse=db_parameters["warehouse"],
        protocol=db_parameters["protocol"],
        timezone="UTC",
    )
    yield cnx
    await cnx.close()
async def aio_connection(db_parameters) -> AsyncGenerator[SnowflakeConnection, None]:
    # Build connection params supporting both password and key-pair auth depending on environment
    connection_params = {
        "user": db_parameters["user"],
        "host": db_parameters["host"],
        "port": db_parameters["port"],
        "account": db_parameters["account"],
        "database": db_parameters["database"],
        "schema": db_parameters["schema"],
        "protocol": db_parameters["protocol"],
        "timezone": "UTC",
    }

    # Optional fields
    warehouse = db_parameters.get("warehouse")
    if warehouse is not None:
        connection_params["warehouse"] = warehouse

    role = db_parameters.get("role")
    if role is not None:
        connection_params["role"] = role

    if "password" in db_parameters and db_parameters["password"]:
        connection_params["password"] = db_parameters["password"]
    elif "private_key_file" in db_parameters:
        # Use key-pair authentication
        connection_params["authenticator"] = "SNOWFLAKE_JWT"
        if RUNNING_OLD_DRIVER:
            private_key_bytes = _get_private_key_bytes_for_olddriver(
                db_parameters["private_key_file"]
            )
            connection_params["private_key"] = private_key_bytes
        else:
            connection_params["private_key_file"] = db_parameters["private_key_file"]

    cnx = SnowflakeConnection(**connection_params)
    try:
        yield cnx
    finally:
        await cnx.close()
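For reference, a minimal sketch of how an async test might consume the reworked aio_connection fixture, assuming pytest-asyncio is in use and that the aio connection mirrors the sync API (connect(), cursor(), execute(), fetchone()); the query is illustrative.

# Sketch only: consuming the aio_connection fixture in an async test.
# Assumes pytest-asyncio and that the fixture yields an unconnected connection.
import pytest


@pytest.mark.asyncio
async def test_connects_with_password_or_key_pair(aio_connection):
    await aio_connection.connect()
    cur = aio_connection.cursor()
    await cur.execute("select 1")
    row = await cur.fetchone()
    assert row[0] == 1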