Merged
Changes from 8 commits
11 changes: 11 additions & 0 deletions .github/instructions/general.instructions.md
@@ -0,0 +1,11 @@
---
applyTo: '**'
---
Provide project context and coding guidelines that AI should follow when generating code, answering questions, or reviewing changes.

## General Guidelines

1. **Test-Driven Development**: Write unit tests for all new functions and features. Use `pytest` for Python and appropriate testing frameworks for Node.js.
2. **Environment Variables**: Use [Environment Variables Guide](../../docs/env-vars.md) for configuration. Avoid hardcoding sensitive information.
3. **Documentation**: Prefer self-explanatory code; add documentation only if explicitly requested by the developer. Be concise.
4. **Code Reviews**: Participate in code reviews and provide constructive feedback.
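
For context, a minimal pytest unit test of the kind guideline 1 asks for (the module and function names are illustrative, not part of this PR):

```python
# test_accumulate.py -- illustrative only, not part of this PR
def accumulate(values: list[int]) -> int:
    """Toy function under test."""
    return sum(values)


def test_accumulate_returns_sum() -> None:
    assert accumulate([1, 2, 3]) == 6


def test_accumulate_empty_list_is_zero() -> None:
    assert accumulate([]) == 0
```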
10 changes: 10 additions & 0 deletions .github/instructions/general.md.instructions.md
@@ -0,0 +1,10 @@
---
applyTo: '**'
---
Provide project context and coding guidelines that AI should follow when generating code, answering questions, or reviewing changes.

## General Guidelines

1. **Test-Driven Development**: Write unit tests for all new functions and features. Use `pytest` for Python and appropriate testing frameworks for Node.js.
2. **Environment Variables**: Use [Environment Variables Guide](../docs/env-vars.md) for configuration. Avoid hardcoding sensitive information.
3. **Documentation**: Prefer self-explanatory code; add documentation only if explicitly requested by the developer.
11 changes: 11 additions & 0 deletions .github/instructions/node.instructions.md
@@ -0,0 +1,11 @@
---
applyTo: '**/*.js'
---
Provide project context and coding guidelines that AI should follow when generating code, answering questions, or reviewing changes.

## 🛠️Coding Instructions for Node.js in This Repository

* Use ES6+ syntax and features.
* Follow the `package.json` configuration for dependencies and scripts.
* Use `eslint` for linting and `prettier` for code formatting.
* Write modular and reusable code, adhering to the project's structure.
@@ -1,14 +1,7 @@
# GitHub Copilot Instructions

This document provides guidelines and best practices for using GitHub Copilot in the `osparc-simcore` repository and other Python and Node.js projects.

## General Guidelines

1. **Test-Driven Development**: Write unit tests for all new functions and features. Use `pytest` for Python and appropriate testing frameworks for Node.js.
2. **Environment Variables**: Use [Environment Variables Guide](../docs/env-vars.md) for configuration. Avoid hardcoding sensitive information.
3. **Documentation**: Prefer self-explanatory code; add documentation only if explicitly requested by the developer.

---
applyTo: '**/*.py'
---
Provide project context and coding guidelines that AI should follow when generating code, answering questions, or reviewing changes.

## 🛠️Coding Instructions for Python in This Repository

@@ -25,8 +18,8 @@ Follow these rules **strictly** when generating Python code:

### 3. **Code Style & Formatting**

* Follow [Python Coding Conventions](../docs/coding-conventions.md) **strictly**.
* Format code with `black`.
* Follow [Python Coding Conventions](../../docs/coding-conventions.md) **strictly**.
* Format code with `black` and `ruff`.
* Lint code with `ruff` and `pylint`.

### 4. **Library Compatibility**
@@ -51,11 +44,6 @@ Ensure compatibility with the following library versions:
* Prefer `json_dumps` / `json_loads` from `common_library.json_serialization` instead of the built-in `json.dumps` / `json.loads`.
* When using Pydantic models, prefer methods like `model.model_dump_json()` for serialization.

---

## 🛠️Coding Instructions for Node.js in This Repository

* Use ES6+ syntax and features.
* Follow the `package.json` configuration for dependencies and scripts.
* Use `eslint` for linting and `prettier` for code formatting.
* Write modular and reusable code, adhering to the project's structure.
### 7. **Running tests**
* Use `--keep-docker-up` flag when testing to keep docker containers up between sessions.
* Always activate the python virtual environment before running pytest.
39 changes: 22 additions & 17 deletions packages/common-library/src/common_library/async_tools.py
@@ -89,32 +89,37 @@ async def cancel_wait_task(
TimeoutError: raised if cannot cancel the task.
CancelledError: raised ONLY if owner is being cancelled.
"""
if task.done():
# nothing to do here
return

cancelling = task.cancel()
if not cancelling:
return # task was already cancelled

assert task.cancelling() # nosec
assert not task.cancelled() # nosec

# mark for cancellation
task.cancel("cancel_wait_task was called to cancel this task")
try:

_logger.debug("Cancelling task %s", task.get_name())
await asyncio.shield(
# NOTE shield ensures that cancellation of the caller function won't stop you
# from observing the cancellation/finalization of task.
asyncio.wait_for(task, timeout=max_delay)
)

except asyncio.CancelledError:
if not task.cancelled():
# task owner function is being cancelled -> propagate cancellation
raise

# else: task cancellation is complete, we can safely ignore it
_logger.debug(
"Task %s cancellation is complete",
except TimeoutError:
_logger.exception(
"Timeout while cancelling task %s after %s seconds",
task.get_name(),
max_delay,
)
raise
except asyncio.CancelledError:
current_task = asyncio.current_task()
assert current_task is not None # nosec
if current_task.cancelling() > 0:
# owner function is being cancelled -> propagate cancellation
raise
finally:
if not task.done():
_logger.error("Failed to cancel %s", task.get_name())
else:
_logger.debug("Task %s cancelled", task.get_name())


def delayed_start(
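
For readers of this diff, a minimal usage sketch of the reworked helper, assuming the signature shown above (`cancel_wait_task(task, max_delay=...)` from `common_library.async_tools`):

```python
import asyncio

from common_library.async_tools import cancel_wait_task


async def _worker() -> None:
    # illustrative long-running job
    await asyncio.sleep(3600)


async def main() -> None:
    task = asyncio.create_task(_worker(), name="worker")
    await asyncio.sleep(0)  # let the worker actually start

    # request cancellation and wait up to 5 seconds for the task to finish;
    # per the docstring above, TimeoutError is raised if it cannot be cancelled
    await cancel_wait_task(task, max_delay=5)
    assert task.done()


asyncio.run(main())
```

The `asyncio.shield` inside the helper keeps the caller's own cancellation from cutting short the observation of the task's finalization, which is why callers can await it directly in `finally` blocks.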
7 changes: 3 additions & 4 deletions packages/service-library/src/servicelib/async_utils.py
Expand Up @@ -2,11 +2,12 @@
import logging
from collections import deque
from collections.abc import Awaitable, Callable
from contextlib import suppress
from dataclasses import dataclass
from functools import wraps
from typing import TYPE_CHECKING, Any, ParamSpec, TypeVar

from common_library.async_tools import cancel_wait_task

from . import tracing
from .utils_profiling_middleware import dont_profile, is_profiling, profile_context

@@ -54,9 +55,7 @@ async def _safe_cancel(context: Context) -> None:
try:
await context.in_queue.put(None)
if context.task is not None:
context.task.cancel()
with suppress(asyncio.CancelledError):
await context.task
await cancel_wait_task(context.task, max_delay=None)
except RuntimeError as e:
if "Event loop is closed" in f"{e}":
_logger.warning("event loop is closed and could not cancel %s", context)
@@ -57,6 +57,11 @@ async def handle_run(
result_to_return = TaskResultSuccess(value=task_result)
except asyncio.CancelledError:
result_to_return = TaskResultCancelledError()
current_task = asyncio.current_task()
assert current_task is not None # nosec
if current_task.cancelling() > 0:
# owner function is being cancelled -> propagate cancellation
raise
except Exception as e: # pylint:disable=broad-exception-caught
result_to_return = TaskResultError(
error=_format_exception(e),
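
Several handlers in this PR add the same guard. A standalone sketch of the idea, assuming Python 3.11+ where `Task.cancelling()` is available (the function name is illustrative): a `CancelledError` raised while awaiting an inner task is only propagated when the current coroutine itself has a pending cancellation request.

```python
import asyncio


async def supervise(job: asyncio.Task) -> None:
    try:
        await job
    except asyncio.CancelledError:
        current_task = asyncio.current_task()
        assert current_task is not None
        if current_task.cancelling() > 0:
            # supervise() itself is being cancelled -> propagate
            raise
        # only the inner job was cancelled -> record it and carry on
```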
@@ -4,6 +4,7 @@
from functools import wraps
from typing import Any, Protocol

from common_library.async_tools import cancel_wait_task
from fastapi import Request, status
from fastapi.exceptions import HTTPException

@@ -13,8 +14,7 @@
class _HandlerWithRequestArg(Protocol):
__name__: str

async def __call__(self, request: Request, *args: Any, **kwargs: Any) -> Any:
...
async def __call__(self, request: Request, *args: Any, **kwargs: Any) -> Any: ...


def _validate_signature(handler: _HandlerWithRequestArg):
@@ -75,13 +75,8 @@ async def wrapper(request: Request, *args, **kwargs):

# One has completed, cancel the other
for t in pending:
t.cancel()

try:
await asyncio.wait_for(t, timeout=3)

except asyncio.CancelledError:
pass
await cancel_wait_task(t, max_delay=3)
except Exception: # pylint: disable=broad-except
if t is handler_task:
raise
@@ -335,6 +335,11 @@ async def _tasks_monitor(self) -> None:
result_field = ResultField(
str_error=dumps(TaskCancelledError(task_id=task_id))
)
current_task = asyncio.current_task()
assert current_task is not None # nosec
if current_task.cancelling() > 0:
# owner function is being cancelled -> propagate cancellation
raise
except Exception as e: # pylint:disable=broad-except
allowed_errors = TaskRegistry.get_allowed_errors(
task_data.registered_task_name
@@ -8,6 +8,7 @@

import arrow
import redis.exceptions
from common_library.async_tools import cancel_wait_task
from redis.asyncio.lock import Lock

from ..background_task import periodic
@@ -116,10 +117,12 @@ async def _wrapper(*args: P.args, **kwargs: P.kwargs) -> R:
module_name=coro.__module__, func_name=coro.__name__
),
)

res = await work_task
auto_extend_lock_task.cancel()
return res
# cancel the auto-extend task (work is done)
# NOTE: if we do not explicitly await the task inside the context manager
# it sometimes hangs forever (Python issue?)
await cancel_wait_task(auto_extend_lock_task, max_delay=None)
return res

except BaseExceptionGroup as eg:
# Separate exceptions into LockLostError and others
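
A simplified, hypothetical sketch of the pattern changed in the redis lock wrapper above: the auto-extend task is now awaited via `cancel_wait_task` before the surrounding lock context exits, rather than cancelled fire-and-forget (all names here are illustrative):

```python
import asyncio
from collections.abc import Awaitable

from common_library.async_tools import cancel_wait_task


async def _auto_extend_forever() -> None:
    # illustrative stand-in for the periodic lock auto-extension task
    while True:
        await asyncio.sleep(1)


async def run_with_auto_extend(work: Awaitable[None]) -> None:
    extender = asyncio.create_task(_auto_extend_forever(), name="auto-extend")
    try:
        await work
    finally:
        # explicitly await the cancellation so the extender is fully finalized
        # before the surrounding lock would be released; a bare cancel() can
        # leave it dangling
        await cancel_wait_task(extender, max_delay=None)
```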
8 changes: 6 additions & 2 deletions packages/service-library/src/servicelib/redis/_utils.py
@@ -1,6 +1,6 @@
import logging
from collections.abc import Awaitable
from typing import Any
from typing import ParamSpec, TypeVar

import redis.exceptions
from redis.asyncio.lock import Lock
@@ -28,7 +28,11 @@ async def auto_extend_lock(lock: Lock) -> None:
raise LockLostError(lock=lock) from exc


async def handle_redis_returns_union_types(result: Any | Awaitable[Any]) -> Any:
P = ParamSpec("P")
R = TypeVar("R")


async def handle_redis_returns_union_types(result: R | Awaitable[R]) -> R:
"""Used to handle mypy issues with redis 5.x return types"""
if isinstance(result, Awaitable):
return await result
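
With the helper now generic over `R`, call sites should keep their concrete return type instead of degrading to `Any`. A hedged illustration (the client and the `hgetall` call are assumptions, not part of this PR):

```python
from redis.asyncio import Redis

from servicelib.redis._utils import handle_redis_returns_union_types


async def read_hash(client: Redis, key: str) -> dict:
    # redis-py's type stubs annotate hgetall() as `dict | Awaitable[dict]`
    # depending on the client flavour; with the generic signature mypy now
    # infers `dict` here instead of `Any`
    return await handle_redis_returns_union_types(client.hgetall(key))
```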
10 changes: 4 additions & 6 deletions packages/service-library/src/servicelib/utils.py
@@ -1,4 +1,4 @@
""" General utils
"""General utils

IMPORTANT: lowest level module
In order to avoid cyclic dependencies, please
@@ -245,7 +245,7 @@ async def limited_as_completed(
future.set_name(f"{tasks_group_prefix}-{future.get_name()}")
pending_futures.add(future)

except (StopIteration, StopAsyncIteration): # noqa: PERF203
except (StopIteration, StopAsyncIteration):
completed_all_awaitables = True
if not pending_futures:
return
@@ -294,8 +294,7 @@ async def limited_gather(
log: logging.Logger = _DEFAULT_LOGGER,
limit: int = _DEFAULT_LIMITED_CONCURRENCY,
tasks_group_prefix: str | None = None,
) -> list[T]:
...
) -> list[T]: ...


@overload
@@ -305,8 +304,7 @@ async def limited_gather(
log: logging.Logger = _DEFAULT_LOGGER,
limit: int = _DEFAULT_LIMITED_CONCURRENCY,
tasks_group_prefix: str | None = None,
) -> list[T | BaseException]:
...
) -> list[T | BaseException]: ...


async def limited_gather(
2 changes: 1 addition & 1 deletion packages/service-library/tests/redis/test_decorators.py
@@ -267,7 +267,7 @@ async def race_condition_increase(self, by: int) -> None:
self.value = current_value

counter = RaceConditionCounter()
# ensures it does nto time out before acquiring the lock
# ensures it does not time out before acquiring the lock
time_for_all_inc_counter_calls_to_finish = (
with_short_default_redis_lock_ttl * INCREASE_OPERATIONS * 10
)
@@ -203,7 +203,7 @@ async def _sync_services_task(app: FastAPI) -> None:

await asyncio.sleep(app.state.settings.CATALOG_BACKGROUND_TASK_REST_TIME)

except asyncio.CancelledError: # noqa: PERF203
except asyncio.CancelledError:
# task is stopped
_logger.info("registry syncing task cancelled")
raise
@@ -1,6 +1,5 @@
# pylint: disable=relative-beyond-top-level

import asyncio
import logging
from copy import deepcopy
from math import floor
@@ -138,8 +137,6 @@ async def observing_single_service(
try:
await _apply_observation_cycle(scheduler, scheduler_data)
logger.debug("completed observation cycle of %s", f"{service_name=}")
except asyncio.CancelledError: # pylint: disable=try-except-raise
raise # pragma: no cover
except Exception as exc: # pylint: disable=broad-except
service_name = scheduler_data.service_name

@@ -14,7 +14,6 @@
"""

import asyncio
import contextlib
import functools
import logging
import time
@@ -131,9 +130,7 @@ async def shutdown(self) -> None:
if self._trigger_observation_queue_task is not None:
await self._trigger_observation_queue.put(None)

self._trigger_observation_queue_task.cancel()
with contextlib.suppress(asyncio.CancelledError):
await self._trigger_observation_queue_task
await cancel_wait_task(self._trigger_observation_queue_task, max_delay=None)
self._trigger_observation_queue_task = None
self._trigger_observation_queue = Queue()

3 changes: 2 additions & 1 deletion services/docker-compose-ops.yml
@@ -81,7 +81,7 @@ services:
retries: 5

redis-commander:
image: rediscommander/redis-commander:latest
image: ghcr.io/joeferner/redis-commander:latest
init: true
environment:
- >-
@@ -100,6 +100,7 @@
# If you add/remove a db, do not forget to update the --databases entry in the docker-compose.yml
ports:
- "18081:8081"
user: redis
networks:
- simcore_default
opentelemetry-collector:
@@ -71,7 +71,7 @@ services:
]
redis-commander:
init: true
image: rediscommander/redis-commander:latest
image: ghcr.io/joeferner/redis-commander:latest
restart: always
environment:
- >-
@@ -89,6 +89,7 @@
documents:redis:6379:10:${TEST_REDIS_PASSWORD}
ports:
- "18081:8081"
user: redis

rabbit:
image: itisfoundation/rabbitmq:4.1.2-management