
Support async APIs in LLM Integrations. #3496

@vetyy

Description


Problem Statement

Hello,

I would like to use Sentry's new LLM monitoring feature, but I am using async calls with different 3rd party APIs.
In my case it is mainly Anthropic and OpenAI integration.

The problem is that AnthropicIntegration and OpenAIIntegration don't work, because the sentry-python library currently patches only the synchronous functions.

For example:

OpenAI patches these:

        Completions.create = _wrap_chat_completion_create(Completions.create)
        Embeddings.create = _wrap_embeddings_create(Embeddings.create)

Anthropic patches these:

        Messages.create = _wrap_message_create(Messages.create)

But it should be fairly simple to patch the async functions as well.

Solution Brainstorm

Most of the code in the _wrap_message_create function should be the same.

It seems that the only difference is:

Anthropic patches:

        AsyncMessages.create = _wrap_async_message_create(AsyncMessages.create)

Make _sentry_patched_create async and await the wrapped function.

For example, something like this:

        async def _sentry_patched_create_async(*args, **kwargs):
            ...
            return await f(*args, **kwargs)

OpenAI patches:

        AsyncCompletions.create = _wrap_chat_completion_create(AsyncCompletions.create)
        AsyncEmbeddings.create = _wrap_embeddings_create(AsyncEmbeddings.create)

etc. etc.

I tested it locally by changing the original code to this, and it seems to work okay.
Would you be able to add this feature?
