base: instrumentation-genai-langchain-sdk-cleaned
Decorators support #3
Conversation
Signed-off-by: Pavan Sudheendra <[email protected]>
instrumentation-genai/opentelemetry-genai-sdk/src/opentelemetry/genai/sdk/decorators/base.py
…vider if not already set Signed-off-by: Pavan Sudheendra <[email protected]>
instrumentation-genai/opentelemetry-genai-sdk/src/opentelemetry/genai/sdk/decorators/base.py
instrumentation-genai/opentelemetry-genai-sdk/src/opentelemetry/genai/sdk/exporters.py
Signed-off-by: Pavan Sudheendra <[email protected]>
…SpanMetricEvent exporter Signed-off-by: Pavan Sudheendra <[email protected]>
…umentation-genai-langchain-sdk-cleaned
...rumentation-genai/opentelemetry-genai-sdk/src/opentelemetry/genai/sdk/decorators/__init__.py
```python
run_id = uuid4()

try:
    telemetry.start_llm(prompts=messages, run_id=run_id)
```
How are we planning to get the attributes that LangChain's callback handler provides?
Decorators probably won't be fully 1:1 compatible with the callback handlers. They will do their best to infer the input args and the response and fill in the span attributes.
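The best-effort inference described above could look roughly like the following sketch. This is not the PR's actual implementation; the `gen_ai.input.*`/`gen_ai.output` attribute keys and the `llm` decorator signature are assumptions for illustration, and the attributes are stashed on the wrapper instead of a real span.

```python
import functools
import inspect


def llm(name=None):
    """Hypothetical sketch of an @llm decorator that infers span
    attributes from the wrapped function's arguments and return value."""

    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Bind *args/**kwargs to parameter names so positional and
            # keyword calls produce the same attribute keys.
            bound = inspect.signature(func).bind(*args, **kwargs)
            bound.apply_defaults()
            # Best effort: every bound argument becomes an attribute.
            # (Keys are illustrative, not official semconv names.)
            attributes = {
                f"gen_ai.input.{k}": repr(v) for k, v in bound.arguments.items()
            }
            result = func(*args, **kwargs)
            attributes["gen_ai.output"] = repr(result)
            # Stand-in for span.set_attributes(...) on a real span.
            wrapper.last_attributes = attributes
            return result

        return wrapper

    return decorator


@llm()
def chat(prompt, temperature=0.0):
    return f"echo: {prompt}"


reply = chat("hi")
```

Callback handlers see richer runtime detail (token usage, model metadata), which is why attribute inference from a signature alone can only approximate them.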
```python
try:
    import contextlib
    with contextlib.suppress(Exception):
        telemetry.stop_llm(run_id, chat_generations, **attributes)
```
How are we planning to get the attributes that LangChain's callback handler provides?
…/github.com/wrisa/opentelemetry-python-contrib into instrumentation-genai-langchain-sdk-cleaned
Signed-off-by: Pavan Sudheendra <[email protected]>
A few high-level notes (these have been covered elsewhere, but adding them here for completeness):
- This project is a utility, but it lives in a directory containing only instrumentations, so we'll probably want to move it (there's also a build-time script in Contrib that expects only instrumentations to live in this directory).
- The term "SDK" in OTel-land tends to refer to "the SDK", i.e. the reference implementation of the OTel API, so we'll probably want a different project name. Folks seem to be converging on "util" for this kind of functionality.
- This project depends on the OTel SDK to set up instrumentation, but OTel tends to be SDK-agnostic (users should be able to supply their own SDK). If functionality from the SDK ends up being desirable, it should probably go into a different project, keeping this one SDK-agnostic.
```python
from opentelemetry import trace

def _ensure_tracer_provider():
```
Do you envision that setting up an SDK within this library will remain in scope?
@pmcollins ideally that is how it would be consumed. This SDK will sit one level above the existing instrumentations within the repo. IMO, this SDK can be used when users want to apply decorators to capture framework-agnostic telemetry. I don't know which repo this will eventually go to, but I'm open to suggestions.
I think that decorators can be supported without reference to "the" OTel SDK (or any OTel SDK). Folks who supply their own SDK should be able to use instrumentations that use this library. IMO if we really need to bring in the SDK it should be done in a separate package.
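The API/SDK split being argued for can be sketched in plain Python (this is not the OTel API itself, just an illustration of the seam): the library only calls an abstract provider lookup, a no-op tracer is used by default, and the host application installs whatever implementation its own SDK supplies.

```python
# Sketch (not the real OTel API): the library codes against a tiny
# provider-lookup seam instead of constructing an SDK itself.
class _NoOpTracer:
    def start_span(self, name):
        return None  # records nothing; keeps the library dependency-free


_tracer_provider = None  # set by the application, never by this library


def set_tracer_provider(provider):
    global _tracer_provider
    _tracer_provider = provider


def get_tracer():
    # Fall back to a no-op when the app hasn't installed anything.
    return _tracer_provider if _tracer_provider is not None else _NoOpTracer()


default_tracer = get_tracer()  # no SDK installed -> no-op


class _AppTracer(_NoOpTracer):
    """Stand-in for whatever tracer the application's own SDK supplies."""


set_tracer_provider(_AppTracer())
app_tracer = get_tracer()  # now the app's implementation
```

This is exactly why a `_ensure_tracer_provider()` helper inside the library is problematic: it collapses the seam by hard-wiring one particular SDK.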
Thanks @pmcollins. I think I understand now, and have pushed a fix. It's now up to the client app to set up the TracerProvider, MeterProvider, etc.
The other "SDK" references are just GenAI utils API functionality; I'll assume it will be renamed correctly when it gets merged. Let me know if this is okay.
Excellent, thank you. Can we include the rename from sdk to util in this PR as well?
Signed-off-by: Pavan Sudheendra <[email protected]>
Appreciate the included example demonstrating how to use one of the annotations. Do you have plans to add more examples for the other annotations, and perhaps unit tests? That would help reviewers understand how this library is intended to be used.
Also, I believe we're going to want to split these changes into smaller PRs when submitting upstream. It might be a useful exercise to do that now, during the pre-review process, as it helps the review move significantly faster.
Description
@llm decorators support. Decorators will not replace auto-instrumentation; they simply give users an alternative way to annotate their LLM operations, among other GenAI operation types.
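A minimal usage sketch of the annotation style this PR proposes. The real decorator lives under the PR's `opentelemetry/genai/sdk/decorators` path; it is stubbed here as a plain marker so the snippet is self-contained, and the exact import path and signature are assumptions.

```python
# Stub standing in for the PR's real @llm decorator; the real one
# would start a span around the call and record GenAI attributes.
def llm(func):
    func.is_llm_operation = True  # marker only, for illustration
    return func


@llm
def generate(prompt: str) -> str:
    # In a real app this would call an LLM client.
    return "model reply to: " + prompt


answer = generate("hello")
```

The appeal of this style is that a user annotates their own functions directly instead of relying on framework-specific auto-instrumentation such as the LangChain callback handler.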
Fixes # (issue)
Type of change
Please delete options that are not relevant.
How Has This Been Tested?
Please describe the tests that you ran to verify your changes. Provide instructions so we can reproduce. Please also list any relevant details for your test configuration.
Does This PR Require a Core Repo Change?
Checklist:
See contributing.md for styleguide, changelog guidelines, and more.