Commit 4c1e953

docker example

Signed-off-by: Adrian Cole <[email protected]>
Parent: 905ca0a

File tree

4 files changed: +54 −15 lines
Lines changed: 6 additions & 3 deletions

@@ -1,15 +1,18 @@
+# Update this with your real OpenAI API key
+OPENAI_API_KEY=sk-YOUR_API_KEY
+
 # Uncomment to use Ollama instead of OpenAI
 # OPENAI_BASE_URL=http://localhost:11434/v1
 # OPENAI_API_KEY=unused
 # CHAT_MODEL=qwen2.5:0.5b
 
 OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
 OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf
-OTEL_SERVICE_NAME=opentelemetry-instrumentation-openai-v2
+OTEL_SERVICE_NAME=opentelemetry-python-openai
 
-# This automatically configure logging
+# Change to 'false' to disable logging
 OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED=true
 # Change to 'console' if your OTLP endpoint doesn't support logs
 OTEL_LOGS_EXPORTER=otlp_proto_http
-# Set to false to hide prompt and completion content
+# Change to 'false' to hide prompt and completion content
 OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
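The environment file above configures the instrumentation through string-valued toggles such as "true"/"false". As a rough sketch of how such flags are typically interpreted at startup (this is a hypothetical helper, not the SDK's actual parsing code):

```python
import os


def env_flag(name: str, default: bool) -> bool:
    """Interpret an environment variable as a boolean toggle.

    Treats 'false', '0', and 'no' (case-insensitive) as False,
    any other non-empty value as True, and falls back to `default`
    when the variable is unset.
    """
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() not in ("false", "0", "no")


# Mirrors the toggles shown in the env file above
capture_content = env_flag(
    "OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT", default=False
)
logging_enabled = env_flag(
    "OTEL_PYTHON_LOGGING_AUTO_INSTRUMENTATION_ENABLED", default=False
)
```

Because every value in a `.env` file is a string, an explicit parse step like this is what decides whether "false" actually disables a feature.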
Lines changed: 10 additions & 0 deletions

@@ -0,0 +1,10 @@
+# Use an alpine image to make the runtime smaller
+FROM docker.io/python:3.12.7-alpine3.20
+RUN python -m pip install --upgrade pip
+
+COPY /requirements.txt /tmp/requirements.txt
+RUN pip install -r /tmp/requirements.txt
+
+COPY main.py /
+
+CMD [ "opentelemetry-instrument", "python", "main.py" ]
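The Dockerfile's CMD runs `main.py` under `opentelemetry-instrument`, which auto-instruments the OpenAI client without code changes. `main.py` itself is not part of this diff; a hedged sketch of what such a script might look like (the prompt text and the fallback model name are assumptions, not taken from the commit):

```python
import os


def build_chat_request() -> dict:
    """Assemble a chat-completion payload; CHAT_MODEL matches the env file."""
    return {
        "model": os.environ.get("CHAT_MODEL", "gpt-4o-mini"),
        "messages": [
            {"role": "user", "content": "Write a short poem about OpenTelemetry."}
        ],
    }


def main() -> None:
    # Imported lazily so the sketch stays importable without the openai package.
    from openai import OpenAI

    # The client reads OPENAI_API_KEY and OPENAI_BASE_URL from the environment,
    # so the same script works against OpenAI or a local Ollama endpoint.
    client = OpenAI()
    response = client.chat.completions.create(**build_chat_request())
    print(response.choices[0].message.content)


if __name__ == "__main__":
    main()
```

Running this under `opentelemetry-instrument` is what produces the traces and logs; the script itself contains no telemetry code.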

instrumentation-genai/opentelemetry-instrumentation-openai-v2/example/README.rst

Lines changed: 29 additions & 12 deletions

@@ -4,14 +4,36 @@ OpenTelemetry OpenAI Instrumentation Example
 This is an example of how to instrument OpenAI calls with zero code changes,
 using `opentelemetry-instrument`.
 
-When [main.py](main.py) is run, it exports traces and logs to an
-OTLP-compatible endpoint. Traces include details such as the model used and the
+When `main.py <main.py>`_ is run, it exports traces and logs to an OTLP
+compatible endpoint. Traces include details such as the model used and the
 duration of the chat request. Logs capture the chat request and the generated
 response, providing a comprehensive view of the performance and behavior of
 your OpenAI requests.
 
-Installation
-------------
+Setup
+-----
+
+Minimally, update the `.env <.env>`_ file with your "OPENAI_API_KEY". An
+OTLP compatible endpoint should be listening for traces and logs on
+http://localhost:4318. If not, update "OTEL_EXPORTER_OTLP_ENDPOINT" as well.
+
+Run with Docker
+---------------
+
+If you have Docker installed, you can run the example in one step:
+
+::
+
+    docker-compose run --build --rm python-opentelemetry-openai
+
+You should see a poem generated by OpenAI while traces and logs export to your
+configured observability tool.
+
+Run with Python
+---------------
+
+If you prefer to run the example with Python, set up a virtual environment for
+the example like this:
 
 ::
 
@@ -20,16 +42,11 @@ Installation
     pip install "python-dotenv[cli]"
     pip install -r requirements.txt
 
-Running the Example
--------------------
-
-Update the `.env` file with your OpenAI API key, or to change where to export
-traces and logs. Then, run the example like this:
+Now, run the example like this:
 
 ::
 
     dotenv run -- opentelemetry-instrument python main.py
 
-You should see a poem generated by OpenAI, with traces and logs exported to
-and OTLP compatible endpoint on localhost. You can then view them in your
-preferred observability tool.
+You should see a poem generated by OpenAI while traces and logs export to your
+configured observability tool.
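The README's Python path wraps the command in `dotenv run --`, which loads the `.env` file into the process environment before launching. A minimal sketch of that core idea (a simplified stand-in, not python-dotenv's actual implementation; it ignores quoting edge cases):

```python
import os


def parse_env_file(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and '#' comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, sep, value = line.partition("=")
        if sep:
            values[key.strip()] = value.strip().strip('"').strip("'")
    return values


def load_env(text: str) -> None:
    """Apply parsed values to the process environment, like `dotenv run`."""
    os.environ.update(parse_env_file(text))
```

This is also why the commented-out Ollama lines in the `.env` file have no effect until uncommented: comment lines are skipped entirely during parsing.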
Lines changed: 9 additions & 0 deletions

@@ -0,0 +1,9 @@
+services:
+  opentelemetry-python-openai:
+    container_name: opentelemetry-python-openai
+    build:
+      context: .
+    env_file:
+      - .env
+    environment:
+      OTEL_EXPORTER_OTLP_ENDPOINT: "http://host.docker.internal:4318"
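In Compose, entries under `environment:` take precedence over the same keys loaded from `env_file:`. That is how this service keeps all configuration in `.env` while overriding only the OTLP endpoint, since `localhost` inside the container is not the host machine. A small sketch of that merge order (the dictionaries here are illustrative values, not Compose internals):

```python
def effective_environment(env_file: dict, environment: dict) -> dict:
    """Merge Compose env sources: `environment:` entries win over `env_file:`."""
    merged = dict(env_file)
    merged.update(environment)
    return merged


# Illustrative values mirroring this example's configuration
env_file = {
    "OPENAI_API_KEY": "sk-YOUR_API_KEY",
    "OTEL_EXPORTER_OTLP_ENDPOINT": "http://localhost:4318",
}
environment = {
    "OTEL_EXPORTER_OTLP_ENDPOINT": "http://host.docker.internal:4318",
}
```

The container therefore exports to `host.docker.internal:4318`, reaching a collector running on the Docker host, while every other variable keeps its `.env` value.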
