
Commit e2d9d17

docs: A quick start example


README.md

Lines changed: 196 additions & 0 deletions
@@ -11,6 +11,202 @@ LoongSuite includes the following key components:

## Quick start

LoongSuite Python Agent provides observability for Python applications. Using [agentscope](https://github.com/modelscope/agentscope) as an example, this document demonstrates how to collect OTLP trace data with OpenTelemetry and forward it through LoongCollector to Jaeger.
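
At a glance, the pipeline assembled in this quick start looks roughly like this (the ports are the ones used in the commands below):

```plaintext
demo.py (agentscope + OpenTelemetry auto-instrumentation)
    └─ OTLP/gRPC ─> LoongCollector (listens on 6666)
                        └─ OTLP/gRPC ─> Jaeger (listens on 4317) ─> Jaeger UI (16686)
```
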
### Install

Use the following commands to install OpenTelemetry, agentscope, and the AgentScope instrumentor:

```shell
# OpenTelemetry
pip install opentelemetry-distro opentelemetry-exporter-otlp
opentelemetry-bootstrap -a install

# agentscope
pip install agentscope

# AgentScopeInstrumentor
git clone https://github.com/alibaba/loongsuite-python-agent.git
cd loongsuite-python-agent
pip install ./instrumentation-genai/opentelemetry-instrumentation-agentscope
```

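To confirm everything installed, you can list the relevant packages; the instrumentation package name is assumed to match the directory it was installed from:

```shell
# Both agentscope and the AgentScope instrumentation package should appear
pip list | grep -i agentscope
```
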
### Run

#### Build the Example

Follow the official [agentscope documentation](https://doc.agentscope.io/) to create a sample file named `demo.py`:

```python
import agentscope
from agentscope.agents import DialogAgent

# Initialize via model configuration for simplicity
agentscope.init(
    model_configs={
        "config_name": "my-qwen-max",
        "model_name": "qwen-max",
        "model_type": "dashscope_chat",
        "api_key": "YOUR-API-KEY",
    },
)

# Two dialog agents that will talk to each other
angel = DialogAgent(
    name="Angel",
    sys_prompt="You're a helpful assistant named Angel.",
    model_config_name="my-qwen-max",
)

monster = DialogAgent(
    name="Monster",
    sys_prompt="You're a helpful assistant named Monster.",
    model_config_name="my-qwen-max",
)

# Let the two agents exchange three rounds of messages
msg = None
for _ in range(3):
    msg = angel(msg)
    msg = monster(msg)
```

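Before adding instrumentation, you can run the script directly to confirm the agentscope setup works (replace `YOUR-API-KEY` with a valid DashScope API key first):

```shell
python demo.py
```
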
#### Collect Data

Run the `demo.py` script with OpenTelemetry auto-instrumentation:

```shell
opentelemetry-instrument \
    --traces_exporter console \
    --service_name demo \
    python demo.py
```

If everything is working correctly, you should see logs similar to the following:

```shell
"name": "LLM",
"context": {
    "trace_id": "0xa6acb5a45fb2b4383e4238ecd5187f85",
    "span_id": "0x7457f1a22004468a",
    "trace_state": "[ ]"
},
"kind": "SpanKind.INTERNAL",
"parent_id": null,
"start_time": "2025-05-22T11:13:40.396188Z",
"end_time": "2025-05-22T11:13:41.013896Z",
"status": {
    "status_code": "OK"
},
"attributes": {
    "gen_ai.prompt.0.message.role": "system",
    "gen_ai.prompt.0.message.content": "[ ]",
    "gen_ai.prompt.1.message.role": "user",
    "gen_ai.prompt.1.message.content": "[ ]",
    "gen_ai.response.finish_reasons": "3"
},
"events": [ ],
"links": [ ],
"resource": {
    "attributes": {
        "telemetry.sdk.language": "python",
        "telemetry.sdk.name": "opentelemetry",
        "telemetry.sdk.version": "1.33.1",
        "service.name": "demo",
        "telemetry.auto.version": "0.54b1"
    },
    "schema_url": ""
}
```

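If you prefer environment variables over CLI flags, the same console-exporter run can be configured with the standard OpenTelemetry variables:

```shell
# Equivalent to the --service_name and --traces_exporter flags above
export OTEL_SERVICE_NAME=demo
export OTEL_TRACES_EXPORTER=console
opentelemetry-instrument python demo.py
```
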
### Forwarding OTLP Data to Jaeger via LoongCollector

#### Launch Jaeger

Launch Jaeger with Docker:

```shell
docker run --rm --name jaeger \
  -e COLLECTOR_ZIPKIN_HOST_PORT=:9411 \
  -p 6831:6831/udp \
  -p 6832:6832/udp \
  -p 5778:5778 \
  -p 16686:16686 \
  -p 4317:4317 \
  -p 4318:4318 \
  -p 14250:14250 \
  -p 14268:14268 \
  -p 14269:14269 \
  -p 9411:9411 \
  jaegertracing/all-in-one:1.53.0
```

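Once the container is up, the Jaeger UI is served on port 16686 (mapped in the command above). A quick sanity check, assuming Docker and curl are available locally:

```shell
# Confirm the container is running and the UI answers on port 16686
docker ps --filter name=jaeger
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:16686
```
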
#### Launch LoongCollector

1. Install the latest LoongCollector by following its [documentation](https://observability.cn/project/loongcollector/quick-start/).

2. Add the following configuration as `conf/continuous_pipeline_config/local/oltp.yaml`:

   ```yaml
   enable: true
   global:
     StructureType: v2
   inputs:
     - Type: service_otlp
       Protocals:
         GRPC:
           Endpoint: 0.0.0.0:6666
   flushers:
     - Type: flusher_otlp
       Traces:
         Endpoint: http://127.0.0.1:4317
   ```

   This configuration specifies that LoongCollector will accept OTLP-formatted data over gRPC on port 6666. It also configures an OTLP flusher that sends trace data to the backend on port 4317 (which corresponds to Jaeger). For simplicity, only traces are configured here, but metrics and logs can be added similarly (see the sketch after this list).

3. Launch LoongCollector with the following command:

   ```shell
   nohup ./loongcollector > stdout.log 2> stderr.log &
   ```

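As noted above, only traces are flushed in this quick start. Below is a minimal sketch of how the flusher section could be extended, assuming `flusher_otlp` accepts `Metrics` and `Logs` sections analogous to `Traces`; check the LoongCollector flusher_otlp documentation, and note that the Jaeger endpoint used above accepts traces only, so metrics and logs would need their own OTLP backend:

```yaml
flushers:
  - Type: flusher_otlp
    Traces:
      Endpoint: http://127.0.0.1:4317
    # Assumed to mirror the Traces section; the endpoints below are placeholders
    # for a backend that actually accepts OTLP metrics/logs.
    Metrics:
      Endpoint: http://<your-metrics-backend>:4317
    Logs:
      Endpoint: http://<your-logs-backend>:4317
```
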
### Run the AgentScope Example

Run the example again, this time exporting traces over OTLP/gRPC to LoongCollector on port 6666:

```shell
opentelemetry-instrument \
    --exporter_otlp_protocol grpc \
    --traces_exporter otlp \
    --exporter_otlp_insecure true \
    --exporter_otlp_endpoint 127.0.0.1:6666 \
    --service_name demo \
    python demo.py
```

#### Results

Open the Jaeger UI (http://localhost:16686 by default) and search for the `demo` service. You should now see the trace data being properly received.

![image.png](docs/_assets/img/quickstart-results.png)

## Resources

* AgentScope: https://github.com/modelscope/agentscope