Commit efd5e35

docs(platform): describe trace-based analytics and redaction
1 parent bbfa45e commit efd5e35

1 file changed (+10 -7 lines)

docs/platform/overview.md

````diff
@@ -19,19 +19,19 @@ The managed platform solves these problems:
 
 - **Secure Key Vault**: Your provider API keys are encrypted client-side before storage—we never see your raw keys
 - **Single Virtual Key**: One `ANY_LLM_KEY` works across all providers
-- **Usage Analytics**: Track tokens, costs, and performance metrics without logging prompts or responses
+- **Trace Analytics**: Track tokens, costs, and performance metrics without logging prompts or responses
 - **Zero Infrastructure**: No servers to deploy, no databases to manage
 
 ## How it works
 
-The managed platform acts as a secure credential manager and usage tracker. Here's the flow:
+The managed platform acts as a secure credential manager and trace-based usage tracker. Here's the flow:
 
 1. **You add provider keys** to the platform dashboard (keys are encrypted in your browser before upload)
 2. **You get a virtual key** (`ANY_LLM_KEY`) that represents your project
 3. **Your application** uses the `PlatformProvider` with your virtual key
 4. **The SDK** authenticates with the platform, retrieves and decrypts your provider key client-side
 5. **Your request** goes directly to the LLM provider (OpenAI, Anthropic, etc.)
-6. **Usage metadata** (tokens, model, latency) is reported back—never your prompts or responses
+6. **OpenTelemetry spans produced during each platform-provider call** are reported back for analytics, with prompt/response content attributes redacted before export
 
 ```
 ┌─────────────────────────────────────────────────────────────────────────┐
````
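The six-step flow above can be mocked in a few lines. Everything here is illustrative: `PlatformProvider` and `ANY_LLM_KEY` are named in the docs, but this class body, the fake platform store, and the base64 stand-in for encryption are invented for the sketch (base64 is encoding, not encryption).

```python
# Illustrative mock of the six-step flow; not the real any-llm SDK.
import base64

# Step 1: the provider key is encrypted in the browser before upload,
# so the platform only ever stores ciphertext (mocked with base64 here).
platform_store = {"proj_123": base64.b64encode(b"sk-real-provider-key").decode()}

class MockPlatformProvider:
    def __init__(self, virtual_key: str):
        # Steps 2-3: a single virtual key identifies the project.
        self.virtual_key = virtual_key

    def resolve_provider_key(self) -> str:
        # Step 4: fetch the stored ciphertext and decrypt it client-side;
        # the platform never sees the plaintext key.
        encrypted = platform_store[self.virtual_key]
        return base64.b64decode(encrypted).decode()

provider = MockPlatformProvider(virtual_key="proj_123")
# Steps 5-6: the real SDK would now call the LLM provider directly and
# report redacted spans back to the platform.
assert provider.resolve_provider_key() == "sk-real-provider-key"
```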
```diff
@@ -49,15 +49,15 @@ The managed platform acts as a secure credential manager and usage tracker. Here
 │ 2. Receive encrypted provider key                                       │
 │ 3. Decrypt provider key locally (client-side)                           │
 │ 4. Make request directly to provider                                    │
-│ 5. Report usage metadata (tokens, latency) to platform                  │
+│ 5. Report in-scope OTel spans (with content redaction) to platform      │
 └────────────────┬─────────────────────────────────────┬──────────────────┘
                  │                                     │
                  ▼                                     ▼
 ┌─────────────────────────────┐      ┌────────────────────────────────────┐
 │ any-llm Managed Platform    │      │ LLM Provider                       │
 │                             │      │ (OpenAI, Anthropic, etc.)          │
 │ • Encrypted key storage     │      │                                    │
-│ • Usage tracking            │      │ Your prompts/responses go          │
+│ • Trace tracking            │      │ Your prompts/responses go          │
 │ • Cost analytics            │      │ directly here—never through        │
 │ • Performance metrics       │      │ our platform                       │
 └─────────────────────────────┘      └────────────────────────────────────┘
```
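The token counts feeding the "Cost analytics" box can be turned into per-request cost estimates. A minimal sketch, assuming hypothetical per-million-token rates (real pricing tables are not part of this commit):

```python
# Cost estimation from tracked token counts. The rates below are
# made-up placeholders, not real provider pricing.
PRICE_PER_MILLION_TOKENS = {
    # model name: (input_rate_usd, output_rate_usd), both hypothetical
    "example-model": (3.00, 15.00),
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate the USD cost of one request from its token counts."""
    input_rate, output_rate = PRICE_PER_MILLION_TOKENS[model]
    return (input_tokens * input_rate + output_tokens * output_rate) / 1_000_000

# 1,000 input tokens and 500 output tokens at the placeholder rates:
cost = request_cost("example-model", input_tokens=1_000, output_tokens=500)
```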
```diff
@@ -74,23 +74,26 @@ Your provider API keys are encrypted in your browser using XChaCha20-Poly1305 be
 - You maintain full control over your credentials
 
 
-### Privacy-First Usage Tracking
+### Privacy-First Trace Tracking
 
-The platform tracks usage metadata to provide cost and performance insights:
+The platform tracks OpenTelemetry span data generated during each platform-provider request to provide cost and performance insights:
 
 **What we track for you:**
 
 - Token counts (input and output)
 - Model name and provider
 - Request timestamps
 - Performance metrics (latency, throughput)
+- Additional OpenTelemetry span attributes/events emitted in the same request scope
 
 **What we never track:**
 
 - Your prompts
 - Model responses
 - Any content from your conversations
 
+Prompt/response payload attributes are removed from traces before export.
+
 ### Project Organization
 
 Organize your usage by project, team, or environment:
```
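The redaction step this commit documents ("payload attributes are removed from traces before export") amounts to an attribute filter over each span. A sketch follows; the `gen_ai.*` keys use OpenTelemetry generative-AI semantic-convention naming, but the exact keys the platform strips are an assumption here, not taken from this commit:

```python
# Sketch of span-attribute redaction before export. The content-bearing
# prefixes below are assumed, not read from the any-llm source.
CONTENT_ATTRIBUTE_PREFIXES = ("gen_ai.prompt", "gen_ai.completion")

def redact_span_attributes(attributes: dict) -> dict:
    """Return a copy of span attributes with prompt/response content removed."""
    return {
        key: value
        for key, value in attributes.items()
        if not key.startswith(CONTENT_ATTRIBUTE_PREFIXES)
    }

span_attributes = {
    "gen_ai.request.model": "example-model",
    "gen_ai.usage.input_tokens": 42,
    "gen_ai.usage.output_tokens": 128,
    "gen_ai.prompt.0.content": "user question",     # redacted
    "gen_ai.completion.0.content": "model answer",  # redacted
}
redacted = redact_span_attributes(span_attributes)
# Usage metadata survives; prompt/response content does not.
```

In a real SDK this filter would plausibly run inside an OpenTelemetry span processor or an exporter wrapper, so content-bearing attributes never leave the process.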
