docs/platform/overview.md
The managed platform solves these problems:

- **Secure Key Vault**: Your provider API keys are encrypted client-side before storage—we never see your raw keys
- **Single Virtual Key**: One `ANY_LLM_KEY` works across all providers
- **Trace Analytics**: Track tokens, costs, and performance metrics without logging prompts or responses
- **Zero Infrastructure**: No servers to deploy, no databases to manage
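
The client-side encryption in the first bullet can be sketched with standard-library primitives. This is a toy illustration only, not the platform's actual scheme (which is unspecified here); a real implementation would use an authenticated AEAD cipher such as AES-GCM rather than a hand-rolled HMAC keystream, and the passphrase-based key derivation is an assumption:

```python
import hashlib
import hmac
import secrets


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a keystream by running HMAC-SHA256 in counter mode."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]


def encrypt_key(provider_key: str, passphrase: str) -> dict:
    """Encrypt a provider API key before it ever leaves the client."""
    salt = secrets.token_bytes(16)
    nonce = secrets.token_bytes(16)
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)
    data = provider_key.encode()
    ciphertext = bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data))))
    # Only this blob is uploaded; the passphrase-derived key never leaves the client.
    return {"salt": salt.hex(), "nonce": nonce.hex(), "ciphertext": ciphertext.hex()}


def decrypt_key(blob: dict, passphrase: str) -> str:
    """Recover the provider key client-side; the server only ever stores the blob."""
    salt = bytes.fromhex(blob["salt"])
    nonce = bytes.fromhex(blob["nonce"])
    key = hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)
    data = bytes.fromhex(blob["ciphertext"])
    return bytes(a ^ b for a, b in zip(data, _keystream(key, nonce, len(data)))).decode()
```

The point of the sketch is the trust boundary: encryption and decryption both happen on the client, so the stored blob is useless to the platform without the user-held secret.
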
## How it works

The managed platform acts as a secure credential manager and trace-based usage tracker. Here's the flow:

1. **You add provider keys** to the platform dashboard (keys are encrypted in your browser before upload)
2. **You get a virtual key** (`ANY_LLM_KEY`) that represents your project
3. **Your application** uses the `PlatformProvider` with your virtual key
4. **The SDK** authenticates with the platform, retrieves and decrypts your provider key client-side
5. **Your request** goes directly to the LLM provider (OpenAI, Anthropic, etc.)
6. **OpenTelemetry spans** produced during each platform-provider call are reported back for analytics, with prompt/response content attributes redacted before export
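
The redaction in step 6 can be sketched as a filter that strips content-bearing attributes from a span before export. The attribute names below are an assumption modeled loosely on the OpenTelemetry GenAI semantic conventions; the platform's exact redaction list is not specified in this document:

```python
# Prefixes assumed to mark content-bearing attributes (gen_ai.* names are
# illustrative, borrowed from the OpenTelemetry GenAI semantic conventions).
CONTENT_PREFIXES = ("gen_ai.prompt", "gen_ai.completion")


def redact_span_attributes(attributes: dict) -> dict:
    """Return a copy of span attributes with prompt/response content removed,
    keeping only metadata such as model name, token counts, and latency."""
    return {
        key: value
        for key, value in attributes.items()
        if not key.startswith(CONTENT_PREFIXES)
    }


# A hypothetical span's attributes before export:
span_attributes = {
    "gen_ai.request.model": "gpt-4o",
    "gen_ai.usage.input_tokens": 42,
    "gen_ai.usage.output_tokens": 7,
    "gen_ai.prompt.0.content": "user prompt text",
    "gen_ai.completion.0.content": "model response text",
}
redacted = redact_span_attributes(span_attributes)
```

After filtering, only the usage metadata survives, which is what makes the analytics claim above compatible with the "no prompts or responses" guarantee.
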