Commit fe5ddad

Apply suggestions from PCX review
1 parent eb9d415 commit fe5ddad

2 files changed: +2 -2 lines changed


src/content/docs/ai-gateway/index.mdx

Lines changed: 1 addition & 1 deletion
@@ -37,7 +37,7 @@ View metrics such as the number of requests, tokens, and the cost it takes to ru
 
 </Feature>
 
-<Feature header="Logging" href="/ai-gateway/observability/analytics/" cta="View Logging">
+<Feature header="Logging" href="/ai-gateway/observability/logging/" cta="View Logging">
 
 Gain insight on requests and errors.
 
src/content/docs/reference-architecture/diagrams/ai/ai-multivendor-observability-control.mdx

Lines changed: 1 addition & 1 deletion
@@ -38,7 +38,7 @@ The following architecture illustrates the setup of [AI Gateway](/ai-gateway/) a
 ![Figure 1: Multi-vendor AI architecture](~/assets/images/reference-architecture/ai-multivendor-observability-control/ai-multi-vendor-observability-control.svg "Multi-vendor AI architecture")
 
 1. **Inference request**: Send POST request to your AI gateway.
-2. **Request proxying**: Forward `POST` request to AI Inference provider or serve response from [cache, if enabled and available](/ai-gateway/configuration/caching). During this process, both [analytics](/ai-gateway/observability/analytics/) and [logs](/ai-gateway/observability/analytics/) are collected. Additionally, controls such as Rate Limiting are enforced.
+2. **Request proxying**: Forward `POST` request to AI Inference provider or serve response from [cache, if enabled and available](/ai-gateway/configuration/caching). During this process, both [analytics](/ai-gateway/observability/analytics/) and [logs](/ai-gateway/observability/logging/) are collected. Additionally, controls such as Rate Limiting are enforced.
 3. **Error handling**: In case of errors, retry request or fallback to other inference provider, depending on configuration.
 
 ## Related resources
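
For context, the numbered steps in the diff above describe the request flow through AI Gateway. Below is a minimal TypeScript sketch of step 1, the inference request, not taken from this commit: the endpoint shape, account and gateway IDs, provider path, model name, and API key are placeholders assumed for illustration, so verify them against the AI Gateway documentation before use.

```ts
// Minimal sketch (not part of this commit): step 1 of the flow above, an
// inference request POSTed to an AI Gateway endpoint. ACCOUNT_ID, GATEWAY_ID,
// the provider path, and the model name are placeholders.
const ACCOUNT_ID = "your-account-id";
const GATEWAY_ID = "your-gateway";
const PROVIDER_API_KEY = "your-provider-api-key";

async function runInference(prompt: string): Promise<unknown> {
  const response = await fetch(
    `https://gateway.ai.cloudflare.com/v1/${ACCOUNT_ID}/${GATEWAY_ID}/openai/chat/completions`,
    {
      // Step 2: the gateway proxies this POST to the provider (or serves it
      // from cache) while collecting analytics and logs.
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${PROVIDER_API_KEY}`,
      },
      body: JSON.stringify({
        model: "gpt-4o-mini",
        messages: [{ role: "user", content: prompt }],
      }),
    },
  );

  if (!response.ok) {
    // Step 3: retries or provider fallback happen gateway-side depending on
    // configuration; the client only sees an error if all attempts fail.
    throw new Error(`AI Gateway request failed: ${response.status}`);
  }
  return response.json();
}
```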

0 commit comments
