
Commit e12ce70

Fixes URLs.
1 parent 2d6327d commit e12ce70

File tree

1 file changed (+2, -2 lines)


explore-analyze/elastic-inference/ml-node-vs-eis.md

Lines changed: 2 additions & 2 deletions
@@ -13,11 +13,11 @@ applies_to:
 
 The Elastic Inference Service (EIS) requires zero setup or management. It's always-on, has excellent ingest throughput, and uses simple token-based billing.
 
-Use EIS if you're getting started with [semantic search](./solutions/search/semantic-search.md) or [hybrid search](./solutions/search/hybrid-search-md) and want a smooth experience. Under the hood, EIS uses GPUs for ML {{infer}}, which are more efficient and allow a faster, more cost-effective experience for most usecases.
+Use EIS if you're getting started with [semantic search](/solutions/search/semantic-search.md) or [hybrid search](/solutions/search/hybrid-search-md) and want a smooth experience. Under the hood, EIS uses GPUs for ML {{infer}}, which are more efficient and allow a faster, more cost-effective experience for most usecases.
 
 ## When to use {{ml}} nodes?
 
-ML nodes are a more configurable solution than EIS where you can set up specific nodes using CPUs to execute [ML {{infer}}]((./explore-analyze/elastic-inference/inference-api.md)). {{ml-cap}} nodes tend to incur higher costs but give more control.
+ML nodes are a more configurable solution than EIS where you can set up specific nodes using CPUs to execute [ML {{infer}}]((/explore-analyze/elastic-inference/inference-api.md)). {{ml-cap}} nodes tend to incur higher costs but give more control.
 
 Use ML nodes if you want to decide how your models run, you want to run custom models, or you have a self-managed setup.
 
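For context on the choice the diffed text describes, the snippet below is a minimal sketch of the kind of per-deployment setup the "ML nodes" path implies and that EIS avoids, assuming the Elasticsearch inference API's `elasticsearch` service is used to run ELSER on the cluster's own ML nodes. The endpoint name `my-elser-endpoint` and the specific settings are illustrative assumptions, not part of this commit.

```console
// Deploy ELSER on the cluster's own ML nodes via the inference API.
// Adaptive allocations let the deployment scale with load instead of
// pinning a fixed number of allocations.
PUT _inference/sparse_embedding/my-elser-endpoint
{
  "service": "elasticsearch",
  "service_settings": {
    "model_id": ".elser_model_2",
    "num_threads": 1,
    "adaptive_allocations": {
      "enabled": true
    }
  }
}
```

With EIS, by contrast, no endpoint deployment or node sizing of this kind is required on your own nodes, which is the trade-off the updated text describes.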
0 commit comments
