This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Commit 27ff82b

Jfinks fix links misc (#118)
* Update sparsezoo.mdx: Correcting misc editorial items, fit and finish
* Update enterprise.mdx
* Update and rename diagnotistics-debugging.mdx to diagnostics-debugging.mdx
* Update deepsparse-server.mdx: Missing absolute URL; misc punctuation/grammar
* Update deepsparse-server.mdx: punctuation
* Update benchmarking.mdx: Removed bolt emoji as it was not playing nice with a right sidebar anchor link
1 parent 1ec7c5f · commit 27ff82b

File tree: 5 files changed, +14 additions, −16 deletions

src/content/products/deepsparse/enterprise.mdx
Lines changed: 6 additions & 6 deletions

@@ -12,7 +12,7 @@ index: 2000
 </h1>
 <h3> Sparsity-aware neural network inference engine for GPU-class performance on CPUs </h3>
 <div style="display: flex; flex-wrap: wrap">
-    <a href="https://docs.neuralmagic.com/deepsparse/">
+    <a href="https://docs.neuralmagic.com/products/deepsparse/">
     <img alt="Documentation" src="https://img.shields.io/badge/documentation-darkred?&style=for-the-badge&logo=read-the-docs" height="25" />
     </a>
     <a href="https://join.slack.com/t/discuss-neuralmagic/shared_invite/zt-q1a1cnvo-YBoICSIw3L1dmQpjBeDurQ/">
@@ -60,7 +60,7 @@ The DeepSparse Engine is available in two editions:
 
 ## 🧰 Hardware Support and System Requirements
 
-Review [Supported Hardware for the DeepSparse Engine](user-guide/deepsparse-engine/hardware-support) to understand system requirements.
+Review [Supported Hardware for the DeepSparse Engine](/user-guide/deepsparse-engine/hardware-support) to understand system requirements.
 The DeepSparse Engine works natively on Linux; Mac and Windows require running Linux in a Docker or virtual machine; it will not run natively on those operating systems.
 
 The DeepSparse Engine is tested on Python 3.7-3.10, ONNX 1.5.0-1.12.0, ONNX opset version 11+, and manylinux compliant.
@@ -276,13 +276,13 @@ Use Case: A workload that might benefit from the elastic scheduler is one in whi
 
 ## Resources
 #### Libraries
-- [DeepSparse](https://docs.neuralmagic.com/deepsparse/)
+- [DeepSparse](https://docs.neuralmagic.com/products/deepsparse/)
 
-- [SparseML](https://docs.neuralmagic.com/sparseml/)
+- [SparseML](https://docs.neuralmagic.com/products/sparseml/)
 
-- [SparseZoo](https://docs.neuralmagic.com/sparsezoo/)
+- [SparseZoo](https://docs.neuralmagic.com/products/sparsezoo/)
 
-- [Sparsify](https://docs.neuralmagic.com/sparsify/)
+- [Sparsify](https://docs.neuralmagic.com/products/sparsify/)
 
 
 #### Versions
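The last hunk header above references the elastic scheduler discussed in enterprise.mdx. As a hedged sketch of what selecting it looks like from Python, assuming the `scheduler` argument of `deepsparse.compile_model` accepts the string "elastic" and using a placeholder ONNX path:

```python
# Hedged sketch: the scheduler argument and its "elastic" value are assumptions
# about the DeepSparse Python API; "model.onnx" is a placeholder path.
from deepsparse import compile_model

engine = compile_model(
    "model.onnx",         # placeholder ONNX model file
    batch_size=1,
    scheduler="elastic",  # assumed string value; an enum form may also exist
)
print(engine)  # prints a summary of the compiled engine
```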

src/content/products/sparsezoo.mdx
Lines changed: 3 additions & 5 deletions

@@ -10,7 +10,7 @@ index: 4000
 ### Neural network model repository for highly sparse and sparse-quantized models with matching sparsification recipes
 
 <p>
-    <a href="https://docs.neuralmagic.com/sparsezoo">
+    <a href="https://docs.neuralmagic.com/products/sparsezoo">
     <img alt="Documentation" src="https://img.shields.io/badge/documentation-darkred?&style=for-the-badge&logo=read-the-docs" height={25} />
     </a>
     <a href="https://join.slack.com/t/discuss-neuralmagic/shared_invite/zt-q1a1cnvo-YBoICSIw3L1dmQpjBeDurQ/">
@@ -68,7 +68,7 @@ See the [SparseZoo Installation Page](/get-started/install/sparsezoo) for instal
 ## Quick Tour
 
 The SparseZoo Python API enables you to search and download sparsified models. Code examples are given below.
-We encourage users to load SparseZoo models by copying a stub directly from a [model page]((https://sparsezoo.neuralmagic.com/)).
+We encourage users to load SparseZoo models by copying a stub directly from a [model page](https://sparsezoo.neuralmagic.com/).
 
 ### Introduction to Model Class Object
 
@@ -374,13 +374,11 @@ sparsezoo search --domain cv --sub-domain classification \
     --architecture resnet_v1 --sub-architecture 50
 ```
 
-For a more in-depth read, check out [SparseZoo documentation.](https://docs.neuralmagic.com/sparsezoo/)
-
 ## Resources
 
 ### Learning More
 
-- Documentation: [SparseML,](https://docs.neuralmagic.com/sparseml/) [SparseZoo,](https://docs.neuralmagic.com/sparsezoo/) [Sparsify,](https://docs.neuralmagic.com/sparsify/) [DeepSparse](https://docs.neuralmagic.com/deepsparse/)
+- Documentation: [SparseML,](https://docs.neuralmagic.com/products/sparseml/) [SparseZoo,](https://docs.neuralmagic.com/products/sparsezoo/) [Sparsify,](https://docs.neuralmagic.com/products/sparsify/) [DeepSparse](https://docs.neuralmagic.com/products/deepsparse/)
 - Neural Magic: [Blog,](https://www.neuralmagic.com/blog/) [Resources](https://www.neuralmagic.com/resources/)
 
 ### Release History
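The "Quick Tour" hunk above describes loading SparseZoo models by copying a stub from a model page. A minimal sketch of that flow, assuming the `Model` class referenced in the "Introduction to Model Class Object" heading; the stub below is the one that appears in the benchmarking diff later in this commit and is used purely as an illustration:

```python
# Minimal sketch, assuming sparsezoo.Model resolves a copied stub and can
# download the files locally; the stub is an illustrative example.
from sparsezoo import Model

stub = "zoo:nlp/question_answering/bert-base/pytorch/huggingface/squad/pruned_quant_6layers-aggressive_96"
model = Model(stub)   # resolve the stub against the SparseZoo API
model.download()      # fetch the model files to a local cache
print(model.path)     # local directory containing the downloaded files
```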

src/content/use-cases/deploying-deepsparse/deepsparse-server.mdx
Lines changed: 3 additions & 3 deletions

@@ -7,7 +7,7 @@ index: 1000
 
 # Deploying with the DeepSparse Server
 
-This section explains how to deploy with DeepSparse Server
+This section explains how to deploy with DeepSparse Server.
 
 ## Installation Requirements
 
@@ -18,7 +18,7 @@ This section requires the [DeepSparse Server Install](/get-started/install/deeps
 The DeepSparse Server allows you to serve models and `Pipelines` for deployment in HTTP. The server runs on top of the popular FastAPI web framework and Uvicorn web server.
 The server supports any task from DeepSparse, such as `Pipelines` including NLP, image classification, and object detection tasks.
 An updated list of available tasks can be found
-[here](https://github.com/neuralmagic/deepsparse/blob/main/src/deepsparse/PIPELINES.md)
+[on the DeepSparse Pipelines Introduction](https://github.com/neuralmagic/deepsparse/blob/main/src/deepsparse/PIPELINES.md).
 
 Run the help CLI to lookup the available arguments.
 
@@ -146,4 +146,4 @@ All you need is to add `/docs` at the end of your host URL:
 
 localhost:5543/docs
 
-![alt text](./img/swagger_ui.png)
+![Swagger UI For Viewing Model Pipeline](https://github.com/neuralmagic/deepsparse/blob/main/src/deepsparse/server/img/swagger_ui.png)
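The hunks above describe serving `Pipelines` over HTTP and checking the Swagger UI at localhost:5543/docs. A small sketch of talking to a server that is already running on that default host and port; the `/predict` route and the request payload shape are assumptions for illustration, so confirm the actual routes in the Swagger UI your configuration exposes:

```python
# Sketch only: assumes a DeepSparse Server is running on localhost:5543.
# The /predict route and payload fields are illustrative assumptions.
import requests

base = "http://localhost:5543"

# The Swagger/OpenAPI page referenced above confirms the server is reachable.
print(requests.get(f"{base}/docs").status_code)

# Hypothetical inference request for a question-answering pipeline.
payload = {
    "question": "What does the DeepSparse Server serve?",
    "context": "The DeepSparse Server serves models and Pipelines over HTTP.",
}
response = requests.post(f"{base}/predict", json=payload)
print(response.json())
```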

src/content/user-guide/deepsparse-engine/benchmarking.mdx
Lines changed: 1 addition & 1 deletion

@@ -113,7 +113,7 @@ To run a sparse quantized INT8 6-layer BERT at batch size 1 for latency:
 deepsparse.benchmark zoo:nlp/question_answering/bert-base/pytorch/huggingface/squad/pruned_quant_6layers-aggressive_96 --batch_size 1 --scenario sync
 ```
 
-## ⚡ Inference Scenarios
+## Inference Scenarios
 
 ### Synchronous (Single-stream) Scenario
 
src/content/user-guide/deepsparse-engine/diagnotistics-debugging.mdx renamed to src/content/user-guide/deepsparse-engine/diagnostics-debugging.mdx
Lines changed: 1 addition & 1 deletion

@@ -185,7 +185,7 @@ Locating `== NM Execution Provider supports` shows how many subgraphs we compil
 
 ### Full Example Log, Verbose Level = diagnose
 
-The following is an example log with `NM_LOGGING_LEVEL=diagnose` running a super_resolution network, where we only support running 70% of it. Different portions of the log are explained in [Parsing an Example Log.](/user-guide/deepsparse-engine/diagnotistics-debugging#parsing-an-example-log)
+The following is an example log with `NM_LOGGING_LEVEL=diagnose` running a super_resolution network, where we only support running 70% of it. Different portions of the log are explained in [Parsing an Example Log.](/user-guide/deepsparse-engine/diagnostics-debugging#parsing-an-example-log)
 
 ```text
 onnx_filename : test-models/cv-resolution/super_resolution/none-bsd300-onnx-repo/model.onnx
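The hunk above references diagnose-level logging via the `NM_LOGGING_LEVEL` environment variable. A minimal sketch of producing that log from Python, assuming the engine reads the variable at compile time; the model path is a placeholder:

```python
# Sketch: NM_LOGGING_LEVEL=diagnose comes from the diff above; setting it
# before compiling is an assumption about when the engine reads it.
import os

os.environ["NM_LOGGING_LEVEL"] = "diagnose"  # emit the verbose log parsed above

from deepsparse import compile_model

engine = compile_model("model.onnx", batch_size=1)  # diagnostics print during compilation
print(engine)
```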
