Commit f3e54f0

Acrolinx and last look.
1 parent b167447 commit f3e54f0

File tree

1 file changed (+2, -2 lines)


articles/machine-learning/how-to-use-low-priority-batch.md

Lines changed: 2 additions & 2 deletions
@@ -18,7 +18,7 @@ ms.custom: devplatv2
 
 [!INCLUDE [cli v2](includes/machine-learning-dev-v2.md)]
 
-Azure Batch Deployments supports low priority virtual machines (VMs) to reduce the cost of batch inference workloads. Low priority virtual machines enable a large amount of compute power to be used for a low cost. Low priority virtual machines take advantage of surplus capacity in Azure. When you specify low priority virtual machines in your pools, Azure can use this surplus, when available.
+Azure Batch Deployments supports low priority virtual machines (VMs) to reduce the cost of batch inference workloads. Low priority virtual machines enable a large amount of compute power to be used for a low cost. Low priority virtual machines take advantage of surplus capacity in Azure. When you specify low priority VMs in your pools, Azure can use this surplus, when available.
 
 > [!TIP]
 > The tradeoff for using low priority VMs is that those virtual machines might not be available or they might be preempted at any time, depending on available capacity. For this reason, this approach is most suitable for batch and asynchronous processing workloads, where job completion time is flexible and the work is distributed across many virtual machines.
@@ -37,7 +37,7 @@ Azure Machine Learning Batch Deployments provides several capabilities that make
 
 Many batch workloads are a good fit for low priority VMs. Using low priority VMs can introduce execution delays when deallocation of VMs occurs. If you have flexibility in the time jobs have to finish, you might tolerate the potential drops in capacity.
 
-When *deploying models* under batch endpoints, rescheduling can be done at the minibatch level. That approach has the benefit that deallocation only impacts those minibatches that are currently being processed and not finished on the affected node. All completed progress is kept.
+When you deploy models under batch endpoints, rescheduling can be done at the minibatch level. That approach has the benefit that deallocation only impacts those minibatches that are currently being processed and not finished on the affected node. All completed progress is kept.
 
 ## Creating batch deployments with low priority VMs
 
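For context on the text changed above, a minimal sketch of how a low priority compute cluster can be defined with the Azure Machine Learning CLI v2 follows. The cluster name, VM size, and instance counts are illustrative placeholders, not values from this commit or the article.

```yaml
# compute-low-pri.yml: an AmlCompute cluster using the low priority VM tier.
# Name, size, and instance counts are placeholder values for illustration.
$schema: https://azuremlschemas.azureedge.net/latest/amlCompute.schema.json
name: low-pri-cluster
type: amlcompute
size: STANDARD_DS3_v2
min_instances: 0
max_instances: 4
tier: low_priority
```

A cluster like this could be created with `az ml compute create -f compute-low-pri.yml` (plus your resource group and workspace flags) and then referenced from a batch deployment.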
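Since the changed paragraph notes that rescheduling happens at the minibatch level, a hedged excerpt of a batch deployment definition is sketched below to show the related knobs: `mini_batch_size` controls the granularity of each unit of work, and `retry_settings` governs how failed minibatches are retried. The endpoint, model, and deployment names are assumptions for illustration only.

```yaml
# Excerpt of a batch deployment that targets the low priority cluster above.
# Endpoint, model, and deployment names are placeholders.
$schema: https://azuremlschemas.azureedge.net/latest/batchDeployment.schema.json
name: lowpri-dpl
endpoint_name: my-batch-endpoint
model: azureml:my-model:1
compute: azureml:low-pri-cluster
mini_batch_size: 10
retry_settings:
  max_retries: 3
  timeout: 300
```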