Commit f530cab

Merge pull request #98966 from kraigb/kraigb-edits
Update concurrency to include scaling behavior
2 parents ce0b14e + 1e9e81c commit f530cab

4 files changed: 40 additions & 16 deletions

articles/azure-functions/functions-best-practices.md

Lines changed: 11 additions & 3 deletions

@@ -3,7 +3,7 @@ title: Best Practices for Azure Functions
description: Learn best practices and patterns for Azure Functions.
ms.assetid: 9058fb2f-8a93-4036-a921-97a0772f503c
ms.topic: conceptual
- ms.date: 10/16/2017
+ ms.date: 12/17/2019

ms.custom: H1Hack27Feb2017

@@ -67,7 +67,7 @@ There are a number of factors that impact how instances of your function app sca

### Share and manage connections

- Reuse connections to external resources whenever possible. See [how to manage connections in Azure Functions](./manage-connections.md).
+ Reuse connections to external resources whenever possible. See [how to manage connections in Azure Functions](./manage-connections.md).

### Avoid sharing storage accounts

@@ -85,10 +85,18 @@ Don't use verbose logging in production code, which has a negative performance i

### Use async code but avoid blocking calls

- Asynchronous programming is a recommended best practice. However, always avoid referencing the `Result` property or calling `Wait` method on a `Task` instance. This approach can lead to thread exhaustion.
+ Asynchronous programming is a recommended best practice, especially when blocking I/O operations are involved.
+
+ In C#, always avoid referencing the `Result` property or calling the `Wait` method on a `Task` instance. This approach can lead to thread exhaustion.

[!INCLUDE [HTTP client best practices](../../includes/functions-http-client-best-practices.md)]

+ ### Use multiple worker processes
+
+ By default, any host instance for Functions uses a single worker process. To improve performance, especially with single-threaded runtimes like Python, use the [FUNCTIONS_WORKER_PROCESS_COUNT](functions-app-settings.md#functions_worker_process_count) setting to increase the number of worker processes per host (up to 10). Azure Functions then tries to evenly distribute simultaneous function invocations across these workers.
+
+ The FUNCTIONS_WORKER_PROCESS_COUNT setting applies to each host that Functions creates when scaling out your application to meet demand.
+
### Receive messages in batch whenever possible

Some triggers like Event Hub enable receiving a batch of messages on a single invocation. Batching messages has much better performance. You can configure the max batch size in the `host.json` file as detailed in the [host.json reference documentation](functions-host-json.md)
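
The new "Use multiple worker processes" guidance above is easiest to see with a CPU-bound function. The sketch below is illustrative only and not part of the commit: it assumes the standard `azure.functions` Python programming model, and the `main` function name, the `fib` helper, and the `n` query parameter are invented for the example. With the default single worker process, concurrent invocations of a function like this queue on one Python process per host instance; raising FUNCTIONS_WORKER_PROCESS_COUNT (an application setting, not a code change) lets the host spread those invocations across up to 10 worker processes.

```python
import azure.functions as func


def fib(n: int) -> int:
    """Deliberately CPU-bound work that keeps one Python worker busy."""
    return n if n < 2 else fib(n - 1) + fib(n - 2)


def main(req: func.HttpRequest) -> func.HttpResponse:
    # A synchronous, CPU-bound handler: async/await does not help here,
    # but FUNCTIONS_WORKER_PROCESS_COUNT > 1 lets other invocations run
    # in parallel on separate worker processes of the same host.
    n = int(req.params.get("n", "30"))
    return func.HttpResponse(str(fib(n)))
```

Because the setting is plain application configuration, the function body stays the same whether the host runs 1 or 10 workers; only the degree of parallelism per host instance changes.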

articles/azure-functions/functions-reference-node.md

Lines changed: 11 additions & 1 deletion

@@ -4,7 +4,7 @@ description: Understand how to develop functions by using JavaScript.

ms.assetid: 45dedd78-3ff9-411f-bb4b-16d29a11384c
ms.topic: reference
- ms.date: 02/24/2019
+ ms.date: 12/17/2019

---
# Azure Functions JavaScript developer guide

@@ -402,6 +402,16 @@ When you work with HTTP triggers, you can access the HTTP request and response o
context.done(null, res);
```

+ ## Scaling and concurrency
+
+ By default, Azure Functions automatically monitors the load on your application and creates additional host instances for Node.js as needed. Functions uses built-in (not user-configurable) thresholds for different trigger types to decide when to add instances, such as the age of messages and queue size for QueueTrigger. For more information, see [How the consumption and premium plans work](functions-scale.md#how-the-consumption-and-premium-plans-work).
+
+ This scaling behavior is sufficient for many Node.js applications. For CPU-bound applications, you can improve performance further by using multiple language worker processes.
+
+ By default, every Functions host instance has a single language worker process. You can increase the number of worker processes per host (up to 10) by using the [FUNCTIONS_WORKER_PROCESS_COUNT](functions-app-settings.md#functions_worker_process_count) application setting. Azure Functions then tries to evenly distribute simultaneous function invocations across these workers.
+
+ The FUNCTIONS_WORKER_PROCESS_COUNT setting applies to each host that Functions creates when scaling out your application to meet demand.
+
## Node version

The following table shows the Node.js version used by each major version of the Functions runtime:

articles/azure-functions/functions-reference-python.md

Lines changed: 16 additions & 12 deletions

@@ -2,7 +2,7 @@
title: Python developer reference for Azure Functions
description: Understand how to develop functions with Python
ms.topic: article
- ms.date: 04/16/2018
+ ms.date: 12/13/2019
---

# Azure Functions Python developer guide

@@ -276,28 +276,30 @@ In this function, the value of the `name` query parameter is obtained from the `

Likewise, you can set the `status_code` and `headers` for the response message in the returned [HttpResponse] object.

- ## Concurrency
+ ## Scaling and concurrency

- By default, the Functions Python runtime can only process one invocation of a function at a time. This concurrency level might not be sufficient under one or more of the following conditions:
+ By default, Azure Functions automatically monitors the load on your application and creates additional host instances for Python as needed. Functions uses built-in (not user-configurable) thresholds for different trigger types to decide when to add instances, such as the age of messages and queue size for QueueTrigger. For more information, see [How the consumption and premium plans work](functions-scale.md#how-the-consumption-and-premium-plans-work).

- + You're trying to handle a number of invocations being made at the same time.
- + You're processing a large number of I/O events.
- + Your application is I/O bound.
+ This scaling behavior is sufficient for many applications. Applications with any of the following characteristics, however, may not scale as effectively:

- In these situations, you can improve performance by running asynchronously and by using multiple language worker processes.
+ - The application needs to handle many concurrent invocations.
+ - The application processes a large number of I/O events.
+ - The application is I/O bound.
+
+ In such cases, you can improve performance further by employing async patterns and by using multiple language worker processes.

### Async

- We recommend that you use the `async def` statement to make your function run as an asynchronous coroutine.
+ Because Python is a single-threaded runtime, a host instance for Python can process only one function invocation at a time. For applications that process a large number of I/O events and/or are I/O bound, you can improve performance by running functions asynchronously.

- ```python
- # Runs with asyncio directly
+ To run a function asynchronously, use the `async def` statement, which runs the function with [asyncio](https://docs.python.org/3/library/asyncio.html) directly:

+ ```python
async def main():
    await some_nonblocking_socket_io_op()
```

- When the `main()` function is synchronous (without the `async` qualifier), the function is automatically run in an `asyncio` thread-pool.
+ A function without the `async` keyword is automatically run in an asyncio thread-pool:

```python
# Runs in an asyncio thread-pool
@@ -308,7 +310,9 @@ def main():

### Use multiple language worker processes

- By default, every Functions host instance has a single language worker process. However there's support to have multiple language worker processes per host instance. Function invocations can then be evenly distributed among these language worker processes. Use the [FUNCTIONS_WORKER_PROCESS_COUNT](functions-app-settings.md#functions_worker_process_count) application setting to change this value.
+ By default, every Functions host instance has a single language worker process. You can increase the number of worker processes per host (up to 10) by using the [FUNCTIONS_WORKER_PROCESS_COUNT](functions-app-settings.md#functions_worker_process_count) application setting. Azure Functions then tries to evenly distribute simultaneous function invocations across these workers.
+
+ The FUNCTIONS_WORKER_PROCESS_COUNT setting applies to each host that Functions creates when scaling out your application to meet demand.

## Context
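
As a companion to the `async def` snippet in the hunk above, here is a slightly fuller sketch of an asynchronous, HTTP-triggered function. It is illustrative only and not part of the commit: the `main`/`req` names follow the doc's conventions, and `asyncio.sleep` stands in for a real non-blocking I/O call (a socket read, an awaitable SDK call, and so on).

```python
import asyncio

import azure.functions as func


async def main(req: func.HttpRequest) -> func.HttpResponse:
    # Non-blocking wait: while this coroutine is suspended, the single
    # Python worker's event loop can make progress on other invocations.
    await asyncio.sleep(1)  # stand-in for real awaitable I/O

    # By contrast, a blocking call such as time.sleep(1) here would hold
    # the worker and delay every other invocation on this process.
    return func.HttpResponse("finished asynchronous work")
```

The comments carry the point of the guidance: awaited calls yield the worker back to the event loop, while synchronous blocking calls hold it for the full duration of the invocation.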

articles/azure-functions/functions-scale.md

Lines changed: 2 additions & 0 deletions

@@ -160,6 +160,8 @@ Different triggers may also have different scaling limits as well as documented

There are many aspects of a function app that will impact how well it will scale, including host configuration, runtime footprint, and resource efficiency. For more information, see the [scalability section of the performance considerations article](functions-best-practices.md#scalability-best-practices). You should also be aware of how connections behave as your function app scales. For more information, see [How to manage connections in Azure Functions](manage-connections.md).

+ For additional information on scaling in Python and Node.js, see [Azure Functions Python developer guide - Scaling and concurrency](functions-reference-python.md#scaling-and-concurrency) and [Azure Functions Node.js developer guide - Scaling and concurrency](functions-reference-node.md#scaling-and-concurrency).
+
### Billing model

Billing for the different plans is described in detail on the [Azure Functions pricing page](https://azure.microsoft.com/pricing/details/functions/). Usage is aggregated at the function app level and counts only the time that function code is executed. The following are units for billing:
