articles/azure-functions/functions-best-practices.md (11 additions, 3 deletions)
@@ -3,7 +3,7 @@ title: Best Practices for Azure Functions
 description: Learn best practices and patterns for Azure Functions.
 ms.assetid: 9058fb2f-8a93-4036-a921-97a0772f503c
 ms.topic: conceptual
-ms.date: 10/16/2017
+ms.date: 12/17/2019
 ms.custom: H1Hack27Feb2017
@@ -67,7 +67,7 @@ There are a number of factors that impact how instances of your function app scale
 ### Share and manage connections
-Reuse connections to external resources whenever possible. See [how to manage connections in Azure Functions](./manage-connections.md).
+Reuse connections to external resources whenever possible. See [how to manage connections in Azure Functions](./manage-connections.md).
 ### Avoid sharing storage accounts
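As an editorial illustration of the connection-reuse guidance in the hunk above (a minimal Python sketch, not part of the changed article; the use of the `requests` library and the endpoint URL are assumptions for the example), a client created at module scope is reused across invocations on the same worker instead of being re-created inside the function body:

```python
import requests
import azure.functions as func

# Created once per worker process and reused across invocations,
# rather than opening a new connection pool on every call.
session = requests.Session()

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Reuses the pooled connections held by the module-level session.
    upstream = session.get("https://example.com/api/health")
    return func.HttpResponse(str(upstream.status_code))
```

The linked manage-connections article covers the equivalent .NET guidance (for example, a static `HttpClient`).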
@@ -85,10 +85,18 @@ Don't use verbose logging in production code, which has a negative performance impact
 ### Use async code but avoid blocking calls
-Asynchronous programming is a recommended best practice. However, always avoid referencing the `Result` property or calling the `Wait` method on a `Task` instance. This approach can lead to thread exhaustion.
+Asynchronous programming is a recommended best practice, especially when blocking I/O operations are involved.
+
+In C#, always avoid referencing the `Result` property or calling the `Wait` method on a `Task` instance. This approach can lead to thread exhaustion.
 [!INCLUDE [HTTP client best practices](../../includes/functions-http-client-best-practices.md)]
+### Use multiple worker processes
+
+By default, any host instance for Functions uses a single worker process. To improve performance, especially with single-threaded runtimes like Python, use the [FUNCTIONS_WORKER_PROCESS_COUNT](functions-app-settings.md#functions_worker_process_count) setting to increase the number of worker processes per host (up to 10). Azure Functions then tries to evenly distribute simultaneous function invocations across these workers.
+
+The FUNCTIONS_WORKER_PROCESS_COUNT setting applies to each host that Functions creates when scaling out your application to meet demand.
+
 ### Receive messages in batch whenever possible
 Some triggers like Event Hub enable receiving a batch of messages on a single invocation. Batching messages yields much better performance. You can configure the maximum batch size in the `host.json` file, as detailed in the [host.json reference documentation](functions-host-json.md).
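As an editorial illustration of the batching guidance above (a minimal Python sketch, not part of the changed article), an Event Hubs trigger can receive a whole batch per invocation; this assumes a function.json binding named `events` with `"cardinality": "many"`, and the batch size is bounded by the value configured in `host.json` (see the host.json reference linked above):

```python
import logging
import typing

import azure.functions as func

def main(events: typing.List[func.EventHubEvent]) -> None:
    # A single invocation processes the whole batch delivered by the trigger.
    for event in events:
        logging.info("Processing event: %s", event.get_body().decode("utf-8"))
```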
articles/azure-functions/functions-reference-node.md (11 additions, 1 deletion)
@@ -4,7 +4,7 @@ description: Understand how to develop functions by using JavaScript.
 ms.assetid: 45dedd78-3ff9-411f-bb4b-16d29a11384c
 ms.topic: reference
-ms.date: 02/24/2019
+ms.date: 12/17/2019
 ---
 # Azure Functions JavaScript developer guide
@@ -402,6 +402,16 @@ When you work with HTTP triggers, you can access the HTTP request and response objects
 context.done(null, res);
 ```
+## Scaling and concurrency
+
+By default, Azure Functions automatically monitors the load on your application and creates additional host instances for Node.js as needed. Functions uses built-in (not user-configurable) thresholds for different trigger types to decide when to add instances, such as the age of messages and queue size for QueueTrigger. For more information, see [How the consumption and premium plans work](functions-scale.md#how-the-consumption-and-premium-plans-work).
+
+This scaling behavior is sufficient for many Node.js applications. For CPU-bound applications, you can improve performance further by using multiple language worker processes.
+
+By default, every Functions host instance has a single language worker process. You can increase the number of worker processes per host (up to 10) by using the [FUNCTIONS_WORKER_PROCESS_COUNT](functions-app-settings.md#functions_worker_process_count) application setting. Azure Functions then tries to evenly distribute simultaneous function invocations across these workers.
+
+The FUNCTIONS_WORKER_PROCESS_COUNT setting applies to each host that Functions creates when scaling out your application to meet demand.
+
 ## Node version
 The following table shows the Node.js version used by each major version of the Functions runtime:
articles/azure-functions/functions-reference-python.md (16 additions, 12 deletions)
@@ -2,7 +2,7 @@
 title: Python developer reference for Azure Functions
 description: Understand how to develop functions with Python
 ms.topic: article
-ms.date: 04/16/2018
+ms.date: 12/13/2019
 ---
 # Azure Functions Python developer guide
@@ -276,28 +276,30 @@ In this function, the value of the `name` query parameter is obtained from the `
 Likewise, you can set the `status_code` and `headers` for the response message in the returned [HttpResponse] object.
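As an editorial illustration of the `HttpResponse` line above (a minimal sketch, not part of the diff; the JSON payload and header values are invented for the example):

```python
import json

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Set an explicit status code and custom headers on the returned response.
    payload = json.dumps({"name": req.params.get("name", "anonymous")})
    return func.HttpResponse(
        payload,
        status_code=200,
        headers={"Content-Type": "application/json"},
    )
```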
-## Concurrency
+## Scaling and concurrency
-By default, the Functions Python runtime can only process one invocation of a function at a time. This concurrency level might not be sufficient under one or more of the following conditions:
+By default, Azure Functions automatically monitors the load on your application and creates additional host instances for Python as needed. Functions uses built-in (not user-configurable) thresholds for different trigger types to decide when to add instances, such as the age of messages and queue size for QueueTrigger. For more information, see [How the consumption and premium plans work](functions-scale.md#how-the-consumption-and-premium-plans-work).
-+ You're trying to handle a number of invocations being made at the same time.
-+ You're processing a large number of I/O events.
-+ Your application is I/O bound.
+This scaling behavior is sufficient for many applications. Applications with any of the following characteristics, however, may not scale as effectively:
-In these situations, you can improve performance by running asynchronously and by using multiple language worker processes.
+- The application needs to handle many concurrent invocations.
+- The application processes a large number of I/O events.
+- The application is I/O bound.
+
+In such cases, you can improve performance further by employing async patterns and by using multiple language worker processes.
 ### Async
-We recommend that you use the `async def` statement to make your function run as an asynchronous coroutine.
+Because Python is a single-threaded runtime, a host instance for Python can process only one function invocation at a time. For applications that process a large number of I/O events and/or are I/O bound, you can improve performance by running functions asynchronously.
-```python
-# Runs with asyncio directly
+To run a function asynchronously, use the `async def` statement, which runs the function with [asyncio](https://docs.python.org/3/library/asyncio.html) directly:
+
+```python
 async def main():
     await some_nonblocking_socket_io_op()
 ```
-When the `main()` function is synchronous (without the `async` qualifier), the function is automatically run in an `asyncio` thread-pool.
+A function without the `async` keyword is run automatically in an asyncio thread-pool:
 ```python
 # Runs in an asyncio thread-pool
@@ -308,7 +310,9 @@ def main():
 ### Use multiple language worker processes
-By default, every Functions host instance has a single language worker process. However, there's support for multiple language worker processes per host instance. Function invocations can then be evenly distributed among these language worker processes. Use the [FUNCTIONS_WORKER_PROCESS_COUNT](functions-app-settings.md#functions_worker_process_count) application setting to change this value.
+By default, every Functions host instance has a single language worker process. You can increase the number of worker processes per host (up to 10) by using the [FUNCTIONS_WORKER_PROCESS_COUNT](functions-app-settings.md#functions_worker_process_count) application setting. Azure Functions then tries to evenly distribute simultaneous function invocations across these workers.
+
+The FUNCTIONS_WORKER_PROCESS_COUNT setting applies to each host that Functions creates when scaling out your application to meet demand.
articles/azure-functions/functions-scale.md (2 additions, 0 deletions)
@@ -160,6 +160,8 @@ Different triggers may also have different scaling limits as well as documented
 There are many aspects of a function app that will impact how well it will scale, including host configuration, runtime footprint, and resource efficiency. For more information, see the [scalability section of the performance considerations article](functions-best-practices.md#scalability-best-practices). You should also be aware of how connections behave as your function app scales. For more information, see [How to manage connections in Azure Functions](manage-connections.md).
+
+For additional information on scaling in Python and Node.js, see [Azure Functions Python developer guide - Scaling and concurrency](functions-reference-python.md#scaling-and-concurrency) and [Azure Functions Node.js developer guide - Scaling and concurrency](functions-reference-node.md#scaling-and-concurrency).
 ### Billing model
 Billing for the different plans is described in detail on the [Azure Functions pricing page](https://azure.microsoft.com/pricing/details/functions/). Usage is aggregated at the function app level and counts only the time that function code is executed. The following are units for billing: