articles/azure-functions/streaming-logs.md (40 additions & 41 deletions)
@@ -1,7 +1,7 @@
 ---
 title: Stream execution logs in Azure Functions
 description: Learn how you can stream logs for functions in near real time.
-ms.date: 8/21/2023
+ms.date: 6/16/2025
 ms.topic: how-to
 ms.devlang: azurecli
 ms.custom: devx-track-azurepowershell
@@ -14,77 +14,76 @@ While developing an application, you often want to see what's being written to t

 There are two ways to view a stream of log files being generated by your function executions.

-***Live Metrics Stream (recommended)**: when your function app is [connected to Application Insights](configure-monitoring.md#enable-application-insights-integration), you can view log data and other metrics in near real-time in the Azure portal using [Live Metrics Stream](/azure/azure-monitor/app/live-stream). Use this method when monitoring functions running on multiple-instances and supports all plan types. This method uses [sampled data](configure-monitoring.md#configure-sampling).
+## [Live Metrics](#tab/live-metrics)

-***Built-in log streaming**: the App Service platform lets you view a stream of your application log files. This is equivalent to the output seen when you debug your functions during [local development](functions-develop-local.md) and when you use the **Test** tab in the portal. All log-based information is displayed. For more information, see [Stream logs](../app-service/troubleshoot-diagnostic-logs.md#stream-logs). This streaming method supports only a single instance, and can't be used with an app running on Linux in a Consumption plan. When your function is scaled to multiple instances, data from other instances isn't shown using this method.
+When your function app is [connected to Application Insights](configure-monitoring.md#enable-application-insights-integration), you can view log data and other metrics in near real-time in the Azure portal using [Live Metrics Stream](/azure/azure-monitor/app/live-stream). Use this method when monitoring functions running on multiple instances; it supports all plan types. This method uses [sampled data](configure-monitoring.md#configure-sampling). _This is the recommended way to view streaming logs._

-Log streams can be viewed both in the portal and in most local development environments.
+>[!IMPORTANT]
+>By default, the Live Metrics stream includes logs from all apps connected to a given Application Insights instance. When you have more than one app sending log data, you should [filter your log stream data](/azure/azure-monitor/app/live-stream#filter-by-server-instance).

-## [Portal](#tab/azure-portal)
+## [Built-in logs](#tab/built-in)

-You can view both types of log streams in the portal.
+The App Service platform lets you view a stream of your application log files. This is equivalent to the output seen when you debug your functions during [local development](functions-develop-local.md) and when you use the **Test** tab in the portal. All log-based information is displayed. For more information, see [Stream logs](../app-service/troubleshoot-diagnostic-logs.md#stream-logs). This streaming method supports only a single instance, and can't be used with an app running on Linux in a Consumption plan. When your function is scaled to multiple instances, data from other instances isn't shown using this method.

-To view streaming logs in the portal, select the **Platform features** tab in your function app. Then, under **Monitoring**, choose **Log streaming**.
+---

-
+Log streams can be viewed both in the portal and in most local development environments. The way that you enable and view streaming logs depends on your log streaming method, either Live Metrics or built-in.

-This connects your app to the log streaming service and application logs are displayed in the window. You can toggle between **Application logs** and **Web server logs**.
+## [Azure portal](#tab/azure-portal/live-metrics)

-
+1. To view the Live Metrics Stream for your app, select the **Overview** tab of your function app.

-To view the Live Metrics Stream for your app, select the **Overview** tab of your function app. When you have Application Insights enabled, you see an **Application Insights** link under **Configured features**. This link takes you to the Application Insights page for your app.
+1. When you have Application Insights enabled, you see an **Application Insights** link under **Configured features**. This link takes you to the Application Insights page for your app.

-In Application Insights, select **Live Metrics Stream**. [Sampled log entries](configure-monitoring.md#configure-sampling) are displayed under **Sample Telemetry**.
+1. In Application Insights, select **Live Metrics Stream**. [Sampled log entries](configure-monitoring.md#configure-sampling) are displayed under **Sample Telemetry**.

 
-## [Visual Studio Code](#tab/vs-code)
+## [Visual Studio Code](#tab/vs-code/live-metrics)

-To turn on the streaming logs for your function app in Azure:
+Run this command in the Terminal to display the Live Metrics Stream in a new browser window:
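A minimal sketch of such a command using Azure Functions Core Tools, assuming `<FunctionAppName>` stands in for your function app's name; the `--browser` option opens the Live Metrics view in your default browser:

```bash
# Stream logs for the app and open Live Metrics in the default browser.
# Requires Azure Functions Core Tools and an authenticated Azure session.
func azure functionapp logstream <FunctionAppName> --browser
```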

-1. Select F1 to open the command palette, and then search for and run the command **Azure Functions: Start Streaming Logs**.
 Use the [`func azure functionapp logstream` command](functions-core-tools-reference.md#func-azure-functionapp-logstream) to start receiving streaming logs of a specific function app running in Azure, as in this example:
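A minimal sketch of that command, assuming `<FunctionAppName>` is replaced with the name of your function app in Azure and that your Azure credentials are available to Core Tools:

```bash
# Stream the built-in log output of a specific function app running in Azure.
# Press Ctrl+C to stop streaming.
func azure functionapp logstream <FunctionAppName>
```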
+To view streaming logs in the portal, select the **Platform features** tab in your function app. Then, under **Monitoring**, choose **Log streaming**.
 

->[!NOTE]
->Because built-in log streaming isn't yet enabled for function apps running on Linux in a Consumption plan, you need to instead enable the [Live Metrics Stream](/azure/azure-monitor/app/live-stream) to view the logs in near-real time.
+This connects your app to the log streaming service and application logs are displayed in the window. You can toggle between **Application logs** and **Web server logs**.

-Use this command to display the Live Metrics Stream in a new browser window.
+
 To turn on the streaming logs for your function app in Azure:

-## [Azure CLI](#tab/azure-cli)
+1. Select F1 to open the command palette, and then search for and run the command **Azure Functions: Start Streaming Logs**.

-You can enable streaming logs by using the [Azure CLI](/cli/azure/install-azure-cli). Use the following commands to sign in, choose your subscription, and stream log files:
+1. Select your function app in Azure, and then select **Yes** to enable application logging for the function app.

-```azurecli
-az login
-az account list
-az account set --subscription <subscriptionNameOrId>
-az webapp log tail --resource-group <RESOURCE_GROUP_NAME> --name <FUNCTION_APP_NAME>
-```
+1. Trigger your functions in Azure. Notice that log data is displayed in the Output window in Visual Studio Code.

-## [Azure PowerShell](#tab/azure-powershell)
+1. When you're done, remember to run the command **Azure Functions: Stop Streaming Logs** to disable logging for the function app.

-You can enable streaming logs by using [Azure PowerShell](/powershell/azure/). For PowerShell, use the [Set-AzWebApp](/powershell/module/az.websites/set-azwebapp) command to enable logging on the function app, as shown in the following snippet:
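The snippet itself isn't included above; a rough sketch of what such a `Set-AzWebApp` call could look like, assuming the Az.Websites module is installed and `<RESOURCE_GROUP_NAME>` and `<FUNCTION_APP_NAME>` are placeholders (the exact switches used in the linked sample may differ):

```azurepowershell
# Turn on web server logging, failed request tracing, and detailed error logging for the function app.
Set-AzWebApp -ResourceGroupName <RESOURCE_GROUP_NAME> -Name <FUNCTION_APP_NAME> `
    -HttpLoggingEnabled $true -RequestTracingEnabled $true -DetailedErrorLoggingEnabled $true
```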
 Use the [`func azure functionapp logstream`](functions-core-tools-reference.md#func-azure-functionapp-logstream) command to start receiving streaming logs of a specific function app running in Azure, as in this example:

-For more information, see the [complete code example](../app-service/scripts/powershell-monitor.md#sample-script).
articles/azure-netapp-files/configure-ldap-extended-groups.md (3 additions & 3 deletions)
@@ -1,5 +1,5 @@
 ---
-title: Enable Active Directory Domain Services (AD DS) LDAP authentication for NFS volumes | Microsoft Docs
+title: Enable Active Directory Domain Services (AD DS) LDAP authentication for NFS volumes
 description: Describes the considerations and steps for enabling LDAP with extended groups when you create an NFS volume by using Azure NetApp Files.
 services: azure-netapp-files
 author: b-hchen
@@ -8,12 +8,12 @@ ms.topic: how-to
 ms.date: 02/21/2025
 ms.author: anfdocs
 ---
-# Enable Active Directory Domain Services (AD DS) LDAP authentication for NFS volumes
+# Enable Active Directory Domain Services (AD DS) LDAP authentication for NFS

 When you [create an NFS volume](azure-netapp-files-create-volumes.md), you can enable the LDAP with extended groups feature (the **LDAP** option) for the volume. This feature enables Active Directory LDAP users and extended groups (up to 1024 groups) to access files and directories in the volume. You can use the LDAP with extended groups feature with both NFSv4.1 and NFSv3 volumes.

 > [!NOTE]
-> By default, in Active Directory LDAP servers, the `MaxPageSize` attribute is set to a default of 1,000. This setting means that groups beyond 1,000 are truncated in LDAP queries. To enable full support with the 1,024 value for extended groups, the `MaxPageSiz`e attribute must be modified to reflect the 1,024 value. For information about how to change that value, see [How to view and set LDAP policy in Active Directory by using Ntdsutil.exe](/troubleshoot/windows-server/identity/view-set-ldap-policy-using-ntdsutil).
+> By default, in Active Directory LDAP servers, the `MaxPageSize` attribute is set to a default of 1,000. This setting means that groups beyond 1,000 are truncated in LDAP queries. To enable full support with the 1,024 value for extended groups, the `MaxPageSize` attribute must be modified to reflect the 1,024 value. For information about how to change that value, see [How to view and set LDAP policy in Active Directory by using Ntdsutil.exe](/troubleshoot/windows-server/identity/view-set-ldap-policy-using-ntdsutil).
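The linked article has the authoritative steps; as a rough, illustrative sketch (run on a domain controller, with `<DomainControllerName>` as a placeholder), the Ntdsutil.exe session might look like this:

```console
C:\> ntdsutil
ntdsutil: ldap policies
ldap policy: connections
server connections: connect to server <DomainControllerName>
server connections: q
ldap policy: set maxpagesize to 1024
ldap policy: commit changes
ldap policy: show values
ldap policy: q
ntdsutil: q
```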

 Azure NetApp Files supports fetching of extended groups from the LDAP name service rather than from the RPC header. Azure NetApp Files interacts with LDAP by querying for attributes such as usernames, numeric IDs, groups, and group memberships for NFS protocol operations.
articles/cost-management-billing/reservations/prepay-databricks-reserved-capacity.md (4 additions & 16 deletions)
@@ -20,22 +20,10 @@ The prepurchase discount applies only to the DBU usage. Other charges such as co

 ## Determine the right size to buy

-Databricks prepurchase applies to all Databricks workloads and tiers. You can think of the prepurchase as a pool of prepaid Databricks commit units. Usage is deducted from the pool, regardless of the workload or tier. Usage is deducted in the following ratios:
-
-| Workload | DBU application ratio - Standard tier | DBU application ratio - Premium tier |
-| --- | --- | --- |
-| All-purpose compute | 0.4 | 0.55 |
-| Jobs compute | 0.15 | 0.30 |
-| Jobs light compute | 0.07 | 0.22 |
-| SQL compute | N/A | 0.22 |
-| SQL Pro compute | N/A | 0.55 |
-| Serverless SQL | N/A | 0.70 |
-| Serverless real-time inference | N/A | 0.082 |
-| Model training | N/A | 0.65 |
-| Delta Live Tables | NA | 0.30 (core), 0.38 (pro), 0.54 (advanced) |
-| All Purpose Photon | NA | 0.55 |
-
-For example, when All-purpose compute – Standard Tier capacity gets consumed, the prepurchased Databricks commit units get deducted by 0.4 units. When Jobs light compute – Standard Tier capacity gets used, the prepurchased Databricks commit unit gets deducted by 0.07 units.
+Databricks prepurchase applies to all Databricks workloads and tiers. You can think of the prepurchase as a pool of prepaid Databricks commit units. Usage is deducted from the pool, regardless of the workload or tier, at a ratio matching the DBU list price of the workload. Workload pricing can be found on the [Azure Databricks pricing page](https://azure.microsoft.com/pricing/details/databricks/).
+
+For example, when All-purpose compute – Premium Tier DBUs are consumed, the prepurchased Databricks commit units get deducted by 0.55 units. When Jobs compute – Premium Tier capacity gets used, the prepurchased Databricks commit unit gets deducted by 0.30 units.
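As an illustrative calculation (the DBU counts are hypothetical; the ratios are the ones cited in the example above): a workload that consumes 1,000 All-purpose compute – Premium Tier DBUs draws down 1,000 × 0.55 = 550 Databricks commit units from the prepurchased pool, while 1,000 Jobs compute – Premium Tier DBUs draw down 1,000 × 0.30 = 300 units.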