
Commit f3260ea

Merge pull request #301814 from MicrosoftDocs/main
Merged by Learn.Build PR Management system
2 parents 8bd7f69 + 1edba8e commit f3260ea

18 files changed: +115 −128 lines changed

articles/azure-functions/streaming-logs.md

Lines changed: 40 additions & 41 deletions
@@ -1,7 +1,7 @@
 ---
 title: Stream execution logs in Azure Functions
 description: Learn how you can stream logs for functions in near real time.
-ms.date: 8/21/2023
+ms.date: 6/16/2025
 ms.topic: how-to
 ms.devlang: azurecli
 ms.custom: devx-track-azurepowershell
@@ -14,77 +14,76 @@ While developing an application, you often want to see what's being written to t
 
 There are two ways to view a stream of log files being generated by your function executions.
 
-* **Live Metrics Stream (recommended)**: when your function app is [connected to Application Insights](configure-monitoring.md#enable-application-insights-integration), you can view log data and other metrics in near real-time in the Azure portal using [Live Metrics Stream](/azure/azure-monitor/app/live-stream). Use this method when monitoring functions running on multiple-instances and supports all plan types. This method uses [sampled data](configure-monitoring.md#configure-sampling).
+## [Live Metrics](#tab/live-metrics)
 
-* **Built-in log streaming**: the App Service platform lets you view a stream of your application log files. This is equivalent to the output seen when you debug your functions during [local development](functions-develop-local.md) and when you use the **Test** tab in the portal. All log-based information is displayed. For more information, see [Stream logs](../app-service/troubleshoot-diagnostic-logs.md#stream-logs). This streaming method supports only a single instance, and can't be used with an app running on Linux in a Consumption plan. When your function is scaled to multiple instances, data from other instances isn't shown using this method.
+When your function app is [connected to Application Insights](configure-monitoring.md#enable-application-insights-integration), you can view log data and other metrics in near real time in the Azure portal using [Live Metrics Stream](/azure/azure-monitor/app/live-stream). Use this method to monitor functions running on multiple instances; it supports all plan types. This method uses [sampled data](configure-monitoring.md#configure-sampling). _This is the recommended way to view streaming logs._
 
-Log streams can be viewed both in the portal and in most local development environments.
+>[!IMPORTANT]
+>By default, the Live Metrics stream includes logs from all apps connected to a given Application Insights instance. When you have more than one app sending log data, you should [filter your log stream data](/azure/azure-monitor/app/live-stream#filter-by-server-instance).
 
-## [Portal](#tab/azure-portal)
+## [Built-in logs](#tab/built-in)
 
-You can view both types of log streams in the portal.
+The App Service platform lets you view a stream of your application log files. This is equivalent to the output seen when you debug your functions during [local development](functions-develop-local.md) and when you use the **Test** tab in the portal. All log-based information is displayed. For more information, see [Stream logs](../app-service/troubleshoot-diagnostic-logs.md#stream-logs). This streaming method supports only a single instance, and can't be used with an app running on Linux in a Consumption plan. When your function is scaled to multiple instances, data from other instances isn't shown using this method.
 
-To view streaming logs in the portal, select the **Platform features** tab in your function app. Then, under **Monitoring**, choose **Log streaming**.
+---
 
-![Enable streaming logs in the portal](./media/functions-monitoring/enable-streaming-logs-portal.png)
+Log streams can be viewed both in the portal and in most local development environments. The way that you enable and view streaming logs depends on your log streaming method, either Live Metrics or built-in.
 
-This connects your app to the log streaming service and application logs are displayed in the window. You can toggle between **Application logs** and **Web server logs**.
+## [Azure portal](#tab/azure-portal/live-metrics)
 
-![View streaming logs in the portal](./media/functions-monitoring/streaming-logs-window.png)
+1. To view the Live Metrics Stream for your app, select the **Overview** tab of your function app.
 
-To view the Live Metrics Stream for your app, select the **Overview** tab of your function app. When you have Application Insights enabled, you see an **Application Insights** link under **Configured features**. This link takes you to the Application Insights page for your app.
+1. When you have Application Insights enabled, you see an **Application Insights** link under **Configured features**. This link takes you to the Application Insights page for your app.
 
-In Application Insights, select **Live Metrics Stream**. [Sampled log entries](configure-monitoring.md#configure-sampling) are displayed under **Sample Telemetry**.
+1. In Application Insights, select **Live Metrics Stream**. [Sampled log entries](configure-monitoring.md#configure-sampling) are displayed under **Sample Telemetry**.
 
 ![View Live Metrics Stream in the portal](./media/functions-monitoring/live-metrics-stream.png)
 
-## [Visual Studio Code](#tab/vs-code)
+## [Visual Studio Code](#tab/vs-code/live-metrics)
 
-To turn on the streaming logs for your function app in Azure:
+Run this command in the Terminal to display the Live Metrics Stream in a new browser window:
 
-1. Select F1 to open the command palette, and then search for and run the command **Azure Functions: Start Streaming Logs**.
+```bash
+func azure functionapp logstream <FunctionAppName> --browser
+```
 
-1. Select your function app in Azure, and then select **Yes** to enable application logging for the function app.
+## [Core Tools](#tab/core-tools/live-metrics)
 
-1. Trigger your functions in Azure. Notice that log data is displayed in the Output window in Visual Studio Code.
+Use this command to display the Live Metrics Stream in a new browser window:
 
-1. When you're done, remember to run the command **Azure Functions: Stop Streaming Logs** to disable logging for the function app.
+```bash
+func azure functionapp logstream <FunctionAppName> --browser
+```
 
-## [Core Tools](#tab/core-tools)
+## [Azure portal](#tab/azure-portal/built-in)
 
-Use the [`func azure functionapp logstream` command](functions-core-tools-reference.md#func-azure-functionapp-list-functions) to start receiving streaming logs of a specific function app running in Azure, as in this example:
+To view streaming logs in the portal, select the **Platform features** tab in your function app. Then, under **Monitoring**, choose **Log streaming**.
 
-```bash
-func azure functionapp logstream <FunctionAppName>
-```
+![Enable streaming logs in the portal](./media/functions-monitoring/enable-streaming-logs-portal.png)
 
->[!NOTE]
->Because built-in log streaming isn't yet enabled for function apps running on Linux in a Consumption plan, you need to instead enable the [Live Metrics Stream](/azure/azure-monitor/app/live-stream) to view the logs in near-real time.
+This connects your app to the log streaming service, and application logs are displayed in the window. You can toggle between **Application logs** and **Web server logs**.
 
-Use this command to display the Live Metrics Stream in a new browser window.
+![View streaming logs in the portal](./media/functions-monitoring/streaming-logs-window.png)
 
-```bash
-func azure functionapp logstream <FunctionAppName> --browser
-```
+## [Visual Studio Code](#tab/vs-code/built-in)
+
+To turn on the streaming logs for your function app in Azure:
 
-## [Azure CLI](#tab/azure-cli)
+1. Select F1 to open the command palette, and then search for and run the command **Azure Functions: Start Streaming Logs**.
 
-You can enable streaming logs by using the [Azure CLI](/cli/azure/install-azure-cli). Use the following commands to sign in, choose your subscription, and stream log files:
+1. Select your function app in Azure, and then select **Yes** to enable application logging for the function app.
 
-```azurecli
-az login
-az account list
-az account set --subscription <subscriptionNameOrId>
-az webapp log tail --resource-group <RESOURCE_GROUP_NAME> --name <FUNCTION_APP_NAME>
-```
+1. Trigger your functions in Azure. Notice that log data is displayed in the Output window in Visual Studio Code.
 
-## [Azure PowerShell](#tab/azure-powershell)
+1. When you're done, remember to run the command **Azure Functions: Stop Streaming Logs** to disable logging for the function app.
 
-You can enable streaming logs by using [Azure PowerShell](/powershell/azure/). For PowerShell, use the [Set-AzWebApp](/powershell/module/az.websites/set-azwebapp) command to enable logging on the function app, as shown in the following snippet:
+## [Core Tools](#tab/core-tools/built-in)
 
-:::code language="powershell" source="~/powershell_scripts/app-service/monitor-with-logs/monitor-with-logs.ps1" range="19-20":::
+Use the [`func azure functionapp logstream`](functions-core-tools-reference.md#func-azure-functionapp-list-functions) command to start receiving streaming logs of a specific function app running in Azure, as in this example:
 
-For more information, see the [complete code example](../app-service/scripts/powershell-monitor.md#sample-script).
+```bash
+func azure functionapp logstream <FunctionAppName>
+```
 
 ---
 

articles/azure-netapp-files/configure-ldap-extended-groups.md

Lines changed: 3 additions & 3 deletions
@@ -1,5 +1,5 @@
 ---
-title: Enable Active Directory Domain Services (AD DS) LDAP authentication for NFS volumes | Microsoft Docs
+title: Enable Active Directory Domain Services (AD DS) LDAP authentication for NFS volumes
 description: Describes the considerations and steps for enabling LDAP with extended groups when you create an NFS volume by using Azure NetApp Files.
 services: azure-netapp-files
 author: b-hchen
@@ -8,12 +8,12 @@ ms.topic: how-to
 ms.date: 02/21/2025
 ms.author: anfdocs
 ---
-# Enable Active Directory Domain Services (AD DS) LDAP authentication for NFS volumes
+# Enable Active Directory Domain Services (AD DS) LDAP authentication for NFS
 
 When you [create an NFS volume](azure-netapp-files-create-volumes.md), you can enable the LDAP with extended groups feature (the **LDAP** option) for the volume. This feature enables Active Directory LDAP users and extended groups (up to 1024 groups) to access files and directories in the volume. You can use the LDAP with extended groups feature with both NFSv4.1 and NFSv3 volumes.
 
 > [!NOTE]
-> By default, in Active Directory LDAP servers, the `MaxPageSize` attribute is set to a default of 1,000. This setting means that groups beyond 1,000 are truncated in LDAP queries. To enable full support with the 1,024 value for extended groups, the `MaxPageSiz`e attribute must be modified to reflect the 1,024 value. For information about how to change that value, see [How to view and set LDAP policy in Active Directory by using Ntdsutil.exe](/troubleshoot/windows-server/identity/view-set-ldap-policy-using-ntdsutil).
+> By default, in Active Directory LDAP servers, the `MaxPageSize` attribute is set to a default of 1,000. This setting means that groups beyond 1,000 are truncated in LDAP queries. To enable full support with the 1,024 value for extended groups, the `MaxPageSize` attribute must be modified to reflect the 1,024 value. For information about how to change that value, see [How to view and set LDAP policy in Active Directory by using Ntdsutil.exe](/troubleshoot/windows-server/identity/view-set-ldap-policy-using-ntdsutil).
 
 Azure NetApp Files supports fetching of extended groups from the LDAP name service rather than from the RPC header. Azure NetApp Files interacts with LDAP by querying for attributes such as usernames, numeric IDs, groups, and group memberships for NFS protocol operations.
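The `MaxPageSize` note in this file can be illustrated numerically: with the default policy of 1,000, a single-page LDAP query for a user's group memberships returns at most 1,000 entries, so memberships beyond that are dropped. A minimal Python sketch of that truncation (hypothetical helper name, not the Azure NetApp Files implementation):

```python
def query_group_memberships(groups, max_page_size=1000):
    """Model a single-page LDAP query: entries beyond the server's
    MaxPageSize policy are truncated from the result."""
    return groups[:max_page_size]

# A user in 1,024 groups loses 24 memberships at the default policy.
all_groups = [f"group{i}" for i in range(1024)]
assert len(query_group_memberships(all_groups)) == 1000

# Raising MaxPageSize to 1,024 covers the extended-groups limit.
assert len(query_group_memberships(all_groups, max_page_size=1024)) == 1024
```

This is why the article directs you to raise `MaxPageSize` to 1,024 before relying on the full extended-groups range.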

articles/azure-netapp-files/includes/netlogon-april-2023.md

Lines changed: 1 addition & 3 deletions
@@ -1,10 +1,8 @@
 ---
-title: include file
-description: include file
 author: b-ahibbard
 ms.service: azure-netapp-files
 ms.topic: include
-ms.date: 04/06/2023
+ms.date: 12/06/2024
 ms.author: anfdocs
 ms.custom: include file

articles/azure-netapp-files/network-attached-file-permissions-nfs.md

Lines changed: 1 addition & 1 deletion
@@ -5,7 +5,7 @@ services: azure-netapp-files
 author: b-ahibbard
 ms.service: azure-netapp-files
 ms.topic: concept-article
-ms.date: 11/13/2023
+ms.date: 02/13/2025
 ms.author: anfdocs
 ---

articles/cost-management-billing/reservations/prepay-databricks-reserved-capacity.md

Lines changed: 4 additions & 16 deletions
@@ -20,22 +20,10 @@ The prepurchase discount applies only to the DBU usage. Other charges such as co
 
 ## Determine the right size to buy
 
-Databricks prepurchase applies to all Databricks workloads and tiers. You can think of the prepurchase as a pool of prepaid Databricks commit units. Usage is deducted from the pool, regardless of the workload or tier. Usage is deducted in the following ratios:
-
-| Workload | DBU application ratio - Standard tier | DBU application ratio - Premium tier |
-| --- | --- | --- |
-| All-purpose compute | 0.4 | 0.55 |
-| Jobs compute | 0.15 | 0.30 |
-| Jobs light compute | 0.07 | 0.22 |
-| SQL compute | N/A | 0.22 |
-| SQL Pro compute | N/A | 0.55 |
-| Serverless SQL | N/A | 0.70 |
-| Serverless real-time inference | N/A | 0.082 |
-| Model training | N/A | 0.65 |
-| Delta Live Tables | NA | 0.30 (core), 0.38 (pro), 0.54 (advanced) |
-| All Purpose Photon | NA | 0.55 |
-
-For example, when All-purpose compute – Standard Tier capacity gets consumed, the prepurchased Databricks commit units get deducted by 0.4 units. When Jobs light compute – Standard Tier capacity gets used, the prepurchased Databricks commit unit gets deducted by 0.07 units.
+Databricks prepurchase applies to all Databricks workloads and tiers. You can think of the prepurchase as a pool of prepaid Databricks commit units. Usage is deducted from the pool, regardless of the workload or tier, at a ratio matching the DBU list price of the workload. Workload pricing can be found on the [Azure Databricks pricing page](https://azure.microsoft.com/pricing/details/databricks/).
+
+For example, when All-purpose compute – Premium Tier DBUs are consumed, the prepurchased Databricks commit units get deducted by 0.55 units. When Jobs compute – Premium Tier capacity gets used, the prepurchased Databricks commit units get deducted by 0.30 units.
 
 >[!NOTE]
 > Enabling Photon increases the DBU count.
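The pool-deduction example in this file can be sketched in a few lines of Python. The purchase size is hypothetical; the ratios (0.55 and 0.30 for the Premium-tier workloads) come from the example above:

```python
# Deduction ratios quoted in the example above (Premium tier).
RATIOS = {
    "All-purpose compute (Premium)": 0.55,
    "Jobs compute (Premium)": 0.30,
}

def deduct(pool_units, workload, dbus_consumed):
    """Deduct prepurchased Databricks commit units for DBU usage,
    at the ratio for the given workload and tier."""
    return pool_units - RATIOS[workload] * dbus_consumed

# Hypothetical 1,000-unit prepurchase: 100 All-purpose Premium DBUs
# draw down 55 units, then 100 Jobs Premium DBUs draw down 30 more.
pool = 1000.0
pool = deduct(pool, "All-purpose compute (Premium)", 100)
pool = deduct(pool, "Jobs compute (Premium)", 100)
print(round(pool, 2))  # 915.0
```

The same pool serves every workload and tier; only the deduction ratio differs.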
