In this article, learn how you can use Azure Data Lake Tools for Visual Studio Code (VS Code) to create, test, and run U-SQL scripts.

You can set the default context to apply this setting to all script files if you haven't set parameters for files individually.

### Set Git ignore

1. Select Ctrl+Shift+P to open the command palette.

2. Enter **ADL: Set Git Ignore**.

   - If you don't have a **.gitIgnore** file in your VS Code working folder, a file named **.gitIgnore** is created in your folder. Four items (**usqlCodeBehindReference**, **usqlCodeBehindGenerated**, **.cache**, **obj**) are added to the file by default. You can make more updates if needed.

   - If you already have a **.gitIgnore** file in your VS Code working folder, the tool adds the four items (**usqlCodeBehindReference**, **usqlCodeBehindGenerated**, **.cache**, **obj**) to your **.gitIgnore** file if they weren't already included.
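
   For reference, a freshly generated **.gitIgnore** would contain these four default entries:

   ```
   usqlCodeBehindReference
   usqlCodeBehindGenerated
   .cache
   obj
   ```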

U-SQL local run tests your local data and validates your script locally before you publish your code to Data Lake Analytics. You can use this feature to:

- Step through the code.
- Validate your script locally.

The local run and local debug feature works only in Windows environments, and isn't supported on macOS and Linux-based operating systems.

For instructions on local run and local debug, see [U-SQL local run and local debug with Visual Studio Code](data-lake-tools-for-vscode-local-run-and-debug.md).

### Connect to Azure

Before you can compile and run U-SQL scripts in Data Lake Analytics, you must connect to your Azure account.

1. Select Ctrl+Shift+P to open the command palette.
2. Enter **ADL: Login**. The sign-in information appears on the lower right.

### Create an extraction script

You can create an extraction script for .csv, .tsv, and .txt files by using the command **ADL: Create EXTRACT Script**.

The extraction script is generated based on your entries. If the tool can't detect the columns, it offers two script options for you to choose from; otherwise, it generates only one script.

Azure Data Lake and Stream Analytics Tools include functionality related to two Azure services, Azure Data Lake Analytics and Azure Stream Analytics. For more information about the Azure Stream Analytics scenarios, see [Azure Stream Analytics tools for Visual Studio](../stream-analytics/stream-analytics-tools-for-visual-studio-install.md).

This article describes how to use Visual Studio to create Azure Data Lake Analytics accounts. You can define jobs in [U-SQL](data-lake-analytics-u-sql-get-started.md) and submit them to the Data Lake Analytics service. For more information about Data Lake Analytics, see [Azure Data Lake Analytics overview](data-lake-analytics-overview.md).

ms.service: data-lake-analytics
ms.topic: how-to
ms.date: 10/14/2022
---

# Accessing diagnostic logs for Azure Data Lake Analytics

Diagnostic logging allows you to collect data access audit trails. These logs provide information such as a list of users who accessed the data, how often the data is accessed, and how much data is stored in the account.

1. Sign in to the [Azure portal](https://portal.azure.com).
2. Open your Data Lake Analytics account and select **Diagnostic settings** from the **Monitoring** section. Next, select **+ Add diagnostic setting**.

3. From **Diagnostics setting**, enter a name for this logging configuration and then select logging options.

* You can choose to store/process the data in four different ways.
* Select **Archive to a storage account** to store logs in an Azure storage account. Use this option if you want to archive the data. If you select this option, you must provide an Azure storage account to save the logs to.
* Select **Stream to an event hub** to stream log data to an Azure Event Hub. Use this option if you have a downstream processing pipeline that is analyzing incoming logs in real time. If you select this option, you must provide the details for the Azure Event Hub you want to use.
* Select **Send to Log Analytics workspace** to send the data to the Azure Monitor service. Use this option if you want to use Azure Monitor logs to gather and analyze logs.
* Select **Send to partner solution** if you want to use a partner integration. For more information, see the [partner solutions overview](../partner-solutions/overview.md).
* Specify whether you want to get audit logs, request logs, or both. A request log captures every API request. An audit log records all operations that are triggered by that API request.
* For **Archive to a storage account**, specify the number of days to retain the data.
* Select **Save**.

> [!NOTE]
> You must select **Archive to a storage account**, **Stream to an event hub**, **Send to Log Analytics workspace**, or **Send to partner solution** before selecting the **Save** button.
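
If you manage resources from the command line, you can create an equivalent diagnostic setting with the Azure CLI. The following is a minimal sketch, not the portal's exact behavior; the `myadla`, `myrg`, and `mylogstorage` names are hypothetical, and the `Audit` and `Requests` categories correspond to the audit and request logs described in this article:

```azurecli
# Look up the resource ID of the Data Lake Analytics account (hypothetical names).
adlaId=$(az resource show \
  --resource-group myrg \
  --name myadla \
  --resource-type "Microsoft.DataLakeAnalytics/accounts" \
  --query id --output tsv)

# Create a diagnostic setting that archives both log categories to a storage account.
az monitor diagnostic-settings create \
  --name adla-logs \
  --resource "$adlaId" \
  --storage-account mylogstorage \
  --logs '[{"category":"Audit","enabled":true},{"category":"Requests","enabled":true}]'
```
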
### Use the Azure Storage account that contains log data

1. To display the blob containers that hold logging data, open the Azure Storage account used for Data Lake Analytics logging, and then select **Containers**.
* The container **insights-logs-audit** contains the audit logs.
* The container **insights-logs-requests** contains the request logs.
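
If you prefer the command line, here's a hedged Azure CLI sketch that lists the audit log blobs; the storage account name is hypothetical:

```azurecli
# List the audit log blobs in the logging storage account (hypothetical name).
az storage blob list \
  --account-name mylogstorage \
  --container-name insights-logs-audit \
  --auth-mode login \
  --output table
```

Use **insights-logs-requests** as the container name to list the request logs instead.
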
This article describes how to use the Azure CLI command-line interface to create Azure Data Lake Analytics accounts, submit U-SQL jobs, and manage catalogs. The job reads a tab-separated values (TSV) file and converts it into a comma-separated values (CSV) file.

This article describes how to use the Azure portal to create Azure Data Lake Analytics accounts, define jobs in [U-SQL](data-lake-analytics-u-sql-get-started.md), and submit jobs to the Data Lake Analytics service.
## Prerequisites

Before you begin this tutorial, you must have an **Azure subscription**. If you don't have one, [get an Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
## Create a Data Lake Analytics account

Now, you'll create a Data Lake Analytics and an Azure Data Lake Storage Gen1 account at the same time. This step is simple and only takes about 60 seconds to finish. If you'd rather use the command line, a CLI sketch follows these steps.

1. Sign in to the [Azure portal](https://portal.azure.com).
1. Select **Create a resource**, and in the search box at the top of the page, enter **Data Lake Analytics**.
1. Select values for the following items:
   * **Name**: Name your Data Lake Analytics account (only lowercase letters and numbers allowed).
   * **Subscription**: Choose the Azure subscription used for the Analytics account.
   * **Resource Group**: Select an existing Azure resource group or create a new one.
   * **Location**: Select an Azure data center for the Data Lake Analytics account.
   * **Data Lake Storage Gen1**: Follow the instructions to create a new Data Lake Storage Gen1 account, or select an existing one.
1. Optionally, select a pricing tier for your Data Lake Analytics account.
1. Select **Create**.
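
If you'd rather script account creation, the following Azure CLI sketch mirrors these portal steps; the resource group, store and account names, and region are hypothetical:

```azurecli
# Create a resource group and a Data Lake Storage Gen1 account (hypothetical names).
az group create --name myrg --location eastus2
az dls account create --account mydlsgen1 --resource-group myrg --location eastus2

# Create the Data Lake Analytics account, linking the Gen1 store as its default.
az dla account create \
  --account myadla \
  --resource-group myrg \
  --location eastus2 \
  --default-data-lake-store mydlsgen1
```
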
## Your first U-SQL script
The following text is a simple U-SQL script. All it does is define a small dataset within the script and then write that dataset out to the default Data Lake Storage Gen1 account as a file called `/data.csv`.
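
Here's a minimal sketch of such a script (the sample rows are illustrative), along with a hypothetical Azure CLI submission:

```azurecli
# data.usql: define a tiny rowset inline and write it out to /data.csv.
cat > data.usql <<'EOF'
@rows =
    SELECT * FROM
        (VALUES
            ("Contoso", 1500.0),
            ("Woodgrove", 2700.0)
        ) AS D(customer, amount);

OUTPUT @rows
    TO "/data.csv"
    USING Outputters.Csv();
EOF

# Submit the script to a Data Lake Analytics account (account name is hypothetical).
az dla job submit --account myadla --job-name "first-usql-script" --script "$(cat data.usql)"
```
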
Learn how to use Azure PowerShell to create Azure Data Lake Analytics accounts and then submit and run U-SQL jobs. For more information about Data Lake Analytics, see [Azure Data Lake Analytics overview](data-lake-analytics-overview.md).

description: Data Lake Analytics lets you drive your business using insights gained in your cloud data at any scale.
author: saveenr
ms.author: saveenr
ms.reviewer: whhender
ms.service: data-lake-analytics
ms.topic: overview
ms.date: 10/17/2022
---

# What is Azure Data Lake Analytics?
Azure Data Lake Analytics is an on-demand analytics job service that simplifies big data. Instead of deploying, configuring, and tuning hardware, you write queries to transform your data and extract valuable insights. The analytics service can handle jobs of any scale instantly by setting the dial for how much power you need. You only pay for your job when it's running, making it cost-effective.
## Azure Data Lake Analytics recent update information
The Azure Data Lake Analytics service is updated periodically. We continue to support the service with component updates, component beta previews, and so on.

- For general information about recent updates, see [What's new in Data Lake Analytics?](data-lake-analytics-whats-new.md).
- For details about each update, see the [Azure Data Lake Analytics release notes](https://github.com/Azure/AzureDataLake/tree/master/docs/Release_Notes).
## Dynamic scaling
Data Lake Analytics dynamically provisions resources and lets you do analytics on terabytes to petabytes of data. You pay only for the processing power used. As you increase or decrease the size of data stored or the amount of compute resources used, you don’t have to rewrite code.
## Develop faster, debug, and optimize smarter using familiar tools
Data Lake Analytics is a cost-effective solution for running big data workloads.

## Works with all your Azure data
Data Lake Analytics works with **Azure Data Lake Storage Gen1** for the highest performance, throughput, and parallelization, and also works with Azure Storage blobs, Azure SQL Database, and Azure Synapse Analytics.

> [!NOTE]
> Data Lake Analytics doesn't work with Azure Data Lake Storage Gen2 until further notice.
## In-region data residency
Data Lake Analytics doesn't move or store customer data out of the region in which it's deployed.
## Next steps
- For recent updates, see [What's new in Azure Data Lake Analytics?](data-lake-analytics-whats-new.md)
- Get started with Data Lake Analytics using the [Azure portal](data-lake-analytics-get-started-portal.md) | [Azure PowerShell](data-lake-analytics-get-started-powershell.md) | [CLI](data-lake-analytics-get-started-cli.md)
- Manage Azure Data Lake Analytics using [Azure portal](data-lake-analytics-manage-use-portal.md) | [Azure PowerShell](data-lake-analytics-manage-use-powershell.md) | [CLI](data-lake-analytics-manage-use-cli.md) | [Azure .NET SDK](data-lake-analytics-manage-use-dotnet-sdk.md) | [Node.js](data-lake-analytics-manage-use-nodejs.md)
- [How to control costs and save money with Data Lake Analytics](https://1drv.ms/f/s!AvdZLquGMt47h213Hg3rhl-Tym1c)