Commit daeb9f7

Merge pull request #214630 from whhender/ADLA-freshness-sweep1
ADLA Freshness sweep
2 parents 81427cd + 295ab50 commit daeb9f7

12 files changed: 107 additions, 55 deletions

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-for-vscode.md

Lines changed: 8 additions & 6 deletions
@@ -2,13 +2,15 @@
 title: Use Azure Data Lake Tools for Visual Studio Code
 description: Learn how to use Azure Data Lake Tools for Visual Studio Code to create, test, and run U-SQL scripts.
 ms.service: data-lake-analytics
-ms.reviewer: jasonh
+ms.reviewer: whhender
 ms.topic: how-to
-ms.date: 02/09/2018
+ms.date: 10/17/2022
 ---

 # Use Azure Data Lake Tools for Visual Studio Code

+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 In this article, learn how you can use Azure Data Lake Tools for Visual Studio Code (VS Code) to create, test, and run U-SQL scripts. The information is also covered in the following video:

 ![Video player: Azure Data Lake tools for VS Code](media/data-lake-analytics-data-lake-tools-for-vscode/data-lake-tools-for-vscode-video.png)
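
To make the create-test-run workflow concrete, here is a minimal U-SQL script of the kind the article walks through. This is a sketch only; the rowset values and output path are illustrative and are not part of this commit:

```usql
// Define a small rowset inline and write it out as a CSV file.
@departments =
    SELECT * FROM
        (VALUES
            (31, "Sales"),
            (33, "Engineering")
        ) AS D(DepID, DepName);

OUTPUT @departments
    TO "/output/departments.csv"
    USING Outputters.Csv();
```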
@@ -131,7 +133,7 @@ You can set the default context to apply this setting to all script files if you
 2. Enter **ADL: Set Git Ignore**.

 - If you don’t have a **.gitIgnore** file in your VS Code working folder, a file named **.gitIgnore** is created in your folder. Four items (**usqlCodeBehindReference**, **usqlCodeBehindGenerated**, **.cache**, **obj**) are added in the file by default. You can make more updates if needed.
-- If you already have a **.gitIgnore** file in your VS Code working folder, the tool adds four items (**usqlCodeBehindReference**, **usqlCodeBehindGenerated**, **.cache**, **obj**) in your **.gitIgnore** file if the four items were not included in the file.
+- If you already have a **.gitIgnore** file in your VS Code working folder, the tool adds four items (**usqlCodeBehindReference**, **usqlCodeBehindGenerated**, **.cache**, **obj**) in your **.gitIgnore** file if the four items weren't included in the file.

 ![Items in the .gitIgnore file](./media/data-lake-analytics-data-lake-tools-for-vscode/data-lake-tools-for-gitignore.png)
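
For reference, the four default entries the command writes to the **.gitIgnore** file would look like the following (a sketch based only on the item names listed above):

```
usqlCodeBehindReference
usqlCodeBehindGenerated
.cache
obj
```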

@@ -214,7 +216,7 @@ U-SQL local run tests your local data and validates your script locally before y
 - Step through the code.
 - Validate your script locally.

-The local run and local debug feature only works in Windows environments, and is not supported on macOS and Linux-based operating systems.
+The local run and local debug feature only works in Windows environments, and isn't supported on macOS and Linux-based operating systems.

 For instructions on local run and local debug, see [U-SQL local run and local debug with Visual Studio Code](data-lake-tools-for-vscode-local-run-and-debug.md).

@@ -228,7 +230,7 @@ Before you can compile and run U-SQL scripts in Data Lake Analytics, you must co

 1. Select Ctrl+Shift+P to open the command palette.

-2. Enter **ADL: Login**. The login information appears on the lower right.
+2. Enter **ADL: Login**. The sign in information appears on the lower right.

 ![Entering the login command](./media/data-lake-analytics-data-lake-tools-for-vscode/data-lake-tools-for-vscode-extension-login.png)

@@ -268,7 +270,7 @@ You can create an extraction script for .csv, .tsv, and .txt files by using the

 ![Process for creating an extraction script](./media/data-lake-analytics-data-lake-tools-for-vscode/create-extract-script-process.png)

-The extraction script is generated based on your entries. For a script that cannot detect the columns, choose one from the two options. If not, only one script will be generated.
+The extraction script is generated based on your entries. For a script that can't detect the columns, choose one from the two options. If not, only one script will be generated.

 ![Result of creating an extraction script](./media/data-lake-analytics-data-lake-tools-for-vscode/create-extract-script-result.png)
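
To give a feel for what a generated extraction script contains, here is a hand-written sketch. The column names, types, and both file paths are hypothetical and are not taken from this commit:

```usql
// Sketch of an extraction script over a .csv file; the schema and paths are hypothetical.
@input =
    EXTRACT id int,
            name string,
            amount double
    FROM "/input/sample.csv"
    USING Extractors.Csv(skipFirstNRows: 1); // skip one header row

OUTPUT @input
    TO "/output/sample-echo.tsv"
    USING Outputters.Tsv();
```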

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-get-started.md

Lines changed: 2 additions & 0 deletions
@@ -11,6 +11,8 @@ ms.date: 08/30/2019
 [!INCLUDE [get-started-selector](../../includes/data-lake-analytics-selector-get-started.md)]

+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 Azure Data Lake and Stream Analytics Tools include functionality related to two Azure services, Azure Data Lake Analytics and Azure Stream Analytics. For more information about the Azure Stream Analytics scenarios, see [Azure Stream Analytics tools for Visual Studio](../stream-analytics/stream-analytics-tools-for-visual-studio-install.md).

 This article describes how to use Visual Studio to create Azure Data Lake Analytics accounts. You can define jobs in [U-SQL](data-lake-analytics-u-sql-get-started.md), and submit jobs to the Data Lake Analytics service. For more information about Data Lake Analytics, see [Azure Data Lake Analytics overview](data-lake-analytics-overview.md).

articles/data-lake-analytics/data-lake-analytics-diagnostic-logs.md

Lines changed: 16 additions & 13 deletions
@@ -5,7 +5,7 @@ ms.service: data-lake-analytics


 ms.topic: how-to
-ms.date: 02/12/2018
+ms.date: 10/14/2022
 ---
 # Accessing diagnostic logs for Azure Data Lake Analytics

@@ -19,33 +19,36 @@ Diagnostic logging allows you to collect data access audit trails. These logs pr

 1. Sign on to the [Azure portal](https://portal.azure.com).

-2. Open your Data Lake Analytics account and select **Diagnostic logs** from the __Monitor__ section. Next, select __Turn on diagnostics__.
+2. Open your Data Lake Analytics account and select **Diagnostic settings** from the **Monitoring** section. Next, select **+ Add diagnostic setting**.

-   ![Screenshot that shows the "Diagnostic logs" action selected and "Turn on diagnostics to collect the following logs" highlighted.](./media/data-lake-analytics-diagnostic-logs/turn-on-logging.png)
+   ![Screenshot that shows the "Diagnostic settings" action selected and "+ Add diagnostic setting" highlighted.](./media/data-lake-analytics-diagnostic-logs/turn-on-logging.png)

-3. From __Diagnostics settings__, enter a __Name__ for this logging configuration and then select logging options.
+3. From **Diagnostics setting**, enter a name for this logging configuration and then select logging options.

-   ![Turn on diagnostics to collect audit and request logs](./media/data-lake-analytics-diagnostic-logs/enable-diagnostic-logs.png "Enable diagnostic logs")
+   ![Screenshot showing settings to turn on diagnostics to collect audit and request logs](./media/data-lake-analytics-diagnostic-logs/enable-diagnostic-logs.png "Enable diagnostic logs")

-* You can choose to store/process the data in three different ways.
+* You can choose to store/process the data in four different ways.

-* Select __Archive to a storage account__ to store logs in an Azure storage account. Use this option if you want to archive the data. If you select this option, you must provide an Azure storage account to save the logs to.
+* Select **Archive to a storage account** to store logs in an Azure storage account. Use this option if you want to archive the data. If you select this option, you must provide an Azure storage account to save the logs to.

-* Select **Stream to an Event Hub** to stream log data to an Azure Event Hub. Use this option if you have a downstream processing pipeline that is analyzing incoming logs in real time. If you select this option, you must provide the details for the Azure Event Hub you want to use.
+* Select **Stream to an event hub** to stream log data to an Azure Event Hub. Use this option if you have a downstream processing pipeline that is analyzing incoming logs in real time. If you select this option, you must provide the details for the Azure Event Hub you want to use.
+
+* Select **Send to Log Analytics workspace** to send the data to the Azure Monitor service. Use this option if you want to use Azure Monitor logs to gather and analyze logs.
+
+* Select **Send to partner solution** if you want to use a partner integration. For more information, see the [partner solutions overview](../partner-solutions/overview.md).

-* Select __Send to Log Analytics__ to send the data to the Azure Monitor service. Use this option if you want to use Azure Monitor logs to gather and analyze logs.
 * Specify whether you want to get audit logs or request logs or both. A request log captures every API request. An audit log records all operations that are triggered by that API request.

-* For __Archive to a storage account__, specify the number of days to retain the data.
+* For **Archive to a storage account**, specify the number of days to retain the data.

-* Click __Save__.
+* Select **Save**.

 > [!NOTE]
-> You must select either __Archive to a storage account__, __Stream to an Event Hub__ or __Send to Log Analytics__ before clicking the __Save__ button.
+> You must select either **Archive to a storage account**, **Stream to an Event Hub**, **Send to Log Analytics workspace**, or **Send to partner solution** before selecting the **Save** button.

 ### Use the Azure Storage account that contains log data

-1. To display the blob containers that hold logging data, open the Azure Storage account used for Data Lake Analytics for logging, and then click __Blobs__.
+1. To display the blob containers that hold logging data, open the Azure Storage account used for Data Lake Analytics for logging, and then select **Containers**.

 * The container **insights-logs-audit** contains the audit logs.
 * The container **insights-logs-requests** contains the request logs.
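
Once logs land in those containers, they can be processed like any other data in the lake. Below is a minimal U-SQL sketch for pulling raw request-log lines out of **insights-logs-requests**. The storage account name and blob path pattern are placeholders, since the actual layout depends on your resource; this example is not part of the commit above:

```usql
// Read raw JSON lines from the request-log container.
// <storage_account> and the path pattern are placeholders.
@requestLines =
    EXTRACT line string
    FROM "wasb://insights-logs-requests@<storage_account>.blob.core.windows.net/{*}.json"
    USING Extractors.Text(delimiter: '\b', quoting: false); // '\b' never appears, so each line is read whole

OUTPUT @requestLines
    TO "/output/request-log-lines.tsv"
    USING Outputters.Tsv();
```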

articles/data-lake-analytics/data-lake-analytics-get-started-cli.md

Lines changed: 2 additions & 0 deletions
@@ -11,6 +11,8 @@ ms.custom: devx-track-azurecli
 [!INCLUDE [get-started-selector](../../includes/data-lake-analytics-selector-get-started.md)]

+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 This article describes how to use the Azure CLI command-line interface to create Azure Data Lake Analytics accounts, submit U-SQL jobs, and manage catalogs. The job reads a tab separated values (TSV) file and converts it into a comma-separated values (CSV) file.

 ## Prerequisites
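
The TSV-to-CSV job that paragraph describes reduces to one EXTRACT and one OUTPUT. Here is a sketch with a hypothetical three-column schema and file paths that are not taken from the article:

```usql
// Read a tab-separated file and rewrite it as comma-separated.
// The schema and both paths are hypothetical.
@rows =
    EXTRACT id int,
            name string,
            amount double
    FROM "/input/data.tsv"
    USING Extractors.Tsv();

OUTPUT @rows
    TO "/output/data.csv"
    USING Outputters.Csv();
```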

articles/data-lake-analytics/data-lake-analytics-get-started-portal.md

Lines changed: 12 additions & 10 deletions
@@ -2,39 +2,41 @@
 title: Create & query Azure Data Lake Analytics - Azure portal
 description: Use the Azure portal to create an Azure Data Lake Analytics account and submit a U-SQL job.
 ms.service: data-lake-analytics
-ms.reviewer: jasonh
+ms.reviewer: whhender
 ms.topic: conceptual
-ms.date: 03/21/2017
+ms.date: 10/14/2022
 ---

 # Get started with Azure Data Lake Analytics using the Azure portal
 [!INCLUDE [get-started-selector](../../includes/data-lake-analytics-selector-get-started.md)]

+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 This article describes how to use the Azure portal to create Azure Data Lake Analytics accounts, define jobs in [U-SQL](data-lake-analytics-u-sql-get-started.md), and submit jobs to the Data Lake Analytics service.

 ## Prerequisites

-Before you begin this tutorial, you must have an **Azure subscription**. See [Get Azure free trial](https://azure.microsoft.com/pricing/free-trial/).
+Before you begin this tutorial, you must have an **Azure subscription**. If you don't, you can follow this link to [get an Azure free trial](https://azure.microsoft.com/pricing/free-trial/).

 ## Create a Data Lake Analytics account

-Now, you will create a Data Lake Analytics and an Azure Data Lake Storage Gen1 account at the same time. This step is simple and only takes about 60 seconds to finish.
+Now, you'll create a Data Lake Analytics and an Azure Data Lake Storage Gen1 account at the same time. This step is simple and only takes about 60 seconds to finish.

 1. Sign on to the [Azure portal](https://portal.azure.com).
-2. Click **Create a resource** > **Data + Analytics** > **Data Lake Analytics**.
-3. Select values for the following items:
+1. Select **Create a resource**, and in the search at the top of the page enter **Data Lake Analytics**.
+1. Select values for the following items:
    * **Name**: Name your Data Lake Analytics account (Only lower case letters and numbers allowed).
    * **Subscription**: Choose the Azure subscription used for the Analytics account.
    * **Resource Group**. Select an existing Azure Resource Group or create a new one.
    * **Location**. Select an Azure data center for the Data Lake Analytics account.
-   * **Data Lake Storage Gen1**: Follow the instruction to create a new Data Lake Storage Gen1 account, or select an existing one.
-4. Optionally, select a pricing tier for your Data Lake Analytics account.
-5. Click **Create**.
+   * **Data Lake Storage Gen1**: Follow the instruction to create a new Data Lake Storage Gen1 account, or select an existing one.
+1. Optionally, select a pricing tier for your Data Lake Analytics account.
+1. Select **Create**.


 ## Your first U-SQL script

-The following text is a very simple U-SQL script. All it does is define a small dataset within the script and then write that dataset out to the default Data Lake Storage Gen1 account as a file called `/data.csv`.
+The following text is a simple U-SQL script. All it does is define a small dataset within the script and then write that dataset out to the default Data Lake Storage Gen1 account as a file called `/data.csv`.

 ```usql
 @a =
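
The diff view is truncated inside that code fence. For readers following along, a complete script of the kind the paragraph describes, defining a small rowset and writing it to `/data.csv`, would look something like the following; the rowset contents are illustrative and were not recovered from the commit:

```usql
// Define a tiny inline rowset and write it to the default store as /data.csv.
@a =
    SELECT * FROM
        (VALUES
            ("Contoso",   1500.0),
            ("Woodgrove", 2700.0)
        ) AS D(customer, amount);

OUTPUT @a
    TO "/data.csv"
    USING Outputters.Csv();
```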

articles/data-lake-analytics/data-lake-analytics-get-started-powershell.md

Lines changed: 2 additions & 0 deletions
@@ -11,6 +11,8 @@ ms.custom: devx-track-azurepowershell
 [!INCLUDE [get-started-selector](../../includes/data-lake-analytics-selector-get-started.md)]

+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 Learn how to use Azure PowerShell to create Azure Data Lake Analytics accounts and then submit and run U-SQL jobs. For more information about Data Lake Analytics, see [Azure Data Lake Analytics overview](data-lake-analytics-overview.md).

 ## Prerequisites
articles/data-lake-analytics/data-lake-analytics-overview.md

Lines changed: 12 additions & 13 deletions
@@ -1,28 +1,28 @@
 ---
 title: Overview of Azure Data Lake Analytics
-description: Data Lake Analytics lets you drive you business using insights gained in your cloud data at any scale.
+description: Data Lake Analytics lets you drive your business using insights gained in your cloud data at any scale.
 author: saveenr
 ms.author: saveenr

-ms.reviewer: jasonwhowell
+ms.reviewer: whhender
 ms.service: data-lake-analytics
 ms.topic: overview
-ms.date: 06/23/2017
+ms.date: 10/17/2022
 ---
 # What is Azure Data Lake Analytics?

-Azure Data Lake Analytics is an on-demand analytics job service that simplifies big data. Instead of deploying, configuring, and tuning hardware, you write queries to transform your data and extract valuable insights. The analytics service can handle jobs of any scale instantly by setting the dial for how much power you need. You only pay for your job when it is running, making it cost-effective.
+Azure Data Lake Analytics is an on-demand analytics job service that simplifies big data. Instead of deploying, configuring, and tuning hardware, you write queries to transform your data and extract valuable insights. The analytics service can handle jobs of any scale instantly by setting the dial for how much power you need. You only pay for your job when it's running, making it cost-effective.

 ## Azure Data Lake analytics recent update information

-Azure Data Lake analytics service is updated on an aperiodic basis for certain purpose. We continue to provide the support for this service with component update, component beta preview and so on.
+The Azure Data Lake Analytics service is updated on a periodic basis. We continue to provide support for this service with component updates, component beta previews, and so on.

 - For recent update general information, refer to [What's new in Data Lake Analytics?](data-lake-analytics-whats-new.md).
 - For each update details, refer to [Azure Data Lake analytics release note](https://github.com/Azure/AzureDataLake/tree/master/docs/Release_Notes).

 ## Dynamic scaling

-Data Lake Analytics dynamically provisions resources and lets you do analytics on terabytes to petabytes of data. You pay only for the processing power used. As you increase or decrease the size of data stored or the amount of compute resources used, you don’t have to rewrite code.
+Data Lake Analytics dynamically provisions resources and lets you do analytics on terabytes to petabytes of data. You pay only for the processing power used. As you increase or decrease the size of data stored or the amount of compute resources used, you don’t have to rewrite code.

 ## Develop faster, debug, and optimize smarter using familiar tools

@@ -42,19 +42,18 @@ Data Lake Analytics is a cost-effective solution for running big data workloads.

 ## Works with all your Azure data

-Data Lake Analytics works with **Azure Data Lake Storage Gen1** for the highest performance, throughput, and parallelization and works with Azure Storage blobs, Azure SQL Database, Azure Synapse Analytics.
+Data Lake Analytics works with **Azure Data Lake Storage Gen1** for the highest performance, throughput, and parallelization, and works with Azure Storage blobs, Azure SQL Database, and Azure Synapse Analytics.

 > [!NOTE]
 > Data Lake Analytics doesn't work with Azure Data Lake Storage Gen2 yet until further notice.

 ## In-region data residency

-Data Lake Analytics does not move or store customer data out of the region in which it is deployed.
-
+Data Lake Analytics doesn't move or store customer data out of the region in which it's deployed.

 ## Next steps

-* See the Azure Data Lake Analytics recent update using [What's new in Azure Data Lake Analytics?](data-lake-analytics-whats-new.md)
-* Get Started with Data Lake Analytics using [Azure portal](data-lake-analytics-get-started-portal.md) | [Azure PowerShell](data-lake-analytics-get-started-powershell.md) | [CLI](data-lake-analytics-get-started-cli.md)
-* Manage Azure Data Lake Analytics using [Azure portal](data-lake-analytics-manage-use-portal.md) | [Azure PowerShell](data-lake-analytics-manage-use-powershell.md) | [CLI](data-lake-analytics-manage-use-cli.md) | [Azure .NET SDK](data-lake-analytics-manage-use-dotnet-sdk.md) | [Node.js](data-lake-analytics-manage-use-nodejs.md)
-* [How to control costs and save money with Data Lake Analytics](https://1drv.ms/f/s!AvdZLquGMt47h213Hg3rhl-Tym1c)
+- See the Azure Data Lake Analytics recent update using [What's new in Azure Data Lake Analytics?](data-lake-analytics-whats-new.md)
+- Get Started with Data Lake Analytics using [Azure portal](data-lake-analytics-get-started-portal.md) | [Azure PowerShell](data-lake-analytics-get-started-powershell.md) | [CLI](data-lake-analytics-get-started-cli.md)
+- Manage Azure Data Lake Analytics using [Azure portal](data-lake-analytics-manage-use-portal.md) | [Azure PowerShell](data-lake-analytics-manage-use-powershell.md) | [CLI](data-lake-analytics-manage-use-cli.md) | [Azure .NET SDK](data-lake-analytics-manage-use-dotnet-sdk.md) | [Node.js](data-lake-analytics-manage-use-nodejs.md)
+- [How to control costs and save money with Data Lake Analytics](https://1drv.ms/f/s!AvdZLquGMt47h213Hg3rhl-Tym1c)
