Commit eff0a41

Merge pull request #218612 from whhender/ADLA-freshness-sweep2
Adla freshness sweep2
2 parents 5ac4809 + 3d29bae commit eff0a41

6 files changed: +53 −58 lines changed

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-get-started.md

Lines changed: 6 additions & 6 deletions

@@ -2,9 +2,9 @@
 title: Query Azure Data Lake Analytics - Visual Studio
 description: Learn how to install Data Lake Tools for Visual Studio, and how to develop and test U-SQL scripts.
 ms.service: data-lake-analytics
-ms.reviewer: jasonh
+ms.reviewer: whhender
 ms.topic: how-to
-ms.date: 08/30/2019
+ms.date: 11/15/2022
 ---

 # Develop U-SQL scripts by using Data Lake Tools for Visual Studio

@@ -96,7 +96,7 @@ After the job submission, the **Job view** tab opens to show the job progress.
 * **MetaData Operations** shows all the actions that were taken on the U-SQL catalog.
 * **Data** shows all the inputs and outputs.
 * **State History** shows the timeline and state details.
-* **AU Analysis** shows how many AUs were used in the job and explore simulations of different AU allocation strategies.
+* **AU Analysis** shows how many AUs (analytics units) were used in the job and lets you explore simulations of different AU allocation strategies.
 * **Diagnostics** provides an advanced analysis for job execution and performance optimization.

 ![U-SQL Visual Studio Data Lake Analytics job performance graph](./media/data-lake-analytics-data-lake-tools-get-started/data-lake-analytics-data-lake-tools-performance-graph.png)

@@ -105,7 +105,7 @@ To see the latest job status and refresh the screen, select **Refresh**.

 ## Check job status

-1. In **Server Explorer**, select **Azure** > **Data Lake Analytics**.
+1. In **Data Lake Analytics Explorer**, select **Data Lake Analytics**.

 1. Expand the Data Lake Analytics account name.

@@ -115,9 +115,9 @@ To see the latest job status and refresh the screen, select **Refresh**.

 ## See the job output

-1. In **Server Explorer**, browse to the job you submitted.
+1. In **Data Lake Analytics Explorer**, browse to the job you submitted.

-1. Click the **Data** tab.
+1. Select the **Data** tab in your job.

 1. In the **Job Outputs** tab, select the `"/data.csv"` file.
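The get-started article this diff touches walks through developing and testing a U-SQL script whose output lands in `/data.csv`. For reference, a minimal sketch of such a script; the input path, schema, and sample-data location are illustrative assumptions, not taken from this commit:

```usql
// Minimal U-SQL sketch: extract a TSV, aggregate, and write /data.csv.
// The input path and schema below are illustrative assumptions.
@searchlog =
    EXTRACT UserId int,
            Start  DateTime,
            Region string,
            Query  string
    FROM "/Samples/Data/SearchLog.tsv"
    USING Extractors.Tsv();

@result =
    SELECT Region,
           COUNT(*) AS QueryCount
    FROM @searchlog
    GROUP BY Region;

// This is the kind of output the "See the job output" steps browse to.
OUTPUT @result
    TO "/data.csv"
    USING Outputters.Csv();
```

Submitting a script like this from Visual Studio is what opens the **Job view** tab with the **Data** and **AU Analysis** panes described above.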

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-install.md

Lines changed: 3 additions & 3 deletions

@@ -2,9 +2,9 @@
 title: Install Azure Data Lake Tools for Visual Studio
 description: This article describes how to install Azure Data Lake Tools for Visual Studio.
 ms.service: data-lake-analytics
-ms.reviewer: jasonh
+ms.reviewer: whhender
 ms.topic: how-to
-ms.date: 08/30/2019
+ms.date: 11/15/2022
 ---
 # Install Data Lake Tools for Visual Studio

@@ -20,7 +20,7 @@ information about Data Lake Analytics, see [Azure Data Lake Analytics overview](
 * Visual Studio 2015
 * Visual Studio 2013

-* **Microsoft Azure SDK for .NET** version 2.7.1 or later. Install it by using the [Web platform installer](https://www.microsoft.com/web/downloads/platform.aspx).
+* **Microsoft Azure SDK for .NET** [version 2.7.1 or later](https://azure.microsoft.com/downloads/).
 * A **Data Lake Analytics** account. To create an account, see [Get Started with Azure Data Lake Analytics using Azure portal](data-lake-analytics-get-started-portal.md).

 ## Install Azure Data Lake Tools for Visual Studio 2017 or Visual Studio 2019

articles/data-lake-analytics/data-lake-analytics-diagnostic-logs.md

Lines changed: 1 addition & 3 deletions

@@ -2,10 +2,8 @@
 title: Enable and view diagnostic logs for Azure Data Lake Analytics
 description: Understand how to set up and access diagnostic logs for Azure Data Lake Analytics
 ms.service: data-lake-analytics
-
-
 ms.topic: how-to
-ms.date: 10/14/2022
+ms.date: 11/15/2022
 ---
 # Accessing diagnostic logs for Azure Data Lake Analytics

articles/data-lake-analytics/data-lake-analytics-manage-use-portal.md

Lines changed: 35 additions & 36 deletions

@@ -2,45 +2,41 @@
 title: Manage Azure Data Lake Analytics by using the Azure portal
 description: This article describes how to use the Azure portal to manage Data Lake Analytics accounts, data sources, users, & jobs.
 ms.service: data-lake-analytics
-ms.reviewer: jasonh
+ms.reviewer: whhender
 ms.topic: how-to
-ms.date: 12/05/2016
+ms.date: 11/15/2022
 ms.custom: subject-rbac-steps
 ---
 # Manage Azure Data Lake Analytics using the Azure portal
 [!INCLUDE [manage-selector](../../includes/data-lake-analytics-selector-manage.md)]

-This article describes how to manage Azure Data Lake Analytics accounts, data sources, users, and jobs by using the Azure portal.
-
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]

-<!-- ################################ -->
-<!-- ################################ -->
+This article describes how to manage Azure Data Lake Analytics accounts, data sources, users, and jobs by using the Azure portal.

 ## Manage Data Lake Analytics accounts

 ### Create an account

 1. Sign in to the [Azure portal](https://portal.azure.com).
-2. Click **Create a resource** > **Intelligence + analytics** > **Data Lake Analytics**.
+2. Select **Create a resource** and search for **Data Lake Analytics**.
 3. Select values for the following items:
    1. **Name**: The name of the Data Lake Analytics account.
    2. **Subscription**: The Azure subscription used for the account.
    3. **Resource Group**: The Azure resource group in which to create the account.
    4. **Location**: The Azure datacenter for the Data Lake Analytics account.
    5. **Data Lake Store**: The default store to be used for the Data Lake Analytics account. The Azure Data Lake Store account and the Data Lake Analytics account must be in the same location.
-4. Click **Create**.
+4. Select **Create**.

 ### Delete a Data Lake Analytics account

 Before you delete a Data Lake Analytics account, delete its default Data Lake Store account.

 1. In the Azure portal, go to your Data Lake Analytics account.
-2. Click **Delete**.
+2. Select **Delete**.
 3. Type the account name.
-4. Click **Delete**.
+4. Select **Delete**.

-<!-- ################################ -->
-<!-- ################################ -->

 ## Manage data sources

@@ -54,11 +50,11 @@ You can use Data Explorer to browse data sources and perform basic file manageme
 ### Add a data source

 1. In the Azure portal, go to your Data Lake Analytics account.
-2. Click **Data Sources**.
-3. Click **Add Data Source**.
+2. Select **Data explorer**.
+3. Select **Add Data Source**.

 * To add a Data Lake Store account, you need the account name and access to the account to be able to query it.
-* To add Azure Blob storage, you need the storage account and the account key. To find them, go to the storage account in the portal.
+* To add Azure Blob storage, you need the storage account and the account key. To find them, go to the storage account in the portal and select **Access keys**.

 ## Set up firewall rules

@@ -69,20 +65,20 @@ If other Azure services, like Azure Data Factory or VMs, connect to the Data Lak
 ### Set up a firewall rule

 1. In the Azure portal, go to your Data Lake Analytics account.
-2. On the menu on the left, click **Firewall**.
+2. On the menu on the left, select **Firewall**.

 ## Add a new user

 You can use the **Add User Wizard** to easily provision new Data Lake users.

 1. In the Azure portal, go to your Data Lake Analytics account.
-2. On the left, under **Getting Started**, click **Add User Wizard**.
-3. Select a user, and then click **Select**.
-4. Select a role, and then click **Select**. To set up a new developer to use Azure Data Lake, select the **Data Lake Analytics Developer** role.
-5. Select the access control lists (ACLs) for the U-SQL databases. When you're satisfied with your choices, click **Select**.
-6. Select the ACLs for files. For the default store, don't change the ACLs for the root folder "/" and for the /system folder. Click **Select**.
-7. Review all your selected changes, and then click **Run**.
-8. When the wizard is finished, click **Done**.
+2. On the left, under **Getting Started**, select **Add User Wizard**.
+3. Select a user, and then select **Select**.
+4. Select a role, and then select **Select**. To set up a new developer to use Azure Data Lake, select the **Data Lake Analytics Developer** role.
+5. Select the access control lists (ACLs) for the U-SQL databases. When you're satisfied with your choices, select **Select**.
+6. Select the ACLs for files. For the default store, don't change the ACLs for the root folder "/" and for the /system folder. Select **Select**.
+7. Review all your selected changes, and then select **Run**.
+8. When the wizard is finished, select **Done**.

 ## Manage Azure role-based access control

@@ -94,6 +90,7 @@ The standard Azure roles have the following capabilities:
 * **Reader**: Can monitor jobs.

 Use the Data Lake Analytics Developer role to enable U-SQL developers to use the Data Lake Analytics service. You can use the Data Lake Analytics Developer role to:
+
 * Submit jobs.
 * Monitor job status and the progress of jobs submitted by any user.
 * See the U-SQL scripts from jobs submitted by any user.

@@ -115,46 +112,48 @@ Use the Data Lake Analytics Developer role to enable U-SQL developers to use the
 >If a user or a security group needs to submit jobs, they also need permission on the store account. For more information, see [Secure data stored in Data Lake Store](../data-lake-store/data-lake-store-secure-data.md).
 >

-<!-- ################################ -->
-<!-- ################################ -->
-
 ## Manage jobs

 ### Submit a job

 1. In the Azure portal, go to your Data Lake Analytics account.

-2. Click **New Job**. For each job, configure:
+2. Select **New Job**. For each job, configure:

    1. **Job Name**: The name of the job.
-   2. **Priority**: Lower numbers have higher priority. If two jobs are queued, the one with lower priority value runs first.
-   3. **Parallelism**: The maximum number of compute processes to reserve for this job.
+   2. **Priority**: This is under **More options**. Lower numbers have higher priority. If two jobs are queued, the one with the lower priority value runs first.
+   3. **AUs**: The maximum number of analytics units (AUs), or compute processes, to reserve for this job.
+   4. **Runtime**: Also under **More options**. Select the default runtime unless you've received a custom runtime.
+
+3. Add your script.

-3. Click **Submit Job**.
+4. Select **Submit Job**.

 ### Monitor jobs

 1. In the Azure portal, go to your Data Lake Analytics account.
-2. Click **View All Jobs**. A list of all the active and recently finished jobs in the account is shown.
-3. Optionally, click **Filter** to help you find the jobs by **Time Range**, **Job Name**, and **Author** values.
+2. Select **View All Jobs** at the top of the page. A list of all the active and recently finished jobs in the account is shown.
+3. Optionally, select **Filter** to help you find the jobs by **Time Range**, **Status**, **Job Name**, **Job ID**, **Pipeline name** or **Pipeline ID**, **Recurrence name** or **Recurrence ID**, and **Author** values.

 ### Monitoring pipeline jobs
+
 Jobs that are part of a pipeline work together, usually sequentially, to accomplish a specific scenario. For example, you can have a pipeline that cleans, extracts, transforms, and aggregates usage for customer insights. Pipeline jobs are identified using the "Pipeline" property when the job was submitted. Jobs scheduled using ADF V2 will automatically have this property populated.

 To view a list of U-SQL jobs that are part of pipelines:

 1. In the Azure portal, go to your Data Lake Analytics accounts.
-2. Click **Job Insights**. The "All Jobs" tab will be defaulted, showing a list of running, queued, and ended jobs.
-3. Click the **Pipeline Jobs** tab. A list of pipeline jobs will be shown along with aggregated statistics for each pipeline.
+2. Select **Job Insights**. The "All Jobs" tab is shown by default, with a list of running, queued, and ended jobs.
+3. Select the **Pipeline Jobs** tab. A list of pipeline jobs is shown, along with aggregated statistics for each pipeline.

 ### Monitoring recurring jobs
+
 A recurring job is one that has the same business logic but uses different input data every time it runs. Ideally, recurring jobs should always succeed, and have relatively stable execution time; monitoring these behaviors will help ensure the job is healthy. Recurring jobs are identified using the "Recurrence" property. Jobs scheduled using ADF V2 will automatically have this property populated.

 To view a list of U-SQL jobs that are recurring:

 1. In the Azure portal, go to your Data Lake Analytics accounts.
-2. Click **Job Insights**. The "All Jobs" tab will be defaulted, showing a list of running, queued, and ended jobs.
-3. Click the **Recurring Jobs** tab. A list of recurring jobs will be shown along with aggregated statistics for each recurring job.
+2. Select **Job Insights**. The "All Jobs" tab is shown by default, with a list of running, queued, and ended jobs.
+3. Select the **Recurring Jobs** tab. A list of recurring jobs is shown, along with aggregated statistics for each recurring job.

 ## Next steps
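The updated **Submit a job** flow in the diff above adds an explicit "Add your script" step. A tiny self-contained U-SQL script you could paste in to verify the submission flow end to end; the row values and output path are illustrative assumptions:

```usql
// Smallest useful U-SQL job: emit a constant row set and write it out.
// No input data is required, so it's handy for testing job submission.
@rows =
    SELECT *
    FROM (VALUES
            ("Contoso",  1500.0),
            ("Fabrikam", 2700.0)
         ) AS T(Customer, Amount);

// The output path is an illustrative assumption.
OUTPUT @rows
    TO "/output/submit-test.csv"
    USING Outputters.Csv();
```

Because it reads no input, a job like this exercises only the submission, queuing, and output-writing path, which makes it a cheap way to check the **Priority**, **AUs**, and **Runtime** settings described above.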

articles/data-lake-analytics/data-lake-analytics-whats-new.md

Lines changed: 5 additions & 7 deletions

@@ -5,11 +5,13 @@ author: xujiang1
 ms.service: data-lake-analytics
 ms.topic: overview
 ms.author: xujiang1
-ms.date: 07/31/2020
+ms.date: 11/16/2022
 ---

 # What's new in Data Lake Analytics?

+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 Azure Data Lake Analytics is updated on an aperiodic basis for certain components. To stay updated with the most recent update, this article provides you with information about:

 - The notification of key component beta preview

@@ -18,7 +20,7 @@ Azure Data Lake Analytics is updated on an aperiodic basis for certain component

 ## Notification of key component beta preview

-No key component beta version available for preview.
+No key component beta version available for preview.

 ## U-SQL runtime

@@ -32,17 +34,13 @@ The runtime version will be updated aperiodically. And the previous runtime will
 > - Choosing a runtime that is different from the default has the potential to break your U-SQL jobs. It is highly recommended not to use these non-default versions for production, but for testing only.
 > - The non-default runtime version has a fixed lifecycle. It will be automatically expired.

-The following version is the current default runtime version.
-
-- **release_20200707_scope_2b8d563_usql**
-
 To get understanding how to troubleshoot U-SQL runtime failures, refer to [Troubleshoot U-SQL runtime failures](runtime-troubleshoot.md).

 ## .NET Framework

 Azure Data Lake Analytics now is using the **.NET Framework v4.7.2**.

-If your Azure Data Lake Analytics U-SQL script code uses custom assemblies, and those custom assemblies use .NET libraries, validate your code to check if there is any breakings.
+If your Azure Data Lake Analytics U-SQL script code uses custom assemblies, and those custom assemblies use .NET libraries, validate your code to check if there are any errors.

 To get understanding how to troubleshoot a .NET upgrade using [Troubleshoot a .NET upgrade](runtime-troubleshoot.md).

articles/data-lake-analytics/migrate-azure-data-lake-analytics-to-synapse.md

Lines changed: 3 additions & 3 deletions

@@ -6,12 +6,12 @@ ms.author: lingluo
 ms.service: data-lake-analytics
 ms.topic: how-to
 ms.custom: migrate-azure-data-lake-analytics-to-synapse
-ms.date: 08/25/2021
+ms.date: 11/15/2022
 ---

 # Migrate Azure Data Lake Analytics to Azure Synapse Analytics

-Microsoft launched the Azure Synapse Analytics which aims at bringing both data lakes and data warehouse together for a unique big data analytics experience. It will help customers gather and analyze all the varying data, to solve data inefficiency, and work together. Moreover, Synapse’s integration with Azure Machine Learning and Power BI will allow the improved ability for organizations to get insights from its data as well as execute machine learning to all its smart apps.
+Microsoft launched Azure Synapse Analytics, which brings data lakes and data warehouses together for a unified big data analytics experience. It helps customers gather and analyze varied data, address data inefficiency, and collaborate. Moreover, Synapse's integration with Azure Machine Learning and Power BI improves organizations' ability to get insights from their data and apply machine learning across their smart apps.

 The document shows you how to do the migration from Azure Data Lake Analytics to Azure Synapse Analytics.

@@ -38,7 +38,7 @@ The document shows you how to do the migration from Azure Data Lake Analytics to

 1. Identify jobs and data that you'll migrate.
    - Take this opportunity to clean up those jobs that you no longer use. Unless you plan to migrate all your jobs at one time, take this time to identify logical groups of jobs that you can migrate in phases.
-   - Evaluate the size of the data and understand Apache Spark data format. Review your U-SQL scripts and evaluate the scripts re-writing efforts and understand the Apache Spark code concept.
+   - Evaluate the size of the data and understand the Apache Spark data formats. Review your U-SQL scripts, evaluate the script-rewriting effort, and understand the Apache Spark code concepts.

 2. Determine the impact that a migration will have on your business. For example, whether you can afford any downtime while migration takes place.
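When sizing the U-SQL rewriting effort called out in the migration steps above, it helps to map each U-SQL construct to its rough Spark counterpart. A hedged sketch; the paths and schema are illustrative, and the Spark equivalents in the comments are conceptual pointers, not generated code:

```usql
// EXTRACT ... USING Extractors.Tsv()  ->  roughly spark.read with a schema on TSV
@events =
    EXTRACT EventId int,
            Region  string
    FROM "/input/events.tsv"
    USING Extractors.Tsv();

// SELECT ... GROUP BY  ->  roughly DataFrame groupBy(...) with an aggregate
@summary =
    SELECT Region,
           COUNT(*) AS EventCount
    FROM @events
    GROUP BY Region;

// OUTPUT ... USING Outputters.Csv()  ->  roughly DataFrame write in CSV format
OUTPUT @summary
    TO "/output/summary.csv"
    USING Outputters.Csv();
```

Scripts dominated by such declarative extract-transform-output stages tend to translate mechanically; scripts with custom .NET assemblies or user-defined operators are where most of the rewriting effort concentrates.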
