
Commit 8525170

Merge pull request #223841 from whhender/retirement-flag
Retirement flag
2 parents: 4ffc4f3 + 5e70d1c

File tree

57 files changed: +125 / -24 lines

articles/data-lake-analytics/data-lake-analytics-account-policies.md

Lines changed: 8 additions & 4 deletions

@@ -8,11 +8,15 @@ ms.date: 04/30/2018
 ---
 # Manage Azure Data Lake Analytics using Account Policies
 
-Account policies help you control how resources an Azure Data Lake Analytics account are used. These policies allow you to control the cost of using Azure Data Lake Analytics. For example, with these policies you can prevent unexpected cost spikes by limiting how many AUs the account can simultaneously use.## Account-level policies
+Account policies help you control how the resources in an Azure Data Lake Analytics account are used. These policies allow you to control the cost of using Azure Data Lake Analytics. For example, with these policies you can prevent unexpected cost spikes by limiting how many AUs the account can simultaneously use.
+
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
+## Account-level policies
 
 These policies apply to all jobs in a Data Lake Analytics account.
 
-## Maximum number of AUs in a Data Lake Analytics account
+### Maximum number of AUs in a Data Lake Analytics account
 
 A policy controls the total number of Analytics Units (AUs) your Data Lake Analytics account can use. By default, the value is set to 250. For example, if this value is set to 250 AUs, you can have one job running with 250 AUs assigned to it, or 10 jobs running with 25 AUs each. Additional jobs that are submitted are queued until the running jobs are finished. When running jobs are finished, AUs are freed up for the queued jobs to run.

@@ -26,7 +30,7 @@ To change the number of AUs for your Data Lake Analytics account:
 > [!NOTE]
 > If you need more than the default (250) AUs, in the portal, click **Help+Support** to submit a support request. The number of AUs available in your Data Lake Analytics account can be increased.
 
-## Maximum number of jobs that can run simultaneously
+### Maximum number of jobs that can run simultaneously
 
 This policy limits how many jobs can run simultaneously. By default, this value is set to 20. If your Data Lake Analytics has AUs available, new jobs are scheduled to run immediately until the total number of running jobs reaches the value of this policy. When you reach the maximum number of jobs that can run simultaneously, subsequent jobs are queued in priority order until one or more running jobs complete (depending on available AUs).

@@ -40,7 +44,7 @@ To change the number of jobs that can run simultaneously:
 > [!NOTE]
 > If you need to run more than the default (20) number of jobs, in the portal, click **Help+Support** to submit a support request. The number of jobs that can run simultaneously in your Data Lake Analytics account can be increased.
 
-## How long to keep job metadata and resources
+### How long to keep job metadata and resources
 
 When your users run U-SQL jobs, the Data Lake Analytics service keeps all related files. These files include the U-SQL script, the DLL files referenced in the U-SQL script, compiled resources, and statistics. The files are in the /system/ folder of the default Azure Data Lake Storage account. This policy controls how long these resources are stored before they are automatically deleted (the default is 30 days). You can use these files for debugging, and for performance-tuning of jobs that you'll rerun in the future.
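
The admission behavior described by these two policies (the AU cap and the concurrent-job cap) can be sketched as a toy model. This is a hypothetical simulation, not the actual Data Lake Analytics scheduler; `admit_jobs` and its defaults merely mirror the documented limits (250 AUs, 20 jobs):

```python
# Toy model of the account-level policies above (hypothetical; the real
# scheduler runs service-side): a job starts only if it keeps total AUs
# <= max_aus AND running jobs < max_jobs; otherwise it queues in order.
from collections import deque

def admit_jobs(requests, max_aus=250, max_jobs=20):
    """requests: AU counts in priority order. Returns (running, queued)."""
    running, queued = [], deque()
    used = 0
    for aus in requests:
        if used + aus <= max_aus and len(running) < max_jobs:
            running.append(aus)
            used += aus
        else:
            queued.append(aus)  # waits until running jobs free AUs
    return running, list(queued)

running, queued = admit_jobs([100, 100, 100])
print(running, queued)  # the third job queues: it would exceed the 250-AU cap
```

The same function shows the job-count limit: 25 one-AU jobs fit comfortably under the AU cap, but only 20 run at once and the rest queue.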

articles/data-lake-analytics/data-lake-analytics-add-users.md

Lines changed: 2 additions & 0 deletions

@@ -9,6 +9,8 @@ ms.date: 05/24/2018
 
 # Adding a user in the Azure portal
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## Start the Add User Wizard
 1. Open your Azure Data Lake Analytics via https://portal.azure.com.
 2. Click **Add User Wizard**.

articles/data-lake-analytics/data-lake-analytics-analyze-weblogs.md

Lines changed: 2 additions & 0 deletions

@@ -9,6 +9,8 @@ ms.date: 12/05/2016
 # Analyze Website logs using Azure Data Lake Analytics
 Learn how to analyze website logs using Data Lake Analytics, especially on finding out which referrers ran into errors when they tried to visit the website.
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## Prerequisites
 * **Visual Studio 2015 or Visual Studio 2013**.
 * **[Data Lake Tools for Visual Studio](https://aka.ms/adltoolsvs)**.

articles/data-lake-analytics/data-lake-analytics-cicd-manage-assemblies.md

Lines changed: 2 additions & 0 deletions

@@ -10,6 +10,8 @@ ms.date: 10/30/2018
 
 In this article, you learn how to manage U-SQL assembly source code with the newly introduced U-SQL database project. You also learn how to set up a continuous integration and deployment (CI/CD) pipeline for assembly registration by using Azure DevOps.
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## Use the U-SQL database project to manage assembly source code
 
 [The U-SQL database project](data-lake-analytics-data-lake-tools-develop-usql-database.md) is a project type in Visual Studio that helps developers develop, manage, and deploy their U-SQL databases quickly and easily. You can manage all U-SQL database objects (except for credentials) with the U-SQL database project.

articles/data-lake-analytics/data-lake-analytics-cicd-overview.md

Lines changed: 2 additions & 0 deletions

@@ -10,6 +10,8 @@ ms.custom: devx-track-azurepowershell
 
 In this article, you learn how to set up a continuous integration and deployment (CI/CD) pipeline for U-SQL jobs and U-SQL databases.
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 [!INCLUDE [updated-for-az](../../includes/updated-for-az.md)]
 
 ## Use CI/CD for U-SQL jobs

articles/data-lake-analytics/data-lake-analytics-cicd-test.md

Lines changed: 2 additions & 0 deletions

@@ -9,6 +9,8 @@ ms.date: 08/30/2019
 
 Azure Data Lake provides the [U-SQL](data-lake-analytics-u-sql-get-started.md) language. U-SQL combines declarative SQL with imperative C# to process data at any scale. In this document, you learn how to create test cases for U-SQL and extended C# user-defined operator (UDO) code.
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## Test U-SQL scripts
 
 The U-SQL script is compiled and optimized for executable code to run in Azure or on your local computer. The compilation and optimization process treats the entire U-SQL script as a whole. You can't do a traditional unit test for every statement. However, by using the U-SQL test SDK and the local run SDK, you can do script-level tests.
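
The script-level testing idea in that paragraph (the whole script, not a single statement, is the unit under test) can be illustrated generically. This sketch is not the U-SQL test SDK API; `run_script` is a hypothetical stand-in for executing a compiled script locally against known input:

```python
# Generic sketch of a "script-level" test (hypothetical; not the U-SQL
# test SDK): because the script is compiled and optimized as a whole,
# we run it end to end on known input and assert on the final output.
def run_script(rows):
    """Stand-in for a locally executed script: filter, then aggregate."""
    return sum(r["clicks"] for r in rows if r["clicks"] > 0)

def test_script_end_to_end():
    sample = [{"clicks": 3}, {"clicks": -1}, {"clicks": 2}]
    # One assertion on the whole script's output, not per statement.
    assert run_script(sample) == 5

test_script_end_to_end()
print("ok")
```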

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-data-skew-solutions.md

Lines changed: 2 additions & 0 deletions

@@ -9,6 +9,8 @@ ms.date: 12/16/2016
 
 # Resolve data-skew problems by using Azure Data Lake Tools for Visual Studio
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## What is data skew?
 
 Briefly stated, data skew is an over-represented value. Imagine that you have assigned 50 tax examiners to audit tax returns, one examiner for each US state. The Wyoming examiner, because the population there is small, has little to do. In California, however, the examiner is kept very busy because of the state's large population.
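
The tax-examiner analogy maps directly onto partitioned execution. A minimal sketch with hypothetical per-state row counts: when work is partitioned by a skewed key, the largest partition bounds how fast the whole stage can finish.

```python
# Hypothetical illustration of data skew: rows partitioned by US state.
# One over-represented key (CA) makes its partition, and therefore the
# whole stage, far slower than the average partition.
from collections import Counter

rows = ["WY"] * 5 + ["CA"] * 1000 + ["TX"] * 95
partition_sizes = Counter(rows)          # work assigned per examiner/vertex
slowest = max(partition_sizes.values())  # stage waits for its largest partition
average = sum(partition_sizes.values()) / len(partition_sizes)
print(partition_sizes["CA"], slowest, round(average, 1))
```

Even though the average partition is modest, the stage takes as long as the CA partition; the remedies in this article all aim to shrink that largest partition.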

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-debug-recurring-job.md

Lines changed: 2 additions & 0 deletions

@@ -9,6 +9,8 @@ ms.date: 05/20/2018
 
 # Troubleshoot an abnormal recurring job
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 This article shows how to use [Azure Data Lake Tools for Visual Studio](https://aka.ms/adltoolsvs) to troubleshoot problems with recurring jobs. Learn more about pipeline and recurring jobs from the [Azure Data Lake and Azure HDInsight blog](/archive/blogs/azuredatalake/managing-pipeline-recurring-jobs-in-azure-data-lake-analytics-made-easy).
 
 Recurring jobs usually share the same query logic and similar input data. For example, imagine that you have a recurring job running every Monday morning at 8 A.M. to count last week’s weekly active user. The scripts for these jobs share one script template that contains the query logic. The inputs for these jobs are the usage data for last week. Sharing the same query logic and similar input usually means that performance of these jobs is similar and stable. If one of your recurring jobs suddenly performs abnormally, fails, or slows down a lot, you might want to:

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-develop-usql-database.md

Lines changed: 2 additions & 0 deletions

@@ -10,6 +10,8 @@ ms.date: 07/03/2018
 ---
 # Use a U-SQL database project to develop a U-SQL database for Azure Data Lake
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 U-SQL database provides structured views over unstructured data and managed structured data in tables. It also provides a general metadata catalog system for organizing your structured data and custom code. The database is the concept that groups these related objects together.
 
 Learn more about [U-SQL database and Data Definition Language (DDL)](/u-sql/data-definition-language-ddl-statements).

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-export-database.md

Lines changed: 2 additions & 0 deletions

@@ -9,6 +9,8 @@ ms.date: 11/27/2017
 
 # Export a U-SQL database
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 In this article, learn how to use [Azure Data Lake Tools for Visual Studio](https://aka.ms/adltoolsvs) to export a U-SQL database as a single U-SQL script and downloaded resources. You can import the exported database to a local account in the same process.
 
 Customers usually maintain multiple environments for development, test, and production. These environments are hosted on both a local account, on a developer's local computer, and in an Azure Data Lake Analytics account in Azure.
