Commit fafcb33: Adding retirement flag to all pages

1 parent 5cd3f21 commit fafcb33

65 files changed (+138, -17 lines)

articles/data-lake-analytics/data-lake-analytics-account-policies.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -12,6 +12,8 @@ Account policies help you control how resources an Azure Data Lake Analytics acc
 
 These policies apply to all jobs in a Data Lake Analytics account.
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## Maximum number of AUs in a Data Lake Analytics account
 
 A policy controls the total number of Analytics Units (AUs) your Data Lake Analytics account can use. By default, the value is set to 250. For example, if this value is set to 250 AUs, you can have one job running with 250 AUs assigned to it, or 10 jobs running with 25 AUs each. Additional jobs that are submitted are queued until the running jobs are finished. When running jobs are finished, AUs are freed up for the queued jobs to run.
```
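The AU cap described in that paragraph amounts to a simple admission check: jobs run while their combined AUs fit under the account limit, and later submissions wait. A minimal sketch of that behavior (illustrative Python, not part of this commit; the function and names are hypothetical):

```python
from collections import deque

# Hypothetical sketch of the per-account AU policy described above:
# running jobs' AUs must fit within the account limit; jobs that
# don't fit are queued until running jobs finish and free AUs.
ACCOUNT_AU_LIMIT = 250  # the article's default policy value

def admit_jobs(requests):
    """requests: iterable of (job_name, aus); returns (running, queued)."""
    running, queued, used = [], deque(), 0
    for name, aus in requests:
        if used + aus <= ACCOUNT_AU_LIMIT:
            running.append((name, aus))
            used += aus
        else:
            queued.append((name, aus))  # waits until AUs are freed
    return running, list(queued)
```

With the article's numbers, ten 25-AU jobs fill the 250-AU default exactly, and an eleventh submission queues.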

articles/data-lake-analytics/data-lake-analytics-add-users.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -9,6 +9,8 @@ ms.date: 05/24/2018
 
 # Adding a user in the Azure portal
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## Start the Add User Wizard
 1. Open your Azure Data Lake Analytics via https://portal.azure.com.
 2. Click **Add User Wizard**.
```

articles/data-lake-analytics/data-lake-analytics-analyze-weblogs.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -9,6 +9,8 @@ ms.date: 12/05/2016
 # Analyze Website logs using Azure Data Lake Analytics
 Learn how to analyze website logs using Data Lake Analytics, especially on finding out which referrers ran into errors when they tried to visit the website.
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## Prerequisites
 * **Visual Studio 2015 or Visual Studio 2013**.
 * **[Data Lake Tools for Visual Studio](https://aka.ms/adltoolsvs)**.
```

articles/data-lake-analytics/data-lake-analytics-cicd-manage-assemblies.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -12,6 +12,8 @@ ms.date: 10/30/2018
 
 In this article, you learn how to manage U-SQL assembly source code with the newly introduced U-SQL database project. You also learn how to set up a continuous integration and deployment (CI/CD) pipeline for assembly registration by using Azure DevOps.
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## Use the U-SQL database project to manage assembly source code
 
 [The U-SQL database project](data-lake-analytics-data-lake-tools-develop-usql-database.md) is a project type in Visual Studio that helps developers develop, manage, and deploy their U-SQL databases quickly and easily. You can manage all U-SQL database objects (except for credentials) with the U-SQL database project.
```

articles/data-lake-analytics/data-lake-analytics-cicd-overview.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -13,6 +13,8 @@ ms.custom: devx-track-azurepowershell
 
 In this article, you learn how to set up a continuous integration and deployment (CI/CD) pipeline for U-SQL jobs and U-SQL databases.
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 [!INCLUDE [updated-for-az](../../includes/updated-for-az.md)]
 
 ## Use CI/CD for U-SQL jobs
```

articles/data-lake-analytics/data-lake-analytics-cicd-test.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -12,6 +12,8 @@ ms.date: 08/30/2019
 
 Azure Data Lake provides the [U-SQL](data-lake-analytics-u-sql-get-started.md) language. U-SQL combines declarative SQL with imperative C# to process data at any scale. In this document, you learn how to create test cases for U-SQL and extended C# user-defined operator (UDO) code.
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## Test U-SQL scripts
 
 The U-SQL script is compiled and optimized for executable code to run in Azure or on your local computer. The compilation and optimization process treats the entire U-SQL script as a whole. You can't do a traditional unit test for every statement. However, by using the U-SQL test SDK and the local run SDK, you can do script-level tests.
```
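Because the whole script is the unit under test, a script-level test reduces to "run the script against known input, then compare the produced output with an expected baseline". The comparison step can be sketched as follows (illustrative Python, not part of this commit and not the U-SQL test SDK's actual API; the helper name is hypothetical):

```python
# Hypothetical sketch of the baseline-comparison step in a
# script-level test: the script's output file is compared
# line-by-line against an expected output checked into the repo.
def outputs_match(actual_path, expected_path):
    with open(actual_path) as actual, open(expected_path) as expected:
        return actual.read().splitlines() == expected.read().splitlines()
```

The same shape works regardless of which runner produced the output file, which is why script-level tests can run against both a local account and an Azure account.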

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-data-skew-solutions.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -9,6 +9,8 @@ ms.date: 12/16/2016
 
 # Resolve data-skew problems by using Azure Data Lake Tools for Visual Studio
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 ## What is data skew?
 
 Briefly stated, data skew is an over-represented value. Imagine that you have assigned 50 tax examiners to audit tax returns, one examiner for each US state. The Wyoming examiner, because the population there is small, has little to do. In California, however, the examiner is kept very busy because of the state's large population.
```
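The examiner analogy can be made concrete with a toy dataset: when one partition key dominates, the worker handling it bounds the whole stage's runtime. A minimal sketch (illustrative Python, not part of this commit; the data is invented):

```python
from collections import Counter

# Hypothetical illustration of data skew: one key (CA) is
# over-represented, so the vertex that processes its partition
# becomes the straggler for the whole stage.
rows = ["WY"] * 3 + ["CA"] * 97      # toy dataset, 100 rows total
partition_sizes = Counter(rows)      # work per "examiner"
straggler_share = max(partition_sizes.values()) / len(rows)
# One vertex handles 97% of the rows while the other is nearly idle.
```

Even with many workers available, adding more does not help here; the fix is to break up or re-balance the dominant key, which is what the techniques in this article address.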

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-debug-recurring-job.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -9,6 +9,8 @@ ms.date: 05/20/2018
 
 # Troubleshoot an abnormal recurring job
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 This article shows how to use [Azure Data Lake Tools for Visual Studio](https://aka.ms/adltoolsvs) to troubleshoot problems with recurring jobs. Learn more about pipeline and recurring jobs from the [Azure Data Lake and Azure HDInsight blog](/archive/blogs/azuredatalake/managing-pipeline-recurring-jobs-in-azure-data-lake-analytics-made-easy).
 
 Recurring jobs usually share the same query logic and similar input data. For example, imagine that you have a recurring job running every Monday morning at 8 A.M. to count last week’s weekly active user. The scripts for these jobs share one script template that contains the query logic. The inputs for these jobs are the usage data for last week. Sharing the same query logic and similar input usually means that performance of these jobs is similar and stable. If one of your recurring jobs suddenly performs abnormally, fails, or slows down a lot, you might want to:
```

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-develop-usql-database.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -10,6 +10,8 @@ ms.date: 07/03/2018
 ---
 # Use a U-SQL database project to develop a U-SQL database for Azure Data Lake
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 U-SQL database provides structured views over unstructured data and managed structured data in tables. It also provides a general metadata catalog system for organizing your structured data and custom code. The database is the concept that groups these related objects together.
 
 Learn more about [U-SQL database and Data Definition Language (DDL)](/u-sql/data-definition-language-ddl-statements).
```

articles/data-lake-analytics/data-lake-analytics-data-lake-tools-export-database.md

Lines changed: 2 additions & 0 deletions
```diff
@@ -9,6 +9,8 @@ ms.date: 11/27/2017
 
 # Export a U-SQL database
 
+[!INCLUDE [retirement-flag](includes/retirement-flag.md)]
+
 In this article, learn how to use [Azure Data Lake Tools for Visual Studio](https://aka.ms/adltoolsvs) to export a U-SQL database as a single U-SQL script and downloaded resources. You can import the exported database to a local account in the same process.
 
 Customers usually maintain multiple environments for development, test, and production. These environments are hosted on both a local account, on a developer's local computer, and in an Azure Data Lake Analytics account in Azure.
```
