1. Open your Azure Data Lake Analytics account via https://portal.azure.com.
2. Select **Add User Wizard**.
3. In the **Select user** step, find the user you want to add, and then choose **Select**.
4. In the **Select role** step, pick **Data Lake Analytics Developer**. This role has the minimum set of permissions required to submit, monitor, and manage U-SQL jobs. Assign this role if the group isn't intended for managing Azure services. (A scripted alternative is sketched after these steps.)
5. In the **Select catalog permissions** step, select any other databases the user will need access to. Read and write access to the default static database called "master" is required to submit jobs. When you're done, select **OK**.
6. In the final **Assign selected permissions** step, review the changes the wizard will make, and then select **OK**.
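
If you'd rather script the role assignment than use the wizard, here is a minimal PowerShell sketch. The sign-in name, subscription ID, resource group, and account name below are placeholders, not values from this article:

```powershell
# Sketch: assign the built-in "Data Lake Analytics Developer" role with Azure PowerShell.
# All names and IDs below are placeholders.
Connect-AzAccount

New-AzRoleAssignment `
    -SignInName "user@contoso.com" `
    -RoleDefinitionName "Data Lake Analytics Developer" `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.DataLakeAnalytics/accounts/<adla-account>"
```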
## Configure ACLs for data folders
Grant "R-X" or "RWX", as needed, on folders containing input data and output data.
## Optionally, add the user to the Azure Data Lake Storage Gen1 **Reader** role
1. Find your Azure Data Lake Storage Gen1 account.
2. Select **Users**.
3. Select **Add**.
4. Select an Azure role to assign to this group.
5. Assign the **Reader** role. This role has the minimum set of permissions required to browse and manage data stored in Data Lake Storage Gen1. Assign this role if the group isn't intended for managing Azure services. (A quick PowerShell check of the assignment is sketched after these steps.)
6. Type in the name of the group.
7. Select **OK**.
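
To confirm the assignment afterwards, a hedged PowerShell sketch (the scope below is a placeholder for your Data Lake Storage Gen1 account's resource ID):

```powershell
# Sketch: list Reader assignments on the Data Lake Storage Gen1 account.
Get-AzRoleAssignment `
    -Scope "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.DataLakeStore/accounts/<adls-account>" |
    Where-Object { $_.RoleDefinitionName -eq "Reader" }
```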
## Adding a user using PowerShell
1. Follow the instructions in this guide: [How to install and configure Azure PowerShell](/powershell/azure/).
2. Download the [Add-AdlaJobUser.ps1](https://github.com/Azure/AzureDataLake/blob/master/Samples/PowerShell/ADLAUsers/Add-AdlaJobUser.ps1) PowerShell script.
3. Run the PowerShell script.
The sample command that gives a user access to submit jobs, view new job metadata, and view old metadata is:
`Add-AdlaJobUser.ps1 -Account myadlsaccount -EntityToAdd 546e153e-0ecf-417b-ab7f-aa01ce4a7bff -EntityType User -FullReplication`
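
If you don't have the user's object ID handy, here is a minimal sketch of looking it up before calling the script. The account name and user principal name are placeholders, and the `Id` property name can vary between Az module versions:

```powershell
# Sketch: sign in, look up the user's object ID, and pass it to the downloaded script.
Connect-AzAccount
$objectId = (Get-AzADUser -UserPrincipalName "user@contoso.com").Id   # ObjectId in older Az.Resources versions

.\Add-AdlaJobUser.ps1 -Account myadlsaccount -EntityToAdd $objectId -EntityType User -FullReplication
```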
## Next steps
* [Overview of Azure Data Lake Analytics](data-lake-analytics-overview.md)
**articles/data-lake-analytics/data-lake-analytics-analyze-weblogs.md**
---
title: Analyze Website logs using Azure Data Lake Analytics
description: Learn how to analyze website logs using Azure Data Lake Analytics to run U-SQL functions and queries.
ms.reviewer: whhender
ms.service: data-lake-analytics
ms.topic: how-to
ms.date: 01/20/2023
---
# Analyze Website logs using Azure Data Lake Analytics
Learn how to analyze website logs using Data Lake Analytics, especially to find out which referrers ran into errors when they tried to visit the website.
## Prerequisites
* **Visual Studio 2015 or Visual Studio 2013**.
* **[Data Lake Tools for Visual Studio](https://aka.ms/adltoolsvs)**.
Once Data Lake Tools for Visual Studio is installed, you'll see a **Data Lake** item in the **Tools** menu in Visual Studio:

* **Basic knowledge of Data Lake Analytics and the Data Lake Tools for Visual Studio**. To get started, see:
  * [Develop U-SQL script using Data Lake tools for Visual Studio](data-lake-analytics-data-lake-tools-get-started.md).
* **A Data Lake Analytics account.** See [Create an Azure Data Lake Analytics account](data-lake-analytics-get-started-portal.md).
* **Install the sample data.** In the Azure portal, open your Data Lake Analytics account and select **Sample Scripts** on the left menu, then select **Copy Sample Data**.
## Connect to Azure
Before you can build and test any U-SQL scripts, you must first connect to Azure.
### To connect to Data Lake Analytics
1. Open Visual Studio.
2. Select **Data Lake > Options and Settings**.
3. Select **Sign In**, or **Change User** if someone has signed in, and follow the instructions.
4. Select **OK** to close the Options and Settings dialog.
### To browse your Data Lake Analytics accounts
1. From Visual Studio, open **Server Explorer** by pressing **CTRL+ALT+S**.
2. From **Server Explorer**, expand **Azure**, and then expand **Data Lake Analytics**. You'll see a list of your Data Lake Analytics accounts, if there are any. You can't create Data Lake Analytics accounts from Visual Studio. To create an account, see [Get Started with Azure Data Lake Analytics using Azure portal](data-lake-analytics-get-started-portal.md) or [Get Started with Azure Data Lake Analytics using Azure PowerShell](data-lake-analytics-get-started-powershell.md) (a PowerShell sketch follows this list).
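
For reference, a minimal PowerShell sketch of creating an account; all names, the resource group, and the location are placeholders, and the linked articles have the full walkthrough:

```powershell
# Sketch: create a Data Lake Store account and a Data Lake Analytics account that uses it.
Connect-AzAccount
New-AzDataLakeStoreAccount -ResourceGroupName "myresourcegroup" -Name "myadlsstore" -Location "East US 2"
New-AzDataLakeAnalyticsAccount -ResourceGroupName "myresourcegroup" -Name "myadlaaccount" `
    -Location "East US 2" -DefaultDataLakeStore "myadlsstore"
```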
## Develop U-SQL application
A U-SQL application is mostly a U-SQL script. To learn more about U-SQL, see [Get started with U-SQL](data-lake-analytics-u-sql-get-started.md).
### To create and submit a Data Lake Analytics job
1. Select **File > New > Project**.
2. Select the U-SQL Project type.

3. Select **OK**. Visual Studio creates a solution with a Script.usql file.
4. Enter the following script into the Script.usql file:
6. Switch back to the first U-SQL script and, next to the **Submit** button, specify your Analytics account.
7. From **Solution Explorer**, right-click **Script.usql**, and then select **Build Script**. Verify the results in the Output pane.
8. From **Solution Explorer**, right-click **Script.usql**, and then select **Submit Script**.
9. Verify that the **Analytics Account** is the one where you want to run the job, and then select **Submit**. Submission results and a job link are available in the Data Lake Tools for Visual Studio Results window when the submission is complete.
10. Wait until the job completes successfully. If the job fails, it's most likely missing the source file; see the Prerequisites section of this tutorial. For more troubleshooting information, see [Monitor and troubleshoot Azure Data Lake Analytics jobs](data-lake-analytics-monitor-and-troubleshoot-jobs-tutorial.md).
When the job completes, you'll see the following screen:
### To see the job output
1. From **Server Explorer**, expand **Azure**, expand **Data Lake Analytics**, expand your Data Lake Analytics account, expand **Storage Accounts**, right-click the default Data Lake Storage account, and then select **Explorer**.
2. Double-click **Samples** to open the folder, and then double-click **Outputs**.
3. Double-click **UnsuccessfulResponses.log**.
4. You can also double-click the output file inside the graph view of the job to navigate directly to the output. (A PowerShell alternative for downloading the file is sketched below.)
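
If you prefer to download the output with Azure PowerShell instead of Server Explorer, a minimal sketch (the account name and local destination are placeholders; the source path matches the sample output above):

```powershell
# Sketch: download the job output file from the default Data Lake Storage Gen1 account.
Export-AzDataLakeStoreItem -Account "myadlsaccount" `
    -Path "/Samples/Outputs/UnsuccessfulResponses.log" `
    -Destination "C:\temp\UnsuccessfulResponses.log"
```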
## Next steps
To get started with Data Lake Analytics using different tools, see:
* [Get started with Data Lake Analytics using Azure portal](data-lake-analytics-get-started-portal.md)
* [Get started with Data Lake Analytics using Azure PowerShell](data-lake-analytics-get-started-powershell.md)
* [Get started with Data Lake Analytics using .NET SDK](./data-lake-analytics-get-started-cli.md)
**articles/data-lake-analytics/data-lake-analytics-cicd-overview.md**
---
title: How to set up a CI/CD pipeline for Azure Data Lake Analytics
description: Learn how to set up continuous integration and continuous deployment for Azure Data Lake Analytics.
ms.service: data-lake-analytics
ms.topic: how-to
ms.date: 01/20/2023
ms.custom: devx-track-azurepowershell
---
# How to set up a CI/CD pipeline for Azure Data Lake Analytics
### Build a U-SQL project with the MSBuild command line
First migrate the project and get the NuGet package. Then call the standard MSBuild command line with the following arguments to build your U-SQL project:
### Build a U-SQL database project with the MSBuild command line
To build your U-SQL database project, call the standard MSBuild command line and pass the U-SQL SDK NuGet package reference as another argument. See the following example:
### U-SQL database project build output
The build output for a U-SQL database project is a U-SQL database deployment package, named with the suffix `.usqldbpack`. The `.usqldbpack` package is a zip file that includes all DDL statements in a single U-SQL script in a DDL folder. It includes all **.dlls** and other files for assembly in a temp folder.
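
Because the package is just a zip archive, you can inspect what will be deployed. A minimal PowerShell sketch, assuming a package named `MyDatabase.usqldbpack` in the current folder:

```powershell
# Sketch: list the contents of a U-SQL database deployment package (a renamed zip file).
Copy-Item -Path ".\MyDatabase.usqldbpack" -Destination ".\MyDatabase.zip"
Expand-Archive -Path ".\MyDatabase.zip" -DestinationPath ".\MyDatabase" -Force
Get-ChildItem -Path ".\MyDatabase" -Recurse
```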
## Test table-valued functions and stored procedures
Take the following steps to set up a database deployment task in Azure Pipelines:
|Parameter|Description|Default value|Required|
|---------|-----------|-------------|--------|
|AzureSDKPath|The path to search dependent assemblies in the Azure SDK.|null|true|
|Interactive|Whether or not to use interactive mode for authentication.|false|false|
|ClientId|The Azure AD application ID required for non-interactive authentication.|null|Required for non-interactive authentication.|
|Secret|The secret or password for non-interactive authentication. It should be used only in a trusted and secure environment.|null|Required for non-interactive authentication, or else use SecretFile.|
|SecretFile|The file that saves the secret or password for non-interactive authentication. Make sure to keep it readable only by the current user.|null|Required for non-interactive authentication, or else use Secret.|
|CertFile|The file that saves the X.509 certificate for non-interactive authentication. The default is to use client secret authentication.|null|false|
|JobPrefix|The prefix for database deployment of a U-SQL DDL job.|Deploy_ + DateTime.Now|false|
**articles/data-lake-analytics/data-lake-analytics-cicd-test.md**
---
title: How to test your Azure Data Lake Analytics code
description: 'Learn how to add test cases for U-SQL and extended C# code for Azure Data Lake Analytics.'
ms.service: data-lake-analytics
ms.topic: how-to
ms.date: 01/20/2023
---
# Test your Azure Data Lake Analytics code
For a C# UDO test, make sure to reference the following assemblies, which are needed:
- Microsoft.Analytics.Types
- Microsoft.Analytics.UnitTest
If you reference them through [the NuGet package Microsoft.Azure.DataLake.USQL.Interfaces](https://www.nuget.org/packages/Microsoft.Azure.DataLake.USQL.Interfaces/), make sure you add a NuGet Restore task in your build pipeline.
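
For a command-line build step, a rough sketch of restoring the package and building before the tests run (the solution name is a placeholder):

```powershell
# Sketch: restore NuGet packages and build the test solution before running the unit tests.
nuget restore .\MyUSqlTests.sln
msbuild .\MyUSqlTests.sln /p:Configuration=Release
```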