articles/data-lake-analytics/data-lake-analytics-data-lake-tools-get-started.md (2 additions, 2 deletions)
@@ -96,7 +96,7 @@ After the job submission, the **Job view** tab opens to show the job progress.
* **MetaData Operations** shows all the actions that were taken on the U-SQL catalog.
* **Data** shows all the inputs and outputs.
* **State History** shows the timeline and state details.
- * **AU Analysis** shows how many AUs were used in the job and lets you explore simulations of different AU allocation strategies.
+ * **AU Analysis** shows how many AUs (analytics units) were used in the job and lets you explore simulations of different AU allocation strategies.
* **Diagnostics** provides an advanced analysis for job execution and performance optimization.
@@ -117,7 +117,7 @@ To see the latest job status and refresh the screen, select **Refresh**.
1. In **Data Lake Analytics Explorer**, browse to the job you submitted.
- 1. Click the **Data** tab in your job.
+ 1. Select the **Data** tab in your job.
1. In the **Job Outputs** tab, select the `"/data.csv"` file.
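If you'd rather pull the output file down programmatically than open it in the **Data** tab, the sketch below shows one way to do it with the `azure-datalake-store` Python package. The tenant ID, store name, and local path are placeholders, not values from the article.

```python
# Minimal sketch: download the job output file from the default Data Lake Store account.
# <tenant-id> and <adls-account> are placeholders; lib.auth opens an interactive
# Azure AD sign-in prompt.
from azure.datalake.store import core, lib, multithread

token = lib.auth(tenant_id="<tenant-id>")
adls = core.AzureDLFileSystem(token, store_name="<adls-account>")

# Copy /data.csv from the store to data.csv in the current working directory.
multithread.ADLDownloader(adls, lpath="data.csv", rpath="/data.csv",
                          nthreads=4, overwrite=True)
```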
This article describes how to manage Azure Data Lake Analytics accounts, data sources, users, and jobs by using the Azure portal.

## Manage Data Lake Analytics accounts

### Create an account

1. Sign in to the [Azure portal](https://portal.azure.com).
- 2. Click **Create a resource** > **Intelligence + analytics** > **Data Lake Analytics**.
+ 2. Select **Create a resource** and search for **Data Lake Analytics**.
3. Select values for the following items:
   1. **Name**: The name of the Data Lake Analytics account.
   2. **Subscription**: The Azure subscription used for the account.
   3. **Resource Group**: The Azure resource group in which to create the account.
   4. **Location**: The Azure datacenter for the Data Lake Analytics account.
   5. **Data Lake Store**: The default store to be used for the Data Lake Analytics account. The Azure Data Lake Store account and the Data Lake Analytics account must be in the same location.
- 4. Click **Create**.
+ 4. Select **Create**.
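For automation, the same account creation can be scripted with the Azure Python SDK. This is a minimal sketch assuming the `azure-mgmt-datalake-analytics` package and a service principal; model and operation names vary slightly between SDK versions, so verify them against the version you install.

```python
# Minimal sketch: create a Data Lake Analytics account with its default Data Lake Store.
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datalake.analytics.account import DataLakeAnalyticsAccountManagementClient
from azure.mgmt.datalake.analytics.account.models import (
    DataLakeAnalyticsAccount,
    DataLakeStoreAccountInformation,
)

credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<client-secret>", tenant="<tenant-id>")
adla_account_client = DataLakeAnalyticsAccountManagementClient(
    credentials, "<subscription-id>")

# Name, resource group, location, and default store mirror the portal fields above.
adla_account_client.account.create(
    "<resource-group>",
    "<adla-account-name>",
    DataLakeAnalyticsAccount(
        location="eastus2",
        default_data_lake_store_account="<adls-account-name>",
        data_lake_store_accounts=[
            DataLakeStoreAccountInformation(name="<adls-account-name>")
        ],
    ),
).wait()
```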
### Delete a Data Lake Analytics account

Before you delete a Data Lake Analytics account, delete its default Data Lake Store account.

1. In the Azure portal, go to your Data Lake Analytics account.
- 2. Click **Delete**.
+ 2. Select **Delete**.
3. Type the account name.
- 4. Click **Delete**.
+ 4. Select **Delete**.
- <!-- ################################ -->
- <!-- ################################ -->

## Manage data sources
@@ -54,11 +50,11 @@ You can use Data Explorer to browse data sources and perform basic file manageme
### Add a data source

1. In the Azure portal, go to your Data Lake Analytics account.
- 2. Click **Data Sources**.
- 3. Click **Add Data Source**.
+ 2. Select **Data explorer**.
+ 3. Select **Add Data Source**.

* To add a Data Lake Store account, you need the account name and access to the account to be able to query it.
- * To add Azure Blob storage, you need the storage account and the account key. To find them, go to the storage account in the portal.
+ * To add Azure Blob storage, you need the storage account and the account key. To find them, go to the storage account in the portal and select **Access keys**.
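Data sources can also be attached from code. The sketch below uses the same account management client as the account-creation example and assumes a `storage_accounts.add` operation with an `AddStorageAccountParameters` model; these names mirror the service's REST API, but they may differ in your SDK version.

```python
# Minimal sketch: attach an Azure Blob storage account as an additional data source.
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datalake.analytics.account import DataLakeAnalyticsAccountManagementClient
from azure.mgmt.datalake.analytics.account.models import AddStorageAccountParameters

credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<client-secret>", tenant="<tenant-id>")
adla_account_client = DataLakeAnalyticsAccountManagementClient(
    credentials, "<subscription-id>")

# The storage account name and key come from the storage account's Access keys blade.
adla_account_client.storage_accounts.add(
    "<resource-group>", "<adla-account-name>", "<storage-account>",
    AddStorageAccountParameters(access_key="<storage-key>"))
```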
## Set up firewall rules
@@ -69,20 +65,20 @@ If other Azure services, like Azure Data Factory or VMs, connect to the Data Lak
### Set up a firewall rule

1. In the Azure portal, go to your Data Lake Analytics account.
- 2. On the menu on the left, click **Firewall**.
+ 2. On the menu on the left, select **Firewall**.
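Firewall rules can be managed from code as well. This sketch assumes a `firewall_rules.create_or_update` operation and a `CreateOrUpdateFirewallRuleParameters` model on the account management client; both mirror the REST API, so check the names against your SDK version.

```python
# Minimal sketch: allow a single client IP address through the account firewall.
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datalake.analytics.account import DataLakeAnalyticsAccountManagementClient
from azure.mgmt.datalake.analytics.account.models import CreateOrUpdateFirewallRuleParameters

credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<client-secret>", tenant="<tenant-id>")
adla_account_client = DataLakeAnalyticsAccountManagementClient(
    credentials, "<subscription-id>")

adla_account_client.firewall_rules.create_or_update(
    "<resource-group>", "<adla-account-name>", "allow-my-ip",
    CreateOrUpdateFirewallRuleParameters(
        start_ip_address="203.0.113.10", end_ip_address="203.0.113.10"))
```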
## Add a new user
You can use the **Add User Wizard** to easily provision new Data Lake users.

1. In the Azure portal, go to your Data Lake Analytics account.
- 2. On the left, under **Getting Started**, click **Add User Wizard**.
- 3. Select a user, and then click **Select**.
- 4. Select a role, and then click **Select**. To set up a new developer to use Azure Data Lake, select the **Data Lake Analytics Developer** role.
- 5. Select the access control lists (ACLs) for the U-SQL databases. When you're satisfied with your choices, click **Select**.
- 6. Select the ACLs for files. For the default store, don't change the ACLs for the root folder "/" and for the /system folder. Click **Select**.
- 7. Review all your selected changes, and then click **Run**.
- 8. When the wizard is finished, click **Done**.
+ 2. On the left, under **Getting Started**, select **Add User Wizard**.
+ 3. Select a user, and then select **Select**.
+ 4. Select a role, and then select **Select**. To set up a new developer to use Azure Data Lake, select the **Data Lake Analytics Developer** role.
+ 5. Select the access control lists (ACLs) for the U-SQL databases. When you're satisfied with your choices, select **Select**.
+ 6. Select the ACLs for files. For the default store, don't change the ACLs for the root folder "/" and for the /system folder. Select **Select**.
+ 7. Review all your selected changes, and then select **Run**.
+ 8. When the wizard is finished, select **Done**.
## Manage Azure role-based access control
@@ -94,6 +90,7 @@ The standard Azure roles have the following capabilities:
* **Reader**: Can monitor jobs.

Use the Data Lake Analytics Developer role to enable U-SQL developers to use the Data Lake Analytics service. You can use the Data Lake Analytics Developer role to:
+
* Submit jobs.
* Monitor job status and the progress of jobs submitted by any user.
* See the U-SQL scripts from jobs submitted by any user.
@@ -115,46 +112,48 @@ Use the Data Lake Analytics Developer role to enable U-SQL developers to use the
> If a user or a security group needs to submit jobs, they also need permission on the store account. For more information, see [Secure data stored in Data Lake Store](../data-lake-store/data-lake-store-secure-data.md).
>
- <!-- ################################ -->
- <!-- ################################ -->
-

## Manage jobs
### Submit a job
1. In the Azure portal, go to your Data Lake Analytics account.
- 2. Click **New Job**. For each job, configure:
+ 2. Select **New Job**. For each job, configure:

   1. **Job Name**: The name of the job.
-  2. **Priority**: Lower numbers have higher priority. If two jobs are queued, the one with the lower priority value runs first.
-  3. **Parallelism**: The maximum number of compute processes to reserve for this job.
+  2. **Priority**: This is under **More options**. Lower numbers have higher priority. If two jobs are queued, the one with the lower priority value runs first.
+  3. **AUs**: The maximum number of analytics units (AUs), or compute processes, to reserve for this job.
+  4. **Runtime**: Also under **More options**. Select the Default runtime unless you've received a custom runtime.
+
+ 3. Add your script.

- 3. Click **Submit Job**.
+ 4. Select **Submit Job**.
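The same settings (job name, priority, AUs, script) map onto the job submission API. Below is a minimal sketch assuming the `azure-mgmt-datalake-analytics` job client and a service principal; the account name and script are placeholders, and the method names follow the public Python sample, so confirm them for your SDK version.

```python
# Minimal sketch: submit a small U-SQL job with an explicit priority and AU count.
import uuid

from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datalake.analytics.job import DataLakeAnalyticsJobManagementClient
from azure.mgmt.datalake.analytics.job.models import JobInformation, USqlJobProperties

credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<client-secret>", tenant="<tenant-id>")
adla_job_client = DataLakeAnalyticsJobManagementClient(
    credentials, "azuredatalakeanalytics.net")

script = """
@rows =
    SELECT * FROM (VALUES ("Contoso", 1500.0), ("Fabrikam", 900.0))
           AS T(customer, amount);
OUTPUT @rows TO "/output/totals.csv" USING Outputters.Csv();
"""

adla_job_client.job.create(
    "<adla-account-name>",
    str(uuid.uuid4()),                 # job ID
    JobInformation(
        name="Demo job",               # Job Name
        type="USql",
        priority=1000,                 # lower number = higher priority
        degree_of_parallelism=1,       # AUs
        properties=USqlJobProperties(script=script),
    ),
)
```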
### Monitor jobs
1. In the Azure portal, go to your Data Lake Analytics account.
- 2. Click **View All Jobs**. A list of all the active and recently finished jobs in the account is shown.
- 3. Optionally, click **Filter** to help you find the jobs by **Time Range**, **Job Name**, and **Author** values.
+ 2. Select **View All Jobs** at the top of the page. A list of all the active and recently finished jobs in the account is shown.
+ 3. Optionally, select **Filter** to help you find the jobs by **Time Range**, **Status**, **Job Name**, **Job ID**, **Pipeline name** or **Pipeline ID**, **Recurrence name** or **Recurrence ID**, and **Author** values.
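The same job list is available programmatically. A minimal sketch with the job client follows; `job.list` matches the public Python sample, but verify it against your SDK version.

```python
# Minimal sketch: list recent jobs for an account and print their state and result.
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datalake.analytics.job import DataLakeAnalyticsJobManagementClient

credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<client-secret>", tenant="<tenant-id>")
adla_job_client = DataLakeAnalyticsJobManagementClient(
    credentials, "azuredatalakeanalytics.net")

for job in adla_job_client.job.list("<adla-account-name>"):
    print(job.name, job.state, job.result)
```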
### Monitoring pipeline jobs
+
Jobs that are part of a pipeline work together, usually sequentially, to accomplish a specific scenario. For example, you can have a pipeline that cleans, extracts, transforms, and aggregates usage data for customer insights. Pipeline jobs are identified using the "Pipeline" property when the job was submitted. Jobs scheduled using ADF V2 will automatically have this property populated.

To view a list of U-SQL jobs that are part of pipelines:

1. In the Azure portal, go to your Data Lake Analytics accounts.
- 2. Click **Job Insights**. The "All Jobs" tab will be defaulted, showing a list of running, queued, and ended jobs.
- 3. Click the **Pipeline Jobs** tab. A list of pipeline jobs will be shown along with aggregated statistics for each pipeline.
+ 2. Select **Job Insights**. The **All Jobs** tab is shown by default, with a list of running, queued, and ended jobs.
+ 3. Select the **Pipeline Jobs** tab. A list of pipeline jobs is shown along with aggregated statistics for each pipeline.
### Monitoring recurring jobs
+
A recurring job is one that has the same business logic but uses different input data every time it runs. Ideally, recurring jobs should always succeed and have relatively stable execution times; monitoring these behaviors helps ensure that the job is healthy. Recurring jobs are identified using the "Recurrence" property. Jobs scheduled using ADF V2 will automatically have this property populated.

To view a list of U-SQL jobs that are recurring:

1. In the Azure portal, go to your Data Lake Analytics accounts.
- 2. Click **Job Insights**. The "All Jobs" tab will be defaulted, showing a list of running, queued, and ended jobs.
- 3. Click the **Recurring Jobs** tab. A list of recurring jobs will be shown along with aggregated statistics for each recurring job.
+ 2. Select **Job Insights**. The **All Jobs** tab is shown by default, with a list of running, queued, and ended jobs.
+ 3. Select the **Recurring Jobs** tab. A list of recurring jobs is shown along with aggregated statistics for each recurring job.
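Pipeline and recurrence aggregates can also be read through the job service. The sketch below assumes `pipeline.list` and `recurrence.list` operation groups on the job client and the field names shown; they mirror the REST API, so treat them as assumptions and verify them against your SDK version.

```python
# Minimal sketch: list pipeline and recurrence aggregates for an account.
from azure.common.credentials import ServicePrincipalCredentials
from azure.mgmt.datalake.analytics.job import DataLakeAnalyticsJobManagementClient

credentials = ServicePrincipalCredentials(
    client_id="<app-id>", secret="<client-secret>", tenant="<tenant-id>")
adla_job_client = DataLakeAnalyticsJobManagementClient(
    credentials, "azuredatalakeanalytics.net")

for p in adla_job_client.pipeline.list("<adla-account-name>"):
    print("pipeline:", p.pipeline_name, p.num_jobs_succeeded, p.num_jobs_failed)

for r in adla_job_client.recurrence.list("<adla-account-name>"):
    print("recurrence:", r.recurrence_name, r.num_jobs_succeeded, r.num_jobs_failed)
```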