articles/data-factory/author-global-parameters.md (5 additions & 2 deletions)
@@ -6,7 +6,7 @@ ms.subservice: authoring
 ms.topic: conceptual
 author: joshuha-msft
 ms.author: joowen
-ms.date: 05/12/2021
+ms.date: 01/31/2022
 ms.custom: devx-track-azurepowershell
 ---
@@ -30,6 +30,8 @@ After a global parameter is created, you can edit it by clicking the parameter's
 :::image type="content" source="media/author-global-parameters/create-global-parameter-3.png" alt-text="Create global parameters":::
+
+Global parameters are stored as part of the /factory/{factory_name}-arm-template parameters.json.
 
 ## Using global parameters in a pipeline
 
 Global parameters can be used in any [pipeline expression](control-flow-expression-language-functions.md). If a pipeline references another resource, such as a dataset or data flow, you can pass the global parameter value down via that resource's parameters. Global parameters are referenced as `pipeline().globalParameters.<parameterName>`.
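As a small illustration of the reference syntax (a hypothetical helper that just builds the expression string; the parameter name is made up):

```python
def global_parameter_expression(parameter_name: str) -> str:
    """Build the ADF expression string that references a global parameter,
    following the pipeline().globalParameters.<parameterName> syntax."""
    return f"@pipeline().globalParameters.{parameter_name}"

print(global_parameter_expression("environment"))
# @pipeline().globalParameters.environment
```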
+**Cause:** When querying across multiple partitions/pages, the backend service returns the continuation token in JObject format with three properties: **token, min and max key ranges**, for instance, {\"token\":null,\"range\":{\"min\":\"05C1E9AB0DAD76\",\"max\":\"05C1E9CD673398\"}}. Depending on the source data, a query page can contain 0 results even though the continuation token indicates there is more data to fetch.
+
+**Recommendation:** When the continuationToken is non-null, as in the string {\"token\":null,\"range\":{\"min\":\"05C1E9AB0DAD76\",\"max\":\"05C1E9CD673398\"}}, call the queryActivityRuns API again with the continuation token from the previous response. You need to pass the full string to the query API. The activities are returned in the subsequent pages of the query result. Don't stop at an empty array in a page; as long as the full continuationToken value isn't null, continue querying. For more details, refer to the [REST API for pipeline run query](/rest/api/datafactory/activity-runs/query-by-pipeline-run).
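The paging loop described above can be sketched in Python. This is a minimal illustration, assuming a callable that stands in for the queryActivityRuns REST call; the page contents are fabricated:

```python
def fetch_all_activity_runs(query_activity_runs):
    """Collect activity runs across pages, continuing while the
    continuation token is non-null, even when a page is empty."""
    runs = []
    token = None
    while True:
        page = query_activity_runs(continuation_token=token)
        runs.extend(page.get("value", []))  # may be empty on some pages
        token = page.get("continuationToken")
        if not token:  # a null/absent token means no more data
            break
    return runs

def make_fake_query(pages):
    """Illustrative stand-in for the queryActivityRuns API call."""
    it = iter(pages)
    def query(continuation_token=None):
        return next(it)
    return query

# Page 1 is empty but carries a token; page 2 has the data.
fake_query = make_fake_query([
    {"value": [],
     "continuationToken": '{"token":null,"range":{"min":"05C1E9AB0DAD76","max":"05C1E9CD673398"}}'},
    {"value": [{"activityName": "Copy1"}], "continuationToken": None},
])

print(fetch_all_activity_runs(fake_query))  # [{'activityName': 'Copy1'}]
```

The key point the sketch demonstrates: the loop terminates on a null token, never on an empty `value` array.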
 ### Activity stuck issue
 
 When you observe that an activity is running much longer than your normal runs with barely any progress, it may be stuck. You can try canceling it and retrying to see if that helps. If it's a copy activity, you can learn about performance monitoring and troubleshooting from [Troubleshoot copy activity performance](copy-activity-performance-troubleshooting.md); if it's a data flow, learn from the [Mapping data flows performance](concepts-data-flow-performance.md) and tuning guide.
@@ -1051,4 +1060,4 @@ For more troubleshooting help, try these resources:
 * [Stack Overflow forum for Data Factory](https://stackoverflow.com/questions/tagged/azure-data-factory)
 * [Twitter information about Data Factory](https://twitter.com/hashtag/DataFactory)
articles/data-factory/data-flow-troubleshoot-errors.md (7 additions & 7 deletions)
@@ -7,7 +7,7 @@ ms.reviewer: daperlov
 ms.service: data-factory
 ms.subservice: data-flows
 ms.topic: troubleshooting
-ms.date: 10/01/2021
+ms.date: 01/21/2022
 ---
 
 # Common error codes and messages
@@ -33,7 +33,7 @@ This article lists common error codes and messages reported by mapping data flow
 If you're running the data flow in a debug test execution from a debug pipeline run, you might run into this condition more frequently. That's because Azure Data Factory throttles the broadcast timeout to 60 seconds to maintain a faster debugging experience. You can extend the timeout to the 300-second timeout of a triggered run. To do so, you can use the **Debug** > **Use Activity Runtime** option to use the Azure IR defined in your Execute Data Flow pipeline activity.
 
--**Message**: Broadcast join timeout error, you can choose 'Off' of broadcast option in join/exists/lookup transformation to avoid this issue. If you intend to broadcast join option to improve performance then make sure broadcast stream can produce data within 60 secs in debug runs and 300 secs in job runs.
+-**Message**: Broadcast join timeout error, you can choose 'Off' of broadcast option in join/exists/lookup transformation to avoid this issue. If you intend to broadcast join option to improve performance, then make sure broadcast stream can produce data within 60 secs in debug runs and 300 secs in job runs.
 -**Cause**: Broadcast has a default timeout of 60 seconds in debug runs and 300 seconds in job runs. On the broadcast join, the stream chosen for broadcast is too large to produce data within this limit. If a broadcast join isn't used, the default broadcast by dataflow can reach the same limit.
 -**Recommendation**: Turn off the broadcast option or avoid broadcasting large data streams for which the processing can take more than 60 seconds. Choose a smaller stream to broadcast. Large Azure SQL Data Warehouse tables and source files aren't typically good choices. In the absence of a broadcast join, use a larger cluster if this error occurs.
@@ -49,7 +49,7 @@ This article lists common error codes and messages reported by mapping data flow
 -**Recommendation**: Set an alias if you're using a SQL function like min() or max().
 
 ## Error code: DF-Executor-DriverError
--**Message**: INT96 is legacy timestamp type which is not supported by ADF Dataflow. Please consider upgrading the column type to the latest types.
+-**Message**: INT96 is legacy timestamp type, which is not supported by ADF Dataflow. Please consider upgrading the column type to the latest types.
 -**Cause**: Driver error.
 -**Recommendation**: INT96 is a legacy timestamp type that's not supported by Azure Data Factory data flow. Consider upgrading the column type to the latest type.
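For context on why INT96 is considered legacy: Parquet's deprecated INT96 timestamp packs nanoseconds-of-day and a Julian day number into 12 little-endian bytes, instead of using the standard logical timestamp types. A minimal Python sketch of the decoding, as an illustration of the legacy layout (not part of any ADF behavior):

```python
from datetime import datetime, timedelta, timezone

JULIAN_UNIX_EPOCH = 2_440_588  # Julian day number of 1970-01-01

def decode_int96_timestamp(raw: bytes) -> datetime:
    """Decode a legacy Parquet INT96 timestamp:
    bytes 0-7 hold nanoseconds within the day, bytes 8-11 hold the
    Julian day number, both little-endian."""
    assert len(raw) == 12
    nanos = int.from_bytes(raw[:8], "little")
    julian_day = int.from_bytes(raw[8:12], "little")
    days = julian_day - JULIAN_UNIX_EPOCH
    return datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(
        days=days, microseconds=nanos // 1000
    )

# Midnight at the Unix epoch: zero nanoseconds, Julian day 2440588.
raw = (0).to_bytes(8, "little") + JULIAN_UNIX_EPOCH.to_bytes(4, "little")
print(decode_int96_timestamp(raw))  # 1970-01-01 00:00:00+00:00
```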
@@ -59,7 +59,7 @@ This article lists common error codes and messages reported by mapping data flow
59
59
-**Recommendation**: Contact the Microsoft product team for more details about this problem.
-**Message**: The specified source path has either multiple partitioned directories (for e.g.<Source Path>/<Partition Root Directory 1>/a=10/b=20, <Source Path>/<Partition Root Directory 2>/c=10/d=30) or partitioned directory with other file or non-partitioned directory (for example <Source Path>/<Partition Root Directory 1>/a=10/b=20, <Source Path>/Directory 2/file1), remove partition root directory from source path and read it through separate source transformation.
62
+
-**Message**: The specified source path has either multiple partitioned directories (for example,<Source Path>/<Partition Root Directory 1>/a=10/b=20, <Source Path>/<Partition Root Directory 2>/c=10/d=30) or partitioned directory with other file or non-partitioned directory (for example <Source Path>/<Partition Root Directory 1>/a=10/b=20, <Source Path>/Directory 2/file1), remove partition root directory from source path and read it through separate source transformation.
63
63
-**Cause**: The source path has either multiple partitioned directories or a partitioned directory that has another file or non-partitioned directory.
64
64
-**Recommendation**: Remove the partitioned root directory from the source path and read it through separate source transformation.
65
65
@@ -125,7 +125,7 @@ This article lists common error codes and messages reported by mapping data flow
125
125
## Error code: InvalidTemplate
126
126
-**Message**: The pipeline expression cannot be evaluated.
127
127
-**Cause**: The pipeline expression passed in the Data Flow activity isn't being processed correctly because of a syntax error.
128
-
-**Recommendation**: Check your activity in activity monitoring to verify the expression.
128
+
-**Recommendation**: Check data flow activity name. Check expressions in activity monitoring to verify the expressions. For example, data flow activity name can not have a space or a hyphen.
129
129
130
130
## Error code: 2011
131
131
-**Message**: The activity was running on Azure Integration Runtime and failed to decrypt the credential of data store or compute connected via a Self-hosted Integration Runtime. Please check the configuration of linked services associated with this activity, and make sure to use the proper integration runtime type.
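The naming rule above can be pre-checked before publishing. A hypothetical validator: the only rule stated here is no spaces or hyphens, and the stricter letters/digits/underscores pattern below is an assumption for illustration:

```python
import re

# Assumed pattern: letters, digits, and underscores only (stricter than the
# documented rule, which only forbids spaces and hyphens).
_VALID_NAME = re.compile(r"^[A-Za-z0-9_]+$")

def is_valid_dataflow_activity_name(name: str) -> bool:
    """Reject activity names containing spaces, hyphens, or other punctuation."""
    return bool(_VALID_NAME.match(name))

print(is_valid_dataflow_activity_name("TransformSales"))   # True
print(is_valid_dataflow_activity_name("Transform-Sales"))  # False
print(is_valid_dataflow_activity_name("Transform Sales"))  # False
```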
@@ -248,7 +248,7 @@ This article lists common error codes and messages reported by mapping data flow
 -**Message**: Blob storage staging properties should be specified.
 -**Cause**: An invalid staging configuration is provided in the Hive.
--**Recommendation**: Please check if the account key, account name and container are set properly in the related Blob linked service which is used as staging.
+-**Recommendation**: Please check if the account key, account name, and container are set properly in the related Blob linked service that is used as staging.
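A tiny pre-flight check for the staging settings named in the recommendation. This is a hypothetical helper, and the property names are illustrative placeholders, not the actual linked service schema:

```python
# Illustrative property names for the Blob staging configuration.
REQUIRED_STAGING_KEYS = ("accountName", "accountKey", "container")

def missing_staging_properties(config: dict) -> list:
    """Return the required Blob staging properties that are absent or empty."""
    return [k for k in REQUIRED_STAGING_KEYS if not config.get(k)]

print(missing_staging_properties({"accountName": "mystorage",
                                  "container": "staging"}))
# ['accountKey']
```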