Commit 0f6e1b4

Merge pull request #724 from cloudsufi/Additional-e2e-wrangler
Additional e2e scenarios for Wrangler

2 parents 61bd71c + c81b5b8

File tree: 5 files changed, +154 −3 lines

Lines changed: 62 additions & 0 deletions
@@ -0,0 +1,62 @@
# Copyright © 2025 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Wrangler - Verify Wrangler Plugin Error scenarios

  @BQ_SOURCE_CSV_TEST @BQ_SOURCE_TEST @Wrangler_Required
  Scenario: Verify Wrangler Plugin error when user selects Precondition Language as SQL
    Given Open Datafusion Project to configure pipeline
    Then Click on the Plus Green Button to import the pipelines
    Then Select the file for importing the pipeline for the plugin "Directive_parse_csv"
    Then Navigate to the properties page of plugin: "BigQueryTable"
    Then Replace input plugin property: "project" with value: "projectId"
    Then Replace input plugin property: "dataset" with value: "dataset"
    Then Replace input plugin property: "table" with value: "bqSourceTable"
    Then Click on the Get Schema button
    Then Validate "BigQueryTable" plugin properties
    Then Close the Plugin Properties page
    Then Navigate to the properties page of plugin: "Wrangler"
    Then Select radio button plugin property: "expressionLanguage" with value: "sql"
    Then Click on the Validate button
    Then Verify that the Plugin Property: "directives" is displaying an in-line error message: "errorMessageSqlError"

  @BQ_SOURCE_CSV_TEST @BQ_SOURCE_TEST @BQ_CONNECTION @Wrangler_Required
  Scenario: Verify Wrangler Plugin error when user provides invalid input field Name
    Given Open Wrangler connections page
    Then Click plugin property: "addConnection" button
    Then Click plugin property: "bqConnectionRow"
    Then Enter input plugin property: "name" with value: "bqConnectionName"
    Then Replace input plugin property: "projectId" with value: "projectId"
    Then Enter input plugin property: "datasetProjectId" with value: "projectId"
    Then Override Service account details in Wrangler connection page if set in environment variables
    Then Click plugin property: "testConnection" button
    Then Verify the test connection is successful
    Then Click plugin property: "connectionCreate" button
    Then Verify the connection with name: "bqConnectionName" is created successfully
    Then Select connection data row with name: "dataset"
    Then Select connection data row with name: "bqSourceTable"
    Then Verify connection datatable is displayed for the data: "bqSourceTable"
    Then Click Create Pipeline button and choose the type of pipeline as: "Batch pipeline"
    Then Verify plugin: "BigQueryTable" node is displayed on the canvas with a timeout of 120 seconds
    Then Navigate to the properties page of plugin: "Wrangler"
    Then Replace input plugin property: "field" with value: "invalid"
    Then Click on the Validate button
    Then Verify that the Plugin Property: "field" is displaying an in-line error message: "errorMessageInvalidInputFieldName"
    Given Open Wrangler connections page
    Then Expand connections of type: "BigQuery"
    Then Open action menu for connection: "bqConnectionName" of type: "BigQuery"
    Then Select action: "Delete" for connection: "bqConnectionName" of type: "BigQuery"
    Then Click plugin property: "Delete" button
    Then Verify connection: "bqConnectionName" of type: "BigQuery" is deleted successfully
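The invalid-field scenario above expects Wrangler to flag a field name that is absent from the input schema, matching the `errorMessageInvalidInputFieldName` text added to the error properties in this commit. A minimal sketch of that check (`validate_field` is a hypothetical helper for illustration, not the plugin's actual code):

```python
def validate_field(field, input_schema):
    """Return the in-line error message when `field` is missing from the
    input schema, or None when the field is valid."""
    if field not in input_schema:
        return f"Field '{field}' must be present in input schema."
    return None

# The e2e scenario sets the field to "invalid", which is not in the schema.
print(validate_field("invalid", ["id", "create_date", "timecolumn"]))
```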
Lines changed: 83 additions & 0 deletions
@@ -0,0 +1,83 @@
# Copyright © 2025 Cask Data, Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License"); you may not
# use this file except in compliance with the License. You may obtain a copy of
# the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS, WITHOUT
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.

@Wrangler
Feature: Wrangler - Run time scenarios for wrangler plugin using macro arguments

  @BQ_SOURCE_CSV_TEST @BQ_SOURCE_TEST @BQ_SINK_TEST @BQ_CONNECTION
  Scenario: To verify User is able to run a pipeline using macro arguments
    Given Open Wrangler connections page
    Then Click plugin property: "addConnection" button
    Then Click plugin property: "bqConnectionRow"
    Then Enter input plugin property: "name" with value: "bqConnectionName"
    Then Replace input plugin property: "projectId" with value: "projectId"
    Then Enter input plugin property: "datasetProjectId" with value: "projectId"
    Then Override Service account details in Wrangler connection page if set in environment variables
    Then Click plugin property: "testConnection" button
    Then Verify the test connection is successful
    Then Click plugin property: "connectionCreate" button
    Then Verify the connection with name: "bqConnectionName" is created successfully
    Then Select connection data row with name: "dataset"
    Then Select connection data row with name: "bqSourceTable"
    Then Verify connection datatable is displayed for the data: "bqSourceTable"
    Then Expand dropdown column: "body" and apply directive: "Parse" as "CSV" with: "Comma" option
    Then Expand dropdown column: "body_3" and apply directive: "FillNullOrEmptyCells" as "shubh"
    Then Enter directive from CLI "rename body_1 new_id"
    Then Enter directive from CLI "quantize body_4 body_q 1:2=20,3:4=40"
    Then Expand dropdown column: "body_4" and apply directive: "ChangeDataType" as "Integer"
    Then Enter directive from CLI "columns-replace s/^new_//g"
    Then Enter directive from CLI "set-headers :abc"
    Then Enter directive from CLI "change-column-case uppercase"
    Then Enter directive from CLI "cleanse-column-names "
    Then Enter directive from CLI "split-to-rows :id '#'"
    Then Click Create Pipeline button and choose the type of pipeline as: "Batch pipeline"
    Then Verify plugin: "BigQueryTable" node is displayed on the canvas with a timeout of 120 seconds
    Then Navigate to the properties page of plugin: "Wrangler"
    Then Click on the Macro button of Property: "field" and set the value to: "fields"
    Then Click on the Macro button of Property: "expressionLanguage" and set the value to: "expressionLanguage"
    Then Click on the Macro button of Property: "precondition" and set the value to: "precondition"
    Then Click on the Macro button of Property: "on-error" and set the value to: "on-error"
    Then Validate "Wrangler" plugin properties
    Then Close the Plugin Properties page
    Then Expand Plugin group in the LHS plugins list: "Sink"
    Then Select plugin: "BigQuery" from the plugins list as: "Sink"
    Then Navigate to the properties page of plugin: "BigQuery2"
    Then Click plugin property: "useConnection"
    Then Click on the Browse Connections button
    Then Select connection: "bqConnectionName"
    Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
    Then Enter input plugin property: "dataset" with value: "dataset"
    Then Enter input plugin property: "table" with value: "bqTargetTable"
    Then Validate "BigQuery" plugin properties
    Then Close the Plugin Properties page
    Then Connect plugins: "Wrangler" and "BigQuery2" to establish connection
    Then Save the pipeline
    Then Deploy the pipeline
    Then Run the Pipeline in Runtime
    Then Enter runtime argument value "expressionLanguage" for key "expressionLanguage"
    Then Enter runtime argument value "fields" for key "fields"
    Then Enter runtime argument value "precondition" for key "precondition"
    Then Enter runtime argument value "onError" for key "on-error"
    Then Run the Pipeline in Runtime with runtime arguments
    Then Wait till pipeline is in running state
    Then Open and capture logs
    Then Verify the pipeline status is "Succeeded"
    Then Close the pipeline logs
    Then Validate The Data From BQ To BQ With Actual And Expected File for: "ExpectedDirective_parse_csv"
    Given Open Wrangler connections page
    Then Expand connections of type: "BigQuery"
    Then Open action menu for connection: "bqConnectionName" of type: "BigQuery"
    Then Select action: "Delete" for connection: "bqConnectionName" of type: "BigQuery"
    Then Click plugin property: "Delete" button
    Then Verify connection: "bqConnectionName" of type: "BigQuery" is deleted successfully
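The macro scenario above binds the Wrangler properties `field`, `expressionLanguage`, `precondition`, and `on-error` to `${...}` placeholders and supplies their values as runtime arguments at execution time. A rough sketch of how that substitution behaves (illustrative only, not CDAP's actual macro resolver):

```python
import re

def resolve_macros(config, runtime_args):
    """Replace every ${key} placeholder in the plugin config with the
    matching runtime argument value."""
    def substitute(value):
        return re.sub(r"\$\{([^}]+)\}", lambda m: runtime_args[m.group(1)], value)
    return {prop: substitute(value) for prop, value in config.items()}

# Properties as macro-enabled by the scenario, with the defaults from
# pluginParameters.properties supplied as runtime arguments.
config = {
    "field": "${fields}",
    "expressionLanguage": "${expressionLanguage}",
    "precondition": "${precondition}",
    "on-error": "${on-error}",
}
args = {
    "fields": "*",
    "expressionLanguage": "JEXL",
    "precondition": "false",
    "on-error": "Fail pipeline",
}
print(resolve_macros(config, args))
```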
Lines changed: 3 additions & 3 deletions
@@ -1,3 +1,3 @@
-{"create_date":"2024","id":"1","timecolumn":"2006-03-18"}
-{"create_date":"2024","id":"2","timecolumn":"2007-03-18"}
-{"create_date":"2024","id":"3","timecolumn":"2008-04-19"}
+{"create_date":"2025","id":"1","timecolumn":"2006-03-18"}
+{"create_date":"2025","id":"2","timecolumn":"2007-03-18"}
+{"create_date":"2025","id":"3","timecolumn":"2008-04-19"}
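The run-time scenario closes by comparing BigQuery output against an expected file of one JSON record per line, like the one updated above. A hedged sketch of such an order-insensitive comparison (function and file names are illustrative, not the e2e framework's actual helpers):

```python
import json

def ndjson_records(path):
    """Parse one JSON object per line and return them in a canonical order,
    so that row ordering in BigQuery output does not affect the comparison."""
    with open(path) as fh:
        records = [json.loads(line) for line in fh if line.strip()]
    return sorted(records, key=lambda r: json.dumps(r, sort_keys=True))

def matches_expected(actual_path, expected_path):
    """True when both files contain the same set of JSON records."""
    return ndjson_records(actual_path) == ndjson_records(expected_path)
```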
Lines changed: 2 additions & 0 deletions
@@ -1 +1,3 @@
 validationSuccessMessage=No errors found.
+errorMessageInvalidInputFieldName=Field 'invalid' must be present in input schema.
+errorMessageSqlError=Directives are not supported for precondition of type SQL

wrangler-transform/src/e2e-test/resources/pluginParameters.properties

Lines changed: 4 additions & 0 deletions
@@ -17,6 +17,10 @@ bqTargetTable=dummy
 sourcePath=example/hello.csv
 gcsSourceBucket=dummy
 testFile=BQtesdata/BigQuery/test1.xlsx
+fields=*
+expressionLanguage=JEXL
+onError=Fail pipeline
+precondition=false
 #bq queries file path

 CreateBQDataQueryFileFxdLen=BQtesdata/BigQuery/BigQueryCreateTableQueryFxdlen.txt
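The macro defaults added above live in a Java-style `.properties` file, which the e2e steps reference by key (e.g. runtime argument value "fields" resolves to `*`). A minimal sketch of how such a file is read, assuming the simple `key=value` subset used here (full `.properties` syntax also allows `:` separators and escapes, which this sketch ignores):

```python
def parse_properties(text):
    """Parse simple key=value lines, skipping blanks and # comments."""
    props = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        props[key.strip()] = value.strip()
    return props

sample = """fields=*
expressionLanguage=JEXL
onError=Fail pipeline
precondition=false"""
print(parse_properties(sample))
```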
