Commit d5b19d5 (parent 5514139)

Additional e2e tests for bq sink

File tree: 4 files changed, +260 -0 lines

src/e2e-test/features/bigquery/sink/BigQuerySinkError.feature

14 additions, 0 deletions

@@ -62,3 +62,17 @@ Feature: BigQuery sink - Validate BigQuery sink plugin error scenarios
     Then Enter BigQuery sink property table name
     Then Enter BigQuery property temporary bucket name "bqInvalidTemporaryBucket"
     Then Verify the BigQuery validation error message for invalid property "bucket"
+
+  @BQ_SINK_TEST
+  Scenario: Verify BigQuery Sink properties validation errors for incorrect value of reference name
+    Given Open Datafusion Project to configure pipeline
+    When Sink is BigQuery
+    Then Open BigQuery sink properties
+    And Enter input plugin property: "referenceName" with value: "bqInvalidReferenceName"
+    Then Enter BigQuery property projectId "projectId"
+    Then Enter BigQuery property datasetProjectId "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter BigQuery property dataset "dataset"
+    Then Enter BigQuery sink property table name
+    Then Click on the Validate button
+    Then Verify that the Plugin Property: "referenceName" is displaying an in-line error message: "errorMessageIncorrectReferenceName"
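The scenario above expects an in-line error when the reference name is invalid. Reference-name checking of this kind usually reduces to a whitelist regex; the class name and the exact allowed character set below are assumptions for the sketch, not CDAP's actual validation rule.

```java
import java.util.regex.Pattern;

// Hypothetical sketch of reference-name validation; the permitted
// character set here is an assumption, not CDAP's documented rule.
public class ReferenceNameValidator {
    private static final Pattern VALID = Pattern.compile("[A-Za-z0-9_.$-]+");

    public static boolean isValid(String referenceName) {
        return referenceName != null && VALID.matcher(referenceName).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValid("BQSinkReferenceName")); // true
        System.out.println(isValid("invalid ref#name"));    // false: space and '#'
    }
}
```

A value like "bqInvalidReferenceName" in the test resources would be configured to contain one of the disallowed characters so the plugin surfaces the in-line error.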

src/e2e-test/features/bigquery/sink/GCSToBigQuery_WithMacro.feature renamed to src/e2e-test/features/bigquery/sink/BigQuerySink_WithMacro.feature

87 additions, 0 deletions

@@ -77,3 +77,90 @@ Feature: BigQuery sink - Verification of GCS to BigQuery successful data transfer
     Then Verify the pipeline status is "Succeeded"
     Then Get count of no of records transferred to target BigQuery Table
     Then Validate the cmek key "cmekBQ" of target BigQuery table if cmek is enabled
+
+  @BQ_INSERT_INT_SOURCE_TEST @BQ_SINK_TEST @BigQuery_Sink_Required
+  Scenario: Validate successful records transfer from BigQuery to BigQuery with macro arguments for Advanced and Auto Create sections
+    Given Open Datafusion Project to configure pipeline
+    When Source is BigQuery
+    When Sink is BigQuery
+    Then Open BigQuery source properties
+    Then Enter the BigQuery source mandatory properties
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Open BigQuery sink properties
+    Then Enter BigQuery property reference name
+    Then Enter BigQuery property projectId "projectId"
+    Then Enter BigQuery property datasetProjectId "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter BigQuery property dataset "dataset"
+    Then Enter BigQuery sink property table name
+    Then Enter BiqQuery property encryption key name "cmekBQ" if cmek is enabled
+    Then Toggle BigQuery sink property truncateTable to true
+    Then Toggle BigQuery sink property updateTableSchema to true
+    Then Click on the Macro button of Property: "operation" and set the value to: "BqOperationType"
+    Then Click on the Macro button of Property: "relationTableKey" and set the value to: "tableKey"
+    Then Click on the Macro button of Property: "partitioningType" and set the value to: "BqPartioningType"
+    Then Click on the Macro button of Property: "rangeStart" and set the value to: "BqRangeStart"
+    Then Click on the Macro button of Property: "rangeEnd" and set the value to: "BqRangeEnd"
+    Then Click on the Macro button of Property: "rangeInterval" and set the value to: "BqRangeInterval"
+    Then Click on the Macro button of Property: "partitionByField" and set the value to: "BqPartitionByField"
+    Then Click on the Macro button of Property: "clusteringOrder" and set the value to: "BqClusteringOrder"
+    Then Validate "BigQuery2" plugin properties
+    Then Close the BigQuery properties
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Save the pipeline
+    Then Deploy the pipeline
+    Then Click on the Runtime Arguments Dropdown button
+    Then Enter runtime argument value "bqOperationType" for key "BqOperationType"
+    Then Enter runtime argument value "TableKey" for key "tableKey"
+    Then Enter runtime argument value "bqPartioningType" for key "BqPartioningType"
+    Then Enter runtime argument value "rangeStartValue" for key "BqRangeStart"
+    Then Enter runtime argument value "rangeEndValue" for key "BqRangeEnd"
+    Then Enter runtime argument value "rangeIntervalValue" for key "BqRangeInterval"
+    Then Enter runtime argument value "partitionByFieldValue" for key "BqPartitionByField"
+    Then Enter runtime argument value "BqclusterValue" for key "BqClusteringOrder"
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Get count of no of records transferred to target BigQuery Table
+    Then Validate the cmek key "cmekBQ" of target BigQuery table if cmek is enabled
+
+  @BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST @BigQuery_Sink_Required
+  Scenario: Validate successful records transfer from BigQuery to BigQuery with macro arguments for partition field and partition filter
+    Given Open Datafusion Project to configure pipeline
+    When Source is BigQuery
+    When Sink is BigQuery
+    Then Open BigQuery source properties
+    Then Enter the BigQuery source mandatory properties
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Open BigQuery sink properties
+    Then Enter BigQuery property reference name
+    Then Enter BigQuery property projectId "projectId"
+    Then Enter BigQuery property datasetProjectId "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter BigQuery property dataset "dataset"
+    Then Enter BigQuery sink property table name
+    Then Enter BiqQuery property encryption key name "cmekBQ" if cmek is enabled
+    Then Toggle BigQuery sink property truncateTable to true
+    Then Toggle BigQuery sink property updateTableSchema to true
+    And Select radio button plugin property: "operation" with value: "upsert"
+    Then Click on the Add Button of the property: "relationTableKey" with value:
+      | bqTableKey |
+    Then Click on the Macro button of Property: "partitionByField" and set the value to: "BqPartitionByField"
+    Then Click on the Macro button of Property: "partitionFilter" and set the value to: "BqPartitionFilter"
+    Then Validate "BigQuery2" plugin properties
+    Then Close the BigQuery properties
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Save the pipeline
+    Then Deploy the pipeline
+    Then Click on the Runtime Arguments Dropdown button
+    Then Enter runtime argument value "bqPartitionFieldTime" for key "BqPartitionByField"
+    Then Enter runtime argument value "bqPartitionFilter" for key "BqPartitionFilter"
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Get count of no of records transferred to target BigQuery Table
+    Then Validate the cmek key "cmekBQ" of target BigQuery table if cmek is enabled
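The macro scenarios above set plugin properties to placeholders such as `${BqOperationType}` at configure time and supply the concrete values as runtime arguments at run time. As a rough illustration of that substitution step (the class and method names are hypothetical, not CDAP's macro-evaluation API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: resolve ${macro} placeholders in a plugin property value against
// the pipeline's runtime arguments. Unknown macros are left untouched.
public class MacroResolver {
    private static final Pattern MACRO = Pattern.compile("\\$\\{([^}]+)}");

    public static String resolve(String value, Map<String, String> runtimeArgs) {
        Matcher m = MACRO.matcher(value);
        StringBuffer out = new StringBuffer();
        while (m.find()) {
            String replacement = runtimeArgs.getOrDefault(m.group(1), m.group(0));
            m.appendReplacement(out, Matcher.quoteReplacement(replacement));
        }
        m.appendTail(out);
        return out.toString();
    }

    public static void main(String[] args) {
        Map<String, String> runtimeArgs = new HashMap<>();
        runtimeArgs.put("BqOperationType", "Insert"); // value behind bqOperationType
        runtimeArgs.put("BqRangeStart", "1");         // value behind rangeStartValue
        System.out.println(resolve("${BqOperationType}", runtimeArgs)); // Insert
        System.out.println(resolve("${BqRangeStart}", runtimeArgs));    // 1
    }
}
```

The test keys ("bqOperationType", "rangeStartValue", and so on) name entries in pluginParameters.properties whose values are entered as the runtime arguments.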

src/e2e-test/features/bigquery/sink/BigQueryToBigQuerySink.feature

151 additions, 0 deletions

@@ -345,3 +345,154 @@ Feature: BigQuery sink - Verification of BigQuery to BigQuery successful data transfer
     Then Close the pipeline logs
     Then Verify the pipeline status is "Succeeded"
     Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
+
+  @BQ_INSERT_SOURCE_TEST @BQ_SINK_TEST @EXISTING_BQ_CONNECTION @BigQuery_Sink_Required @ITN_TEST
+  Scenario Outline: Validate successful records transfer from BigQuery to BigQuery with different time partitioning type options
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Click plugin property: "switch-useConnection"
+    Then Click on the Browse Connections button
+    Then Select connection: "bqConnectionName"
+    Then Click on the Browse button inside plugin properties
+    Then Select connection data row with name: "dataset"
+    Then Select connection data row with name: "bqSourceTable"
+    Then Wait till connection data loading completes with a timeout of 60 seconds
+    Then Verify input plugin property: "dataset" contains value: "dataset"
+    Then Verify input plugin property: "table" contains value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Validate "BigQuery" plugin properties
+    And Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "BigQuery2"
+    Then Click plugin property: "useConnection"
+    Then Click on the Browse Connections button
+    Then Select connection: "bqConnectionName"
+    Then Enter input plugin property: "referenceName" with value: "BQSinkReferenceName"
+    Then Click on the Browse button inside plugin properties
+    Then Click SELECT button inside connection data row with name: "dataset"
+    Then Wait till connection data loading completes with a timeout of 60 seconds
+    Then Verify input plugin property: "dataset" contains value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqTargetTable"
+    And Select radio button plugin property: "operation" with value: "upsert"
+    And Select radio button plugin property: "timePartitioningType" with value: "<options>"
+    Then Click on the Add Button of the property: "relationTableKey" with value:
+      | TableKey |
+    Then Validate "BigQuery" plugin properties
+    And Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Close the pipeline logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Validate the values of records transferred to BQ sink is equal to the values from source BigQuery table
+    Examples:
+      | options |
+      | DAY     |
+      | HOUR    |
+      | MONTH   |
+      | YEAR    |
+
+  @BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST
+  Scenario: Validate successful records transfer from BigQuery to BigQuery with BQ Job Labels with Key and Value pairs
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    And Enter input plugin property: "referenceName" with value: "Reference"
+    And Replace input plugin property: "project" with value: "projectId"
+    And Enter input plugin property: "datasetProject" with value: "projectId"
+    And Replace input plugin property: "dataset" with value: "dataset"
+    Then Override Service account details if set in environment variables
+    And Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Validate "BigQuery" plugin properties
+    And Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "BigQuery2"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqTargetTable"
+    Then Click plugin property: "truncateTable"
+    Then Click plugin property: "updateTableSchema"
+    Then Click on the Add Button of the property: "jobLabels" with value:
+      | jobLabelKey |
+    Then Click on the Add Button of the property: "jobLabels" with value:
+      | jobLabelValue |
+    Then Enter BigQuery sink property partition field "bqPartitionFieldTime"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+
+  @BQ_SOURCE_DATATYPE_TEST @BQ_SINK_TEST
+  Scenario: Validate successful records transfer from BigQuery to BigQuery with Partition Filter
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect plugins: "BigQuery" and "BigQuery2" to establish connection
+    Then Navigate to the properties page of plugin: "BigQuery"
+    And Enter input plugin property: "referenceName" with value: "Reference"
+    And Replace input plugin property: "project" with value: "projectId"
+    And Enter input plugin property: "datasetProject" with value: "projectId"
+    And Replace input plugin property: "dataset" with value: "dataset"
+    Then Override Service account details if set in environment variables
+    And Enter input plugin property: "table" with value: "bqSourceTable"
+    Then Click on the Get Schema button
+    Then Validate "BigQuery" plugin properties
+    And Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "BigQuery2"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqTargetTable"
+    Then Click plugin property: "truncateTable"
+    Then Click plugin property: "updateTableSchema"
+    And Select radio button plugin property: "operation" with value: "upsert"
+    Then Click on the Add Button of the property: "relationTableKey" with value:
+      | bqTableKey |
+    Then Enter input plugin property: "partitionFilter" with value: "bqPartitionFilter"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
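The upsert scenarios above pair the "upsert" operation with a relation table key (resolved from `bqTableKey=unique_key`). Conceptually, an upsert keyed on those columns corresponds to a BigQuery MERGE statement joining target and staging data on the key columns. The sketch below only builds the statement skeleton; the builder class, table names, and elided SET/INSERT clauses are illustrative, not the plugin's actual implementation.

```java
import java.util.List;

// Sketch: derive the MERGE join condition for an upsert from the
// relation table key columns. T = target table, S = staged source rows.
public class UpsertQueryBuilder {
    public static String buildMerge(String target, String staging, List<String> relationTableKey) {
        StringBuilder on = new StringBuilder();
        for (String key : relationTableKey) {
            if (on.length() > 0) on.append(" AND ");
            on.append("T.").append(key).append(" = S.").append(key);
        }
        // SET / INSERT column lists are elided; a real builder derives
        // them from the output schema.
        return "MERGE `" + target + "` T USING `" + staging + "` S ON " + on
            + " WHEN MATCHED THEN UPDATE SET ..."
            + " WHEN NOT MATCHED THEN INSERT ...";
    }

    public static void main(String[] args) {
        System.out.println(buildMerge("dataset.bqTargetTable", "dataset.staging",
            List.of("unique_key")));
    }
}
```

With multiple key columns the conditions are ANDed, which is why the feature steps add each key as a separate row under "relationTableKey".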

src/e2e-test/resources/pluginParameters.properties

8 additions, 0 deletions

@@ -188,6 +188,9 @@ bqSourceSchema=[{"key":"Id","value":"long"},{"key":"Value","value":"long"},{"key
 bqPartitionSourceSchema=[{"key":"transaction_id","value":"long"},{"key":"transaction_uid","value":"string"},\
 {"key":"transaction_date","value":"date"}]
 bqMandatoryProperties=referenceName, dataset, table
+jobLabelKey=transaction_uid
+jobLabelValue=transaction_uid:redis
+jsonStringValue=transaction_uid
 bqIncorrectProjectId=incorrectprojectid
 bqIncorrectDatasetProjectId=incorrectdatasetprojectid
 bqIncorrectFormatProjectId=INCORRECTFORMAT
@@ -201,7 +204,10 @@ bqFuturePartitionEndDate=2099-10-02
 bqTruncateTableTrue=True
 bqUpdateTableSchemaTrue=True
 clusterValue=transaction_date
+BqclusterValue=Name
 TableKey=PersonID
+bqPartioningType=INTEGER
+bqPartitionFilter=transaction_uid
 bqSourceTable=dummy
 bqCreateTableQueryFile=testdata/BigQuery/BigQueryCreateTableQuery.txt
 bqInsertDataQueryFile=testdata/BigQuery/BigQueryInsertDataQuery.txt
@@ -215,6 +221,7 @@ bqSourceSchemaDatatype=[{"key":"transaction_info","value":"boolean"},{"key":"tra
 {"key":"difference","value":"array"},{"key":"Userdata","value":"record"}]
 bqPartitionField=Month_of_Joining
 bqPartitionFieldTime=transaction_date
+bqTableKey=unique_key
 bqRangeStart=1
 bqRangeEnd=10
 bqRangeInterval=2
@@ -245,6 +252,7 @@ rangeIntervalValue=1
 partitionByFieldValue=ID
 bqPartitionFieldDateTime=transaction_dt
 bqPartitionFieldTimeStamp=updated_on
+bqOperationType=Insert
 bqSourceTable2=dummy
 dedupeBy=DESC
 TableKeyDedupe=Name
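The quoted keys in the feature steps ("bqPartitionFilter", "jobLabelKey", and so on) are resolved from pluginParameters.properties at runtime. A minimal stand-alone sketch of that lookup using the standard java.util.Properties loader (the class here is illustrative, not the framework's actual utility):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Sketch: load plugin parameter key/value pairs in .properties format.
public class PluginParameters {
    public static Properties load(String content) throws IOException {
        Properties props = new Properties();
        props.load(new StringReader(content));
        return props;
    }

    public static void main(String[] args) throws IOException {
        // A fragment of the keys added in this commit.
        String fragment = String.join("\n",
            "jobLabelKey=transaction_uid",
            "jobLabelValue=transaction_uid:redis",
            "bqTableKey=unique_key",
            "bqOperationType=Insert");
        Properties props = load(fragment);
        System.out.println(props.getProperty("bqTableKey")); // unique_key
    }
}
```

Note that in .properties syntax the first unescaped '=' or ':' ends the key, so "jobLabelValue=transaction_uid:redis" parses with the colon kept inside the value.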
