@@ -95,7 +95,7 @@ Feature: GCS sink - Verification of GCS Sink plugin
9595 | parquet | application/octet-stream |
9696 | orc | application/octet-stream |
9797
98- @GCS_SINK_TEST @BQ_SOURCE_TEST
98+ @BQ_SOURCE_TEST @GCS_SINK_TEST
9999 Scenario Outline: To verify data is getting transferred successfully from BigQuery to GCS with combinations of contenttype
100100 Given Open Datafusion Project to configure pipeline
101101 When Source is BigQuery
@@ -265,3 +265,81 @@ Feature: GCS sink - Verification of GCS Sink plugin
265265 Then Open and capture logs
266266 Then Verify the pipeline status is "Succeeded"
267267 Then Verify data is transferred to target GCS bucket
268+
269+ @BQ_SOURCE_TEST @GCS_SINK_TEST
270+ Scenario Outline: To verify data is getting transferred successfully from BigQuery to GCS with contenttype selection
271+ Given Open Datafusion Project to configure pipeline
272+ When Select plugin: "BigQuery" from the plugins list as: "Source"
273+ When Expand Plugin group in the LHS plugins list: "Sink"
274+ When Select plugin: "GCS" from the plugins list as: "Sink"
275+ Then Connect source as "BigQuery" and sink as "GCS" to establish connection
276+ Then Open BigQuery source properties
277+ Then Enter the BigQuery source mandatory properties
278+ Then Validate "BigQuery" plugin properties
279+ Then Close the BigQuery properties
280+ Then Open GCS sink properties
281+ Then Enter GCS property projectId and reference name
282+ Then Enter GCS sink property path
283+ Then Select GCS property format "<FileFormat>"
284+ Then Select GCS sink property contentType "<contentType>"
285+ Then Enter GCS File system properties field "gcsCSVFileSysProperty"
286+ Then Validate "GCS" plugin properties
287+ Then Close the GCS properties
288+ Then Save and Deploy Pipeline
289+ Then Run the Pipeline in Runtime
290+ Then Wait till pipeline is in running state
291+ Then Open and capture logs
292+ Then Verify the pipeline status is "Succeeded"
293+ Then Verify data is transferred to target GCS bucket
294+ Examples:
295+ | FileFormat | contentType |
296+ | csv | text/csv |
297+ | tsv | text/plain |
298+
299+ @GCS_AVRO_FILE @GCS_SINK_TEST
300+ Scenario Outline: To verify data is transferred successfully from GCS Source to GCS Sink with write header true at Sink
301+ Given Open Datafusion Project to configure pipeline
302+ When Select plugin: "GCS" from the plugins list as: "Source"
303+ When Expand Plugin group in the LHS plugins list: "Sink"
304+ When Select plugin: "GCS" from the plugins list as: "Sink"
305+ Then Connect plugins: "GCS" and "GCS2" to establish connection
306+ Then Navigate to the properties page of plugin: "GCS"
307+ Then Replace input plugin property: "project" with value: "projectId"
308+ Then Override Service account details if set in environment variables
309+ Then Enter input plugin property: "referenceName" with value: "sourceRef"
310+ Then Enter GCS source property path "gcsAvroAllDataFile"
311+ Then Select GCS property format "avro"
312+ Then Click on the Get Schema button
313+ Then Verify the Output Schema matches the Expected Schema: "gcsAvroAllTypeDataSchema"
314+ Then Validate "GCS" plugin properties
315+ Then Close the Plugin Properties page
316+ Then Navigate to the properties page of plugin: "GCS2"
317+ Then Enter GCS property projectId and reference name
318+ Then Enter GCS sink property path
319+ Then Select GCS property format "<FileFormat>"
320+ Then Click on the Macro button of Property: "writeHeader" and set the value to: "WriteHeader"
321+ Then Validate "GCS" plugin properties
322+ Then Close the GCS properties
323+ Then Save the pipeline
324+ Then Preview and run the pipeline
325+ Then Enter runtime argument value "writeHeader" for key "WriteHeader"
326+ Then Run the preview of pipeline with runtime arguments
327+ Then Wait till pipeline preview is in running state
328+ Then Open and capture pipeline preview logs
329+ Then Verify the preview run status of pipeline in the logs is "succeeded"
330+ Then Close the pipeline logs
331+ Then Close the preview
332+ Then Deploy the pipeline
333+ Then Run the Pipeline in Runtime
334+ Then Enter runtime argument value "writeHeader" for key "WriteHeader"
335+ Then Run the Pipeline in Runtime with runtime arguments
336+ Then Wait till pipeline is in running state
337+ Then Open and capture logs
338+ Then Verify the pipeline status is "Succeeded"
339+ Then Verify data is transferred to target GCS bucket
340+ Then Validate the data from GCS Source to GCS Sink with expected csv file and target data in GCS bucket
341+ Examples:
342+ | FileFormat |
343+ | csv |
344+ | tsv |
345+ | delimited |
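
For reference, a minimal sketch of how a step such as Select GCS sink property contentType "<contentType>" could be bound in a Cucumber-JVM glue class. The class name, locator, and DI wiring below are illustrative assumptions (Selenium WebDriver supplied by a container such as cucumber-picocontainer), not the project's actual e2e framework code:

    // Hypothetical Cucumber-JVM binding for the contentType step used above.
    // Names such as GCSSinkSteps and the data-cy locator are illustrative
    // assumptions, not the CDAP e2e framework's actual step definitions.
    package stepsdesign;

    import io.cucumber.java.en.Then;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.support.ui.Select;

    public class GCSSinkSteps {

        private final WebDriver driver;

        // WebDriver is assumed to be injected by a DI container
        // (e.g. cucumber-picocontainer) shared across step classes.
        public GCSSinkSteps(WebDriver driver) {
            this.driver = driver;
        }

        @Then("Select GCS sink property contentType {string}")
        public void selectGcsSinkContentType(String contentType) {
            // Pick the value from the Content Type dropdown on the GCS sink
            // properties page, e.g. "text/csv" from the Examples table.
            Select dropdown = new Select(
                    driver.findElement(By.xpath("//select[@data-cy='select-contentType']")));
            dropdown.selectByVisibleText(contentType);
        }
    }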