@@ -95,7 +95,7 @@ Feature: GCS sink - Verification of GCS Sink plugin
       | parquet   | application/octet-stream |
       | orc       | application/octet-stream |

-  @GCS_SINK_TEST @BQ_SOURCE_TEST
+  @BQ_SOURCE_TEST @GCS_SINK_TEST
   Scenario Outline: To verify data is getting transferred successfully from BigQuery to GCS with combinations of content type
     Given Open Datafusion Project to configure pipeline
     When Source is BigQuery
@@ -265,3 +265,120 @@ Feature: GCS sink - Verification of GCS Sink plugin
     Then Open and capture logs
     Then Verify the pipeline status is "Succeeded"
     Then Verify data is transferred to target GCS bucket
+
+  @BQ_SOURCE_TEST @GCS_SINK_TEST
+  Scenario Outline: To verify data is getting transferred successfully from BigQuery to GCS with content type selection
+    Given Open Datafusion Project to configure pipeline
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "GCS" from the plugins list as: "Sink"
+    Then Connect source as "BigQuery" and sink as "GCS" to establish connection
+    Then Open BigQuery source properties
+    Then Enter the BigQuery source mandatory properties
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Open GCS sink properties
+    Then Enter GCS property projectId and reference name
+    Then Enter GCS sink property path
+    Then Select GCS property format "<FileFormat>"
+    Then Select GCS sink property contentType "<contentType>"
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Save and Deploy Pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Verify data is transferred to target GCS bucket
+    Examples:
+      | FileFormat | contentType |
+      | csv        | text/csv    |
+      | tsv        | text/plain  |
+
+  @BQ_SOURCE_DATATYPE_TEST @GCS_SINK_TEST
+  Scenario: Validate successful records transfer from BigQuery to GCS with advanced file system properties field
+    Given Open Datafusion Project to configure pipeline
+    When Select plugin: "BigQuery" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "GCS" from the plugins list as: "Sink"
+    Then Open BigQuery source properties
+    Then Enter BigQuery property reference name
+    Then Enter BigQuery property projectId "projectId"
+    Then Enter BigQuery property datasetProjectId "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter BigQuery property dataset "dataset"
+    Then Enter BigQuery source property table name
+    Then Validate output schema with expectedSchema "bqSourceSchemaDatatype"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Open GCS sink properties
+    Then Override Service account details if set in environment variables
+    Then Enter the GCS sink mandatory properties
+    Then Enter GCS File system properties field "gcsCSVFileSysProperty"
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Connect source as "BigQuery" and sink as "GCS" to establish connection
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Click on preview data for GCS sink
+    Then Verify preview output schema matches the outputSchema captured in properties
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Verify data is transferred to target GCS bucket
+    Then Validate the values of records transferred to GCS bucket is equal to the values from source BigQuery table
+
+  @GCS_AVRO_FILE @GCS_SINK_TEST @GCS_Source_Required
+  Scenario Outline: To verify data transferred successfully from GCS Source to GCS Sink with write header true at Sink
+    Given Open Datafusion Project to configure pipeline
+    When Select plugin: "GCS" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "GCS" from the plugins list as: "Sink"
+    Then Connect plugins: "GCS" and "GCS2" to establish connection
+    Then Navigate to the properties page of plugin: "GCS"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Override Service account details if set in environment variables
+    Then Enter input plugin property: "referenceName" with value: "sourceRef"
+    Then Enter GCS source property path "gcsAvroAllDataFile"
+    Then Select GCS property format "avro"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "gcsAvroAllTypeDataSchema"
+    Then Validate "GCS" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "GCS2"
+    Then Enter GCS property projectId and reference name
+    Then Enter GCS sink property path
+    Then Select GCS property format "<FileFormat>"
+    Then Click on the Macro button of Property: "writeHeader" and set the value to: "WriteHeader"
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Enter runtime argument value "writeHeader" for key "WriteHeader"
+    Then Run the preview of pipeline with runtime arguments
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Enter runtime argument value "writeHeader" for key "WriteHeader"
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Verify data is transferred to target GCS bucket
+    Then Validate the data from GCS Source to GCS Sink with expected csv file and target data in GCS bucket
+    Examples:
+      | FileFormat |
+      | csv        |
+      | tsv        |
+      | delimited  |