@@ -285,3 +285,288 @@ Feature: GCS source - Verification of GCS to BQ successful data transfer
     Then Verify the pipeline status is "Succeeded"
     Then Get count of no of records transferred to target BigQuery Table
     Then Validate the values of records transferred from GCS bucket file is equal to the values of target BigQuery table
+
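+  # Each format scenario below follows the same template: wire GCS -> BigQuery, validate the
+  # source output schema against an expected-schema fixture, preview-run the pipeline, then
+  # deploy it and compare record counts and values between the bucket file and the target table.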
+  @GCS_TSV_TEST @BQ_SINK_TEST
+  Scenario: To verify successful data transfer from GCS source to BigQuery sink using tsv file format
+    Given Open Datafusion Project to configure pipeline
+    When Select plugin: "GCS" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect source as "GCS" and sink as "BigQuery" to establish connection
+    Then Open GCS source properties
+    Then Enter GCS property projectId and reference name
+    Then Override Service account details if set in environment variables
+    Then Enter GCS source property path "gcsTsvFile"
+    Then Select GCS property format "tsv"
+    Then Toggle GCS source property skip header to true
+    Then Validate output schema with expectedSchema "gcsTsvFileSchema"
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Open BigQuery sink properties
+    Then Override Service account details if set in environment variables
+    Then Enter the BigQuery sink mandatory properties
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Click on preview data for BigQuery sink
+    Then Verify preview output schema matches the outputSchema captured in properties
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Get count of no of records transferred to target BigQuery Table
+    Then Validate the values of records transferred from GCS bucket file is equal to the values of target BigQuery table
+
+  @GCS_PARQUET_TEST @BQ_SINK_TEST
+  Scenario: To verify successful data transfer from GCS source to BigQuery sink using parquet file format
+    Given Open Datafusion Project to configure pipeline
+    When Select plugin: "GCS" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect source as "GCS" and sink as "BigQuery" to establish connection
+    Then Open GCS source properties
+    Then Enter GCS property projectId and reference name
+    Then Override Service account details if set in environment variables
+    Then Enter GCS source property path "gcsParquetFile"
+    Then Select GCS property format "parquet"
+    Then Validate output schema with expectedSchema "gcsParquetFileSchema"
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Open BigQuery sink properties
+    Then Override Service account details if set in environment variables
+    Then Enter the BigQuery sink mandatory properties
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Click on preview data for BigQuery sink
+    Then Verify preview output schema matches the outputSchema captured in properties
+    Then Close the preview data
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Get count of no of records transferred to target BigQuery Table
+    Then Validate the values of records transferred from GCS bucket file is equal to the values of target BigQuery table
+
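+  # Unlike the scenarios above, the json scenario supplies the output schema as a macro
+  # ("OutSchema") resolved from the "gcsJsonFileSchema" runtime argument, so schema handling is
+  # exercised at runtime rather than validated at design time.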
+  @GCS_JSON_TEST @BQ_SINK_TEST
+  Scenario: To verify successful data transfer from GCS source to BigQuery sink using json file format
+    Given Open Datafusion Project to configure pipeline
+    When Select plugin: "GCS" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect source as "GCS" and sink as "BigQuery" to establish connection
+    Then Open GCS source properties
+    Then Enter GCS property projectId and reference name
+    Then Override Service account details if set in environment variables
+    Then Enter GCS source property path "gcsJsonFile"
+    Then Select GCS property format "json"
+    Then Enter GCS source property output schema "outputSchema" as macro argument "OutSchema"
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Open BigQuery sink properties
+    Then Override Service account details if set in environment variables
+    Then Enter the BigQuery sink mandatory properties
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Enter runtime argument value "gcsJsonFileSchema" for key "OutSchema"
+    Then Run the preview of pipeline with runtime arguments
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Enter runtime argument value "gcsJsonFileSchema" for key "OutSchema"
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Get count of no of records transferred to target BigQuery Table
+    Then Validate the values of records transferred from GCS bucket file is equal to the values of target BigQuery table
+
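+  # "encrypted" maps to the source's "Data File Encrypted" radio property; when set to true the
+  # plugin is expected to read Tink-encrypted data files with companion metadata files (our
+  # reading of the plugin docs; the suffix is configurable via "encryptedMetadataSuffix").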
+  @GCS_CSV_TEST @BQ_SINK_TEST
+  Scenario: To verify successful GCS to BigQuery data transfer with the data file encryption flag enabled
+    Given Open Datafusion Project to configure pipeline
+    When Select plugin: "GCS" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect source as "GCS" and sink as "BigQuery" to establish connection
+    Then Open GCS source properties
+    Then Enter GCS property projectId and reference name
+    Then Override Service account details if set in environment variables
+    Then Enter GCS source property path "gcsCsvFile"
+    Then Select GCS property format "csv"
+    Then Toggle GCS source property skip header to true
+    Then Validate output schema with expectedSchema "gcsCsvFileSchema"
+    Then Validate "GCS" plugin properties
+    Then Select radio button plugin property: "encrypted" with value: "true"
+    Then Close the GCS properties
+    Then Open BigQuery sink properties
+    Then Override Service account details if set in environment variables
+    Then Enter the BigQuery sink mandatory properties
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Get count of no of records transferred to target BigQuery Table
+    Then Validate the values of records transferred from GCS bucket file is equal to the values of target BigQuery table
+
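+  # Macro coverage scenario: every optional source property is macro-enabled via its Macro
+  # button, and the pipeline is resolved twice (preview, then deployed run) with the same set of
+  # runtime arguments. Values such as "gcsCsvFile" and "sampleSize" are assumed to be keys into
+  # the suite's properties fixtures rather than literal values.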
+  @GCS_CSV_TEST @BQ_SINK_TEST @BigQuery_Sink_Required
+  Scenario: To verify successful record transfer from GCS source to BigQuery sink with macro fields enabled at source
+    Given Open Datafusion Project to configure pipeline
+    When Select plugin: "GCS" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Open GCS source properties
+    Then Enter GCS property reference name
+    Then Enter GCS property "projectId" as macro argument "gcsProjectId"
+    Then Enter GCS property "serviceAccountType" as macro argument "serviceAccountType"
+    Then Enter GCS property "serviceAccountFilePath" as macro argument "serviceAccount"
+    Then Enter GCS property "serviceAccountJSON" as macro argument "serviceAccount"
+    Then Enter GCS property "path" as macro argument "gcsSourcePath"
+    Then Enter GCS property "format" as macro argument "gcsFormat"
+    Then Enter GCS source property "skipHeader" as macro argument "gcsSkipHeader"
+    Then Click on the Macro button of Property: "sampleSize" and set the value to: "SampleSize"
+    Then Click on the Macro button of Property: "override" and set the value to: "OverRide"
+    Then Click on the Macro button of Property: "minSplitSize" and set the value to: "MinSplit"
+    Then Click on the Macro button of Property: "maxSplitSize" and set the value to: "MaxSplit"
+    Then Click on the Macro button of Property: "fileRegex" and set the value to: "FileReg"
+    Then Click on the Macro button of Property: "pathField" and set the value to: "PathF"
+    Then Click on the Macro button of Property: "filenameOnly" and set the value to: "FilenameOnly"
+    Then Click on the Macro button of Property: "recursive" and set the value to: "ReadFilesRecursively"
+    Then Click on the Macro button of Property: "ignoreNonExistingFolders" and set the value to: "IgnoreNonExistingFolders"
+    Then Click on the Macro button of Property: "encrypted" and set the value to: "DataFileEncrypted"
+    Then Click on the Macro button of Property: "encryptedMetadataSuffix" and set the value to: "testmeta"
+    Then Click on the Macro button of Property: "fileSystemProperties" and set the value to: "FileSystemPr"
+    Then Click on the Macro button of Property: "fileEncoding" and set the value to: "Encode"
+    Then Enter GCS source property output schema "outputSchema" as macro argument "gcsOutputSchema"
+    Then Validate "GCS" plugin properties
+    Then Close the GCS properties
+    Then Open BigQuery sink properties
+    Then Enter BigQuery property reference name
+    Then Enter BigQuery property "projectId" as macro argument "bqProjectId"
+    Then Enter BigQuery property "datasetProjectId" as macro argument "bqDatasetProjectId"
+    Then Enter BigQuery property "serviceAccountType" as macro argument "serviceAccountType"
+    Then Enter BigQuery property "serviceAccountFilePath" as macro argument "serviceAccount"
+    Then Enter BigQuery property "serviceAccountJSON" as macro argument "serviceAccount"
+    Then Enter BigQuery property "dataset" as macro argument "bqDataset"
+    Then Enter BigQuery property "table" as macro argument "bqTargetTable"
+    Then Enter BigQuery sink property "truncateTable" as macro argument "bqTruncateTable"
+    Then Enter BigQuery sink property "updateTableSchema" as macro argument "bqUpdateTableSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the BigQuery properties
+    Then Connect source as "GCS" and sink as "BigQuery" to establish connection
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Enter runtime argument value "projectId" for key "gcsProjectId"
+    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
+    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
+    Then Enter runtime argument value "gcsCsvFile" for GCS source property path key "gcsSourcePath"
+    Then Enter runtime argument value "gcsSkipHeaderTrue" for key "gcsSkipHeader"
+    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+    Then Enter runtime argument value "sampleSize" for key "SampleSize"
+    Then Enter runtime argument value "gcsOverrideField" for key "OverRide"
+    Then Enter runtime argument value "gcsMinSplitSize" for key "MinSplit"
+    Then Enter runtime argument value "gcsMaxSplitSize" for key "MaxSplit"
+    Then Enter runtime argument value "fileRegex" for key "FileReg"
+    Then Enter runtime argument value "gcsPathField" for key "PathF"
+    Then Enter runtime argument value "filenameOnly" for key "FilenameOnly"
+    Then Enter runtime argument value "recursive" for key "ReadFilesRecursively"
+    Then Enter runtime argument value "ignoreNonExistingFolders" for key "IgnoreNonExistingFolders"
+    Then Enter runtime argument value "encrypted" for key "DataFileEncrypted"
+    Then Enter runtime argument value "encryptedMetadataSuffix" for key "testmeta"
+    Then Enter runtime argument value "gcsCSVFileSysProperty" for key "FileSystemPr"
+    Then Enter runtime argument value "fileEncoding" for key "Encode"
+    Then Enter runtime argument value "gcsPathFieldOutputSchema" for key "gcsOutputSchema"
+    Then Enter runtime argument value "projectId" for key "bqProjectId"
+    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+    Then Enter runtime argument value "dataset" for key "bqDataset"
+    Then Enter runtime argument value for BigQuery sink table name key "bqTargetTable"
+    Then Enter runtime argument value "bqTruncateTableTrue" for key "bqTruncateTable"
+    Then Enter runtime argument value "bqUpdateTableSchemaTrue" for key "bqUpdateTableSchema"
+    Then Run the preview of pipeline with runtime arguments
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Enter runtime argument value "projectId" for key "gcsProjectId"
+    Then Enter runtime argument value "serviceAccountType" for key "serviceAccountType"
+    Then Enter runtime argument value "serviceAccount" for key "serviceAccount"
+    Then Enter runtime argument value "gcsCsvFile" for GCS source property path key "gcsSourcePath"
+    Then Enter runtime argument value "gcsSkipHeaderTrue" for key "gcsSkipHeader"
+    Then Enter runtime argument value "csvFormat" for key "gcsFormat"
+    Then Enter runtime argument value "sampleSize" for key "SampleSize"
+    Then Enter runtime argument value "gcsOverrideField" for key "OverRide"
+    Then Enter runtime argument value "gcsMinSplitSize" for key "MinSplit"
+    Then Enter runtime argument value "gcsMaxSplitSize" for key "MaxSplit"
+    Then Enter runtime argument value "fileRegex" for key "FileReg"
+    Then Enter runtime argument value "gcsPathField" for key "PathF"
+    Then Enter runtime argument value "filenameOnly" for key "FilenameOnly"
+    Then Enter runtime argument value "recursive" for key "ReadFilesRecursively"
+    Then Enter runtime argument value "ignoreNonExistingFolders" for key "IgnoreNonExistingFolders"
+    Then Enter runtime argument value "encrypted" for key "DataFileEncrypted"
+    Then Enter runtime argument value "encryptedMetadataSuffix" for key "testmeta"
+    Then Enter runtime argument value "gcsCSVFileSysProperty" for key "FileSystemPr"
+    Then Enter runtime argument value "fileEncoding" for key "Encode"
+    Then Enter runtime argument value "gcsPathFieldOutputSchema" for key "gcsOutputSchema"
+    Then Enter runtime argument value "projectId" for key "bqProjectId"
+    Then Enter runtime argument value "projectId" for key "bqDatasetProjectId"
+    Then Enter runtime argument value "dataset" for key "bqDataset"
+    Then Enter runtime argument value for BigQuery sink table name key "bqTargetTable"
+    Then Enter runtime argument value "bqTruncateTableTrue" for key "bqTruncateTable"
+    Then Enter runtime argument value "bqUpdateTableSchemaTrue" for key "bqUpdateTableSchema"
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Get count of no of records transferred to target BigQuery Table
+    Then Validate the values of records transferred from GCS bucket file is equal to the values of target BigQuery table
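+
+  # A minimal sketch, assuming the suite's usual pluginParameters.properties layout, of how a few
+  # of the runtime-argument values used above might be defined (hypothetical placeholders, not
+  # the repo's actual fixtures):
+  #   csvFormat=csv
+  #   gcsSkipHeaderTrue=true
+  #   bqTruncateTableTrue=true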