
Commit c075136 ("debug")
1 parent d991ce7

File tree: 10 files changed (+17 / -17 lines)


.github/workflows/e2e.yml

Lines changed: 1 addition & 1 deletion
@@ -40,7 +40,7 @@ jobs:
       )
     strategy:
       matrix:
-        tests: [bigquery, common, gcs, pubsub, spanner, gcscreate, gcsdelete, gcsmove, bigqueryexecute, gcscopy, datastore, bigtable]
+        tests: [bigquery, common, gcs, pubsub, spanner, gcscreate, gcsdelete, gcsmove, bigqueryexecute, gcscopy, bigtable, datastore]
       fail-fast: false
     steps:
       # Pinned 1.0.0 version
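Note that reordering matrix entries does not change what runs: a GitHub Actions `matrix` expands into one job per listed value, independent of order, so swapping `datastore` and `bigtable` only changes the display order of the jobs. A minimal sketch of the fan-out, with an illustrative job name and step (not taken from this workflow):

```yaml
jobs:
  e2e:
    runs-on: ubuntu-latest        # illustrative runner choice
    strategy:
      fail-fast: false            # let remaining matrix jobs finish if one fails
      matrix:
        tests: [bigtable, datastore]   # each value becomes its own job
    steps:
      # ${{ matrix.tests }} resolves to the current entry, e.g. "bigtable"
      - run: echo "Running e2e suite ${{ matrix.tests }}"
```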

src/e2e-test/features/bigqueryexecute/BQExecute.feature

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ Feature: BigQueryExecute - Verify data transfer using BigQuery Execute plugin

   @BQ_SOURCE_TEST @BQ_SINK_TEST @BQ_EXECUTE_SQL @BQExecute_Required
   Scenario: Verify Store results in a BigQuery Table functionality of BQExecute plugin
-    Given Open Datafusion Project to configure pipeline
+    Given Open Datafusion Project to configure pipelines
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
     When Select plugin: "BigQuery Execute" from the plugins list as: "Conditions and Actions"
     When Navigate to the properties page of plugin: "BigQuery Execute"

src/e2e-test/features/bigtable/BigTableToBigTable.feature

Lines changed: 5 additions & 5 deletions
@@ -11,13 +11,13 @@
 # WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
 # License for the specific language governing permissions and limitations under
 # the License.
-@BigTable @BIGTABLE_SOURCE_TEST
+@BigTable
 Feature: BigTable source - Verification of BigTable to BigTable Successful Data Transfer

-  @BIGTABLE_SINK_TEST @bigtable_Required
+  @BIGTABLE_SOURCE_TEST @BIGTABLE_SINK_TEST @bigtable_Required
   Scenario: To verify data is getting transferred from BigTable source table to BigTable sink table
     Given Open Datafusion Project to configure pipeline
-    When Select plugin: "Bigtable" from the plugins list as: "Source"
+    When Select plugin: "Bigtable" from the plugins list as: "Source"S
     When Expand Plugin group in the LHS plugins list: "Sink"
     When Select plugin: "Bigtable" from the plugins list as: "Sink"
     Then Connect plugins: "Bigtable" and "Bigtable2" to establish connection
@@ -51,7 +51,7 @@ Feature: BigTable source - Verification of BigTable to BigTable Successful Data
     Then Validate OUT record count is equal to IN record count
     Then Validate data transferred to target bigtable table with data of source bigtable table

-  @EXISTING_BIGTABLE_SINK
+  @BIGTABLE_SOURCE_TEST @EXISTING_BIGTABLE_SINK
   Scenario: To verify data is getting transferred from BigTable source table to existing BigTable sink
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "Bigtable" from the plugins list as: "Source"
@@ -88,7 +88,7 @@ Feature: BigTable source - Verification of BigTable to BigTable Successful Data
     Then Validate OUT record count is equal to IN record count
     Then Validate data transferred to existing target bigtable table with data of source bigtable table

-  @BIGTABLE_SINK_TEST
+  @BIGTABLE_SOURCE_TEST @BIGTABLE_SINK_TEST
   Scenario: To verify data is getting transferred from unvalidated BigTable source table to BigTable sink table
     Given Open Datafusion Project to configure pipeline
     When Select plugin: "Bigtable" from the plugins list as: "Source"
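The tag moves in this file change hook scoping, not just labeling: in Cucumber, a tag on the `Feature:` line is inherited by every scenario in the file, while a scenario-level tag applies only to that scenario, and tagged `@Before`/`@After` hooks (e.g. a setup hook matching `@BIGTABLE_SOURCE_TEST` that provisions the source table) fire per matching scenario. A minimal Gherkin sketch of the distinction, with illustrative scenario names and steps:

```gherkin
@BigTable
Feature: Tag scoping example

  # Any hook keyed to @BIGTABLE_SOURCE_TEST runs for this scenario,
  # in addition to hooks inherited from the feature-level @BigTable tag.
  @BIGTABLE_SOURCE_TEST
  Scenario: Needs a source table
    Given a source table exists

  # Only @BigTable-keyed hooks apply here; no source table is provisioned.
  Scenario: Does not need a source table
    Given no table setup is required
```

So moving `@BIGTABLE_SOURCE_TEST` from the feature line onto individual scenarios restricts the source-table setup to the scenarios that actually need it.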

src/e2e-test/features/datastore/runtime.feature

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ Feature: DataStore - Verification of Datastore to Datastore Successful Data Tran

   @DATASTORE_SOURCE_ENTITY @datastore_Required
   Scenario: To verify data is getting transferred from Datastore to Datastore successfully using filter and custom index
-    Given Open Datafusion Project to configure pipeline
+    Given Open Datafusion Project to configure pipelines
     Then Select plugin: "Datastore" from the plugins list as: "Source"
     And Navigate to the properties page of plugin: "Datastore"
     Then Replace input plugin property: "project" with value: "projectId"

src/e2e-test/features/gcscopy/GCSCopy.feature

Lines changed: 1 addition & 1 deletion
@@ -17,7 +17,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano

   @CMEK @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
   Scenario:Validate successful copy object from one bucket to another new bucket along with data validation with default subdirectory and overwrite toggle button as false.
-    Given Open Datafusion Project to configure pipeline
+    Given Open Datafusion Project to configurex pipeline
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
     When Select plugin: "GCS Copy" from the plugins list as: "Conditions and Actions"
     When Navigate to the properties page of plugin: "GCS Copy"

src/e2e-test/features/gcscreate/GCSCreate.feature

Lines changed: 2 additions & 2 deletions
@@ -3,7 +3,7 @@ Feature: GCSCreate - Verification of GCS Create plugin

   @GCS_CSV_TEST
   Scenario: Verify GCSCreate successfully creates objects in the GCS bucket
-    Given Open Datafusion Project to configure pipeline
+    Given Open Datafusion Project to configure pipelines
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
     When Select plugin: "GCS Create" from the plugins list as: "Conditions and Actions"
     When Navigate to the properties page of plugin: "GCS Create"
@@ -31,7 +31,7 @@ Feature: GCSCreate - Verification of GCS Create plugin
     When Navigate to the properties page of plugin: "GCS Create"
     Then Enter the GCS Create property projectId "projectId"
     Then Enter the GCS Create property objects to create as path "gcsCsvFile"
-    Then Select GCS Create property fail if objects exists as "true"
+    Then Select GCS Create property fail if objects exists as s"true"
     Then Override Service account details if set in environment variables
     Then Validate "GCS Create" plugin properties
     Then Close the GCS Create properties

src/e2e-test/features/gcsdelete/GCSDelete.feature

Lines changed: 1 addition & 1 deletion
@@ -12,7 +12,7 @@ Feature: GCS Delete - Verification of GCS Delete plugin
     Then Override Service account details if set in environment variables
     Then Validate "GCS Delete" plugin properties
     Then Close the GCS Delete properties
-    Then Save and Deploy Pipeline
+    Then Save and Deploy Pipelines
     Then Run the Pipeline in Runtime
     Then Wait till pipeline is in running state
     Then Open and capture logs

src/e2e-test/features/gcsmove/GCSMove.feature

Lines changed: 2 additions & 2 deletions
@@ -3,11 +3,11 @@ Feature:GCSMove - Verification of successful objects move from one bucket to ano

   @CMEK @GCS_CSV_TEST @GCS_SINK_TEST
   Scenario:Validate successful move object from one bucket to another new bucket
-    Given Open Datafusion Project to configure pipeline
+    Given Open Datafusion Project to configure pipelinex
     When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
     When Select plugin: "GCS Move" from the plugins list as: "Conditions and Actions"
     When Navigate to the properties page of plugin: "GCS Move"
-    Then Enter GCSMove property projectId "projectId"
+    Then Enter GCSMove property projectId "projectId"x
     Then Enter GCSMove property source path "gcsCsvFile"
     Then Enter GCSMove property destination path
     Then Override Service account details if set in environment variables

src/e2e-test/features/pubsub/sink/BQToPubSub.feature

Lines changed: 1 addition & 1 deletion
@@ -3,7 +3,7 @@ Feature: PubSub-Sink - Verification of BigQuery to PubSub successful data transf

   @CMEK @BQ_SOURCE_TEST
   Scenario: To verify data is getting transferred from BigQuery to PubSub successfully
-    Given Open Datafusion Project to configure pipeline
+    Given Open Datafusion Project to configure pipelines
     When Source is BigQuery
     When Sink is PubSub
     Then Connect source as "BigQuery" and sink as "GooglePublisher" to establish connection

src/e2e-test/features/spanner/source/SpannertoGCS.feature

Lines changed: 2 additions & 2 deletions
@@ -2,8 +2,8 @@
 Feature: Spanner Source - Verification of Spanner to GCS successful data transfer

   @GCS_SINK_TEST @Spanner_Source_Required
-  Scenario: Verify data is getting transferred from Spanner to GCS successfully
-    Given Open Datafusion Project to configure pipeline
+  Scenario: Verify data is getting transferred from Spanner to GCS successfullys
+    Given Open Datafusion Project to configure pipelines
     When Source is Spanner
     When Sink is GCS
     Then Connect source as "Spanner" and sink as "GCS" to establish connection

0 commit comments
