2 changes: 1 addition & 1 deletion .github/workflows/e2e.yml
@@ -40,7 +40,7 @@ jobs:
)
strategy:
matrix:
tests: [bigquery, common, gcs, pubsub, spanner, gcscreate, gcsdelete, gcsmove, bigqueryexecute, gcscopy, datastore, bigtable]
tests: [bigquery, common, gcs, pubsub, spanner, gcscreate, gcsdelete, gcsmove, bigqueryexecute, gcscopy, bigtable, datastore]
fail-fast: false
steps:
# Pinned 1.0.0 version
2 changes: 2 additions & 0 deletions pom.xml
@@ -1247,11 +1247,13 @@
${SERVICE_ACCOUNT_JSON}
</SERVICE_ACCOUNT_JSON>
</environmentVariables>
<testFailureIgnore>false</testFailureIgnore>
</configuration>
<executions>
<execution>
<goals>
<goal>integration-test</goal>
<goal>verify</goal>
</goals>
</execution>
</executions>
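For context on the pom.xml change above: with the Maven Failsafe plugin, the `integration-test` goal runs the tests and records failures, but only the `verify` goal reads those results and fails the build; without `verify` bound, test failures were silently ignored. A minimal sketch of the relevant plugin block (the plugin version and surrounding configuration are assumptions, not taken from this pom):

```xml
<!-- maven-failsafe-plugin: integration-test runs the tests and records
     failures; verify reads the recorded results and fails the build.
     testFailureIgnore=false makes failures count (this is also the default,
     so setting it explicitly is belt-and-suspenders). -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-failsafe-plugin</artifactId>
  <configuration>
    <testFailureIgnore>false</testFailureIgnore>
  </configuration>
  <executions>
    <execution>
      <goals>
        <goal>integration-test</goal>
        <goal>verify</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```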
2 changes: 1 addition & 1 deletion src/e2e-test/features/bigqueryexecute/BQExecute.feature
@@ -6,7 +6,7 @@ Feature: BigQueryExecute - Verify data transfer using BigQuery Execute plugin
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
When Select plugin: "BigQuery Execute" from the plugins list as: "Conditions and Actions"
When Navigate to the properties page of plugin: "BigQuery Execute"
When Navigate to the properties page of plugin: "BigQuery Execute"
Then Replace input plugin property: "projectId" with value: "projectId"
Then Enter textarea plugin property: "sql" with value: "bqExecuteQuery"
Then Click plugin property: "storeResultsInBigQueryTable"
10 changes: 5 additions & 5 deletions src/e2e-test/features/bigtable/BigTableToBigTable.feature
@@ -11,13 +11,13 @@
# WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the
# License for the specific language governing permissions and limitations under
# the License.
@BigTable @BIGTABLE_SOURCE_TEST
@BigTable
Feature: BigTable source - Verification of BigTable to BigTable Successful Data Transfer

@BIGTABLE_SINK_TEST @bigtable_Required
@BIGTABLE_SOURCE_TEST @BIGTABLE_SINK_TEST @bigtable_Required
Scenario: To verify data is getting transferred from BigTable source table to BigTable sink table
Given Open Datafusion Project to configure pipeline
When Select plugin: "Bigtable" from the plugins list as: "Source"
When Select plugin: "Bigtable" from the plugins list as: "Source"
When Expand Plugin group in the LHS plugins list: "Sink"
When Select plugin: "Bigtable" from the plugins list as: "Sink"
Then Connect plugins: "Bigtable" and "Bigtable2" to establish connection
@@ -51,7 +51,7 @@ Feature: BigTable source - Verification of BigTable to BigTable Successful Data
Then Validate OUT record count is equal to IN record count
Then Validate data transferred to target bigtable table with data of source bigtable table

@EXISTING_BIGTABLE_SINK
@BIGTABLE_SOURCE_TEST @EXISTING_BIGTABLE_SINK
Scenario: To verify data is getting transferred from BigTable source table to existing BigTable sink
Given Open Datafusion Project to configure pipeline
When Select plugin: "Bigtable" from the plugins list as: "Source"
@@ -88,7 +88,7 @@ Feature: BigTable source - Verification of BigTable to BigTable Successful Data
Then Validate OUT record count is equal to IN record count
Then Validate data transferred to existing target bigtable table with data of source bigtable table

@BIGTABLE_SINK_TEST
@BIGTABLE_SOURCE_TEST @BIGTABLE_SINK_TEST
Scenario: To verify data is getting transferred from unvalidated BigTable source table to BigTable sink table
Given Open Datafusion Project to configure pipeline
When Select plugin: "Bigtable" from the plugins list as: "Source"
2 changes: 1 addition & 1 deletion src/e2e-test/features/datastore/runtime.feature
@@ -17,7 +17,7 @@ Feature: DataStore - Verification of Datastore to Datastore Successful Data Tran

@DATASTORE_SOURCE_ENTITY @datastore_Required
Scenario: To verify data is getting transferred from Datastore to Datastore successfully using filter and custom index
Given Open Datafusion Project to configure pipeline
Given Open Datafusion Project to configure pipeline
Then Select plugin: "Datastore" from the plugins list as: "Source"
And Navigate to the properties page of plugin: "Datastore"
Then Replace input plugin property: "project" with value: "projectId"
2 changes: 1 addition & 1 deletion src/e2e-test/features/gcscopy/GCSCopy.feature
@@ -17,7 +17,7 @@ Feature:GCSCopy - Verification of successful objects copy from one bucket to ano

@CMEK @GCS_CSV_TEST @GCS_SINK_TEST @GCSCopy_Required @ITN_TEST
Scenario:Validate successful copy object from one bucket to another new bucket along with data validation with default subdirectory and overwrite toggle button as false.
Given Open Datafusion Project to configure pipeline
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
When Select plugin: "GCS Copy" from the plugins list as: "Conditions and Actions"
When Navigate to the properties page of plugin: "GCS Copy"
4 changes: 2 additions & 2 deletions src/e2e-test/features/gcscreate/GCSCreate.feature
@@ -3,7 +3,7 @@ Feature: GCSCreate - Verification of GCS Create plugin

@GCS_CSV_TEST
Scenario: Verify GCSCreate successfully creates objects in the GCS bucket
Given Open Datafusion Project to configure pipeline
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
When Select plugin: "GCS Create" from the plugins list as: "Conditions and Actions"
When Navigate to the properties page of plugin: "GCS Create"
@@ -31,7 +31,7 @@ Feature: GCSCreate - Verification of GCS Create plugin
When Navigate to the properties page of plugin: "GCS Create"
Then Enter the GCS Create property projectId "projectId"
Then Enter the GCS Create property objects to create as path "gcsCsvFile"
Then Select GCS Create property fail if objects exists as "true"
Then Select GCS Create property fail if objects exists as "true"
Then Override Service account details if set in environment variables
Then Validate "GCS Create" plugin properties
Then Close the GCS Create properties
2 changes: 1 addition & 1 deletion src/e2e-test/features/gcsdelete/GCSDelete.feature
@@ -6,7 +6,7 @@ Feature: GCS Delete - Verification of GCS Delete plugin
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
When Select plugin: "GCS Delete" from the plugins list as: "Conditions and Actions"
When Navigate to the properties page of plugin: "GCS Delete"
When Navigate to the properties page of plugin: "GCS Delete"
Then Enter the GCS Delete property projectId "projectId"
Then Enter the GCS Delete property objects to delete as bucketName
Then Override Service account details if set in environment variables
4 changes: 2 additions & 2 deletions src/e2e-test/features/gcsmove/GCSMove.feature
@@ -3,11 +3,11 @@ Feature:GCSMove - Verification of successful objects move from one bucket to ano

@CMEK @GCS_CSV_TEST @GCS_SINK_TEST
Scenario:Validate successful move object from one bucket to another new bucket
Given Open Datafusion Project to configure pipeline
Given Open Datafusion Project to configure pipeline
When Expand Plugin group in the LHS plugins list: "Conditions and Actions"
When Select plugin: "GCS Move" from the plugins list as: "Conditions and Actions"
When Navigate to the properties page of plugin: "GCS Move"
Then Enter GCSMove property projectId "projectId"
Then Enter GCSMove property projectId "projectId"
Then Enter GCSMove property source path "gcsCsvFile"
Then Enter GCSMove property destination path
Then Override Service account details if set in environment variables
2 changes: 1 addition & 1 deletion src/e2e-test/features/pubsub/sink/BQToPubSub.feature
@@ -6,7 +6,7 @@ Feature: PubSub-Sink - Verification of BigQuery to PubSub successful data transf
Given Open Datafusion Project to configure pipeline
When Source is BigQuery
When Sink is PubSub
Then Connect source as "BigQuery" and sink as "GooglePublisher" to establish connection
Then Connect source as "BigQuery" and sink as "GooglePublisher" to establish connection
Then Open BigQuery source properties
Then Override Service account details if set in environment variables
Then Enter the BigQuery source mandatory properties
4 changes: 2 additions & 2 deletions src/e2e-test/features/spanner/source/SpannertoGCS.feature
@@ -2,11 +2,11 @@
Feature: Spanner Source - Verification of Spanner to GCS successful data transfer

@GCS_SINK_TEST @Spanner_Source_Required
Scenario: Verify data is getting transferred from Spanner to GCS successfully
Scenario: Verify data is getting transferred from Spanner to GCS successfully
Given Open Datafusion Project to configure pipeline
When Source is Spanner
When Sink is GCS
Then Connect source as "Spanner" and sink as "GCS" to establish connection
Then Connect source as "Spanner" and sink as "GCS" to establish connection
Then Open Spanner source properties
Then Enter Spanner property reference name
Then Enter Spanner property projectId "projectId"
@@ -31,7 +31,8 @@
monochrome = true,
plugin = {"pretty", "html:target/cucumber-html-report/bigtable",
"json:target/cucumber-reports/cucumber-bigtable.json",
"junit:target/cucumber-reports/cucumber-bigtable.xml"}
"junit:target/cucumber-reports/cucumber-bigtable.xml"},
strict = true // Fail on undefined steps
)
public class TestRunner {
}
@@ -31,7 +31,8 @@
monochrome = true,
plugin = {"pretty", "html:target/cucumber-html-report/datastore",
"json:target/cucumber-reports/cucumber-datastore.json",
"junit:target/cucumber-reports/cucumber-datastore.xml"}
"junit:target/cucumber-reports/cucumber-datastore.xml"},
strict = true // Fail on undefined steps
)
public class TestRunner {
}
@@ -31,7 +31,8 @@
monochrome = true,
plugin = {"pretty", "html:target/cucumber-html-report/gcscopy-action",
"json:target/cucumber-reports/cucumber-gcscopy-action.json",
"junit:target/cucumber-reports/cucumber-gcscopy-action.xml"}
"junit:target/cucumber-reports/cucumber-gcscopy-action.xml"},
strict = true // Fail on undefined steps
)
public class TestRunner {
}
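On the `strict = true` additions above: in Cucumber-JVM through version 6, the `strict` option makes a run fail when any step is undefined or pending, rather than merely reporting it (in Cucumber 7 the option was removed because strict became the only behavior). A hedged sketch of how one of these runners looks with the flag in place; the `features` and `glue` paths shown are assumptions, not taken from this diff:

```java
// Sketch of a Cucumber-JVM (<= v6) JUnit runner with strict mode.
@RunWith(Cucumber.class)
@CucumberOptions(
    features = {"src/e2e-test/features/bigtable"},  // assumed feature path
    glue = {"stepsdesign"},                         // assumed glue package
    monochrome = true,
    plugin = {"pretty", "html:target/cucumber-html-report/bigtable",
              "json:target/cucumber-reports/cucumber-bigtable.json",
              "junit:target/cucumber-reports/cucumber-bigtable.xml"},
    strict = true  // undefined or pending steps now fail the run
)
public class TestRunner {
}
```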