Commit bfd1ecc

Merge pull request #89 from cloudsufi/enable-e2e-tests

Enabling e2e tests on github actions

2 parents 5d97f68 + a67f6a0, commit bfd1ecc

File tree: 11 files changed, +55 −51 lines

.github/workflows/build.yml
Lines changed: 2 additions & 2 deletions

@@ -37,12 +37,12 @@ jobs:
       && (github.event.action != 'labeled' || github.event.label.name == 'build')
       )
     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           ref: ${{ github.event.workflow_run.head_sha }}

       - name: Cache
-        uses: actions/cache@v3
+        uses: actions/cache@v4
         with:
           path: ~/.m2/repository
           key: ${{ runner.os }}-maven-${{ github.workflow }}-${{ hashFiles('**/pom.xml') }}
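The cache step keys on `hashFiles('**/pom.xml')`, so the shared Maven repository cache is reused until any pom changes. A simplified stand-in for that hashing, for illustration only (the real Actions function hashes each matching file with SHA-256 and combines the digests; `PomHasher` and its method are hypothetical names):

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Simplified stand-in for GitHub Actions' hashFiles('**/pom.xml'):
// hash every matching file in a stable (sorted) order so an unchanged
// dependency tree always yields the same cache key.
public class PomHasher {
    public static String hashPoms(Path root) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            List<Path> poms;
            try (Stream<Path> walk = Files.walk(root)) {
                poms = walk
                    .filter(p -> p.getFileName().toString().equals("pom.xml"))
                    .sorted()  // stable order -> stable key
                    .collect(Collectors.toList());
            }
            for (Path pom : poms) {
                md.update(Files.readAllBytes(pom));
            }
            StringBuilder hex = new StringBuilder();
            for (byte b : md.digest()) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString();
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Because `${{ runner.os }}` and `${{ github.workflow }}` are also part of the key, caches are not shared across operating systems or across workflows.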

.github/workflows/e2e.yml
Lines changed: 28 additions & 30 deletions

@@ -15,33 +15,44 @@
 name: Build e2e tests

 on:
-  workflow_run:
-    workflows:
-      - Trigger build
-    types:
-      - completed
+  push:
+    branches: [ develop, release/* ]
+  pull_request:
+    branches: [ develop, release/* ]
+    types: [opened, synchronize, reopened, labeled]
+  workflow_dispatch:

 jobs:
-  build:
+  build-e2e-tests:
     runs-on: k8s-runner-e2e
-
-    if: ${{ github.event.workflow_run.conclusion != 'skipped' }}
+    # We allow builds:
+    #  1) When triggered manually
+    #  2) When it's a merge into a branch
+    #  3) For PRs that are labeled as build and
+    #     - It's a code change
+    #     - A build label was just added
+    # A bit complex, but prevents builds when other labels are manipulated
+    if: >
+      github.event_name == 'workflow_dispatch'
+      || github.event_name == 'push'
+      || (contains(github.event.pull_request.labels.*.name, 'build')
+      && (github.event.action != 'labeled' || github.event.label.name == 'build')
+      )

     steps:
-      - uses: haya14busa/action-workflow_run-status@967ed83efa565c257675ed70cfe5231f062ddd94
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
         with:
           path: plugin
           ref: ${{ github.event.workflow_run.head_sha }}

       - name: Checkout e2e test repo
-        uses: actions/checkout@v3
+        uses: actions/checkout@v4
         with:
           repository: cdapio/cdap-e2e-tests
           path: e2e

       - name: Cache
-        uses: actions/cache@v3
+        uses: actions/cache@v4
         with:
           path: ~/.m2/repository
           key: ${{ runner.os }}-maven-${{ github.workflow }}-${{ hashFiles('**/pom.xml') }}

@@ -50,7 +61,7 @@ jobs:
       - name: Get Secrets from GCP Secret Manager
         id: 'secrets'
-        uses: 'google-github-actions/get-secretmanager-secrets@v0'
+        uses: 'google-github-actions/get-secretmanager-secrets@v2'
         with:
           secrets: |-
             SERVICE_NOW_CLIENT_ID:cdapio-github-builds/SERVICE_NOW_CLIENT_ID

@@ -68,33 +79,20 @@ jobs:
           SERVICE_NOW_USERNAME: ${{ steps.secrets.outputs.SERVICE_NOW_USERNAME }}
           SERVICE_NOW_PASSWORD: ${{ steps.secrets.outputs.SERVICE_NOW_PASSWORD }}

-      - name: Upload report
-        uses: actions/upload-artifact@v3
-        if: always()
-        with:
-          name: Cucumber report
-          path: ./plugin/target/cucumber-reports
-
       - name: Upload debug files
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         if: always()
         with:
           name: Debug files
           path: ./**/target/e2e-debug

       - name: Upload reports to GCS
-        uses: google-github-actions/upload-cloud-storage@v0
+        uses: google-github-actions/upload-cloud-storage@v2
         if: always()
         with:
           path: ./plugin/target/cucumber-reports
           destination: e2e-tests-cucumber-reports/${{ github.event.repository.name }}/${{ github.ref }}

-      - name: github-status-action
-        uses: Sibz/github-status-action@67af1f4042a5a790681aad83c44008ca6cfab83d
+      - name: Cucumber Report URL
         if: always()
-        with:
-          authToken: ${{ secrets.GITHUB_TOKEN }}
-          state: success
-          context: Cucumber report
-          sha: ${{github.event.pull_request.head.sha || github.sha}}
-
+        run: echo "https://storage.googleapis.com/e2e-tests-cucumber-reports/${{ github.event.repository.name }}/${{ github.ref }}/cucumber-reports/advanced-reports/cucumber-html-reports/overview-features.html"
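The multi-branch `if:` expression above is the heart of this change. As an illustration only (a hypothetical Java mirror of the expression; GitHub Actions evaluates the real thing in its own expression engine, and the parameters below stand in for `github.event_name`, `github.event.action`, `github.event.label.name`, and the PR's label names):

```java
import java.util.List;

// Hypothetical mirror of the workflow's `if:` gate, for illustration only.
public class BuildGate {
    static boolean shouldBuild(String eventName, String action,
                               String labelName, List<String> prLabels) {
        // 1) manual trigger, or 2) a merge (push) into a watched branch
        if ("workflow_dispatch".equals(eventName) || "push".equals(eventName)) {
            return true;
        }
        // 3) PRs carrying the 'build' label; if the trigger was a labeling
        // event, only fire when the label just added is 'build' itself --
        // this prevents rebuilds when unrelated labels are manipulated.
        return prLabels.contains("build")
            && (!"labeled".equals(action) || "build".equals(labelName));
    }
}
```

Note that the `push` trigger on `develop` and `release/*` covers the merge case, so merged PRs always run the suite even without the label.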

pom.xml
Lines changed: 1 addition & 1 deletion

@@ -531,7 +531,7 @@
     <dependency>
       <groupId>io.cdap.tests.e2e</groupId>
       <artifactId>cdap-e2e-framework</artifactId>
-      <version>0.3.0-SNAPSHOT</version>
+      <version>0.5.0-SNAPSHOT</version>
       <scope>test</scope>
     </dependency>
   </dependencies>

src/e2e-test/features/servicenowmultisource/DesignTimeValidation.feature
Lines changed: 1 addition & 1 deletion

@@ -36,7 +36,7 @@ Feature: ServiceNow Multi Source - Design time validation scenarios
       | INVALID_TABLE |
     And fill Credentials section for pipeline user
     And Click on the Validate button
-    Then Verify that the Plugin Property: "tableNames" is displaying an in-line error message: "invalid.property.tablename"
+    Then Verify that the Plugin is displaying an error message: "invalid.property.tablename" on the header

   @TS-SN-MULTI-DSGN-ERROR-03
   Scenario: Verify validation message for Start date and End date in invalid format

src/e2e-test/features/servicenowmultisource/RunTimeWithMacros.feature
Lines changed: 1 addition & 1 deletion

@@ -106,7 +106,7 @@ Feature: ServiceNow Multi Source - Run time scenarios (macro)
     And Verify the pipeline status is "Failed"
     Then Open Pipeline logs and verify Log entries having below listed Level and Message:
       | Level | Message |
-      | ERROR | invalid.tablenames.logsmessage |
+      | ERROR | invalid.tablename.logsmessage |

   @TS-SN-RNTM-MACRO-04 @BQ_SINK
   Scenario: Verify pipeline failure message in logs when user provides invalid Advanced Properties with Macros

src/e2e-test/features/servicenowsource/RunTime.feature
Lines changed: 7 additions & 6 deletions

@@ -19,28 +19,29 @@
 Feature: ServiceNow Source - Run time scenarios

   @TS-SN-RNTM-1 @SN_SOURCE_CONFIG @SN_RECEIVING_SLIP_LINE @BQ_SINK
-  Scenario: Verify user should be able to preview the pipeline where ServiceNow source is configured for Table mode
+  Scenario: Verify user should be able to preview the pipeline where ServiceNow source is configured for Table mode with value type display
     When Open Datafusion Project to configure pipeline
     And Select plugin: "ServiceNow" from the plugins list as: "Source"
     And Navigate to the properties page of plugin: "ServiceNow"
     And configure ServiceNow source plugin for table: "RECEIVING_SLIP_LINE" in the Table mode
     And fill Credentials section for pipeline user
+    And Select dropdown plugin property: "valueType" with option value: "Display"
     And Enter input plugin property: "startDate" with value: "start.date"
     And Enter input plugin property: "endDate" with value: "end.date"
     Then Validate "ServiceNow" plugin properties
     And Capture the generated Output Schema
     And Close the Plugin Properties page
     And Select Sink plugin: "BigQueryTable" from the plugins list
-    And Connect source as "ServiceNow" and sink as "BigQuery" to establish connection
+    And Connect source as "ServiceNow" and sink as "BigQueryTable" to establish connection
     And Navigate to the properties page of plugin: "BigQuery"
     And Replace input plugin property: "project" with value: "projectId"
-    And Enter input plugin property: "datasetProject" with value: "datasetprojectId"
+    Then Enter input plugin property: "datasetProject" with value: "datasetprojectId"
     And Configure BigQuery sink plugin for Dataset and Table
     Then Validate "BigQuery" plugin properties
     And Close the Plugin Properties page
     And Preview and run the pipeline
     Then Verify the preview of pipeline is "success"
-    And Click on the Preview Data link on the Sink plugin node: "BigQueryTable"
+    And Click on the Preview Data link on the Sink plugin node: "BigQuery"
     Then Verify sink plugin's Preview Data for Input Records table and the Input Schema matches the Output Schema of Source plugin

   @TS-SN-RNTM-2 @SN_SOURCE_CONFIG @SN_RECEIVING_SLIP_LINE @BQ_SINK

@@ -55,7 +56,7 @@ Feature: ServiceNow Source - Run time scenarios
     Then Validate "ServiceNow" plugin properties
     And Close the Plugin Properties page
     And Select Sink plugin: "BigQueryTable" from the plugins list
-    And Connect source as "ServiceNow" and sink as "BigQuery" to establish connection
+    And Connect source as "ServiceNow" and sink as "BigQueryTable" to establish connection
     And Navigate to the properties page of plugin: "BigQuery"
     And Replace input plugin property: "project" with value: "projectId"
     And Enter input plugin property: "datasetProject" with value: "datasetprojectId"

@@ -88,7 +89,7 @@ Feature: ServiceNow Source - Run time scenarios
     Then Validate "ServiceNow" plugin properties
     And Close the Plugin Properties page
     And Select Sink plugin: "BigQueryTable" from the plugins list
-    And Connect source as "ServiceNow" and sink as "BigQuery" to establish connection
+    And Connect source as "ServiceNow" and sink as "BigQueryTable" to establish connection
     And Navigate to the properties page of plugin: "BigQuery"
     And Replace input plugin property: "project" with value: "projectId"
     And Enter input plugin property: "datasetProject" with value: "datasetprojectId"

src/e2e-test/java/io/cdap/plugin/bigquery/stepsdesign/BigQueryCommonSteps.java
Lines changed: 5 additions & 4 deletions

@@ -36,20 +36,21 @@ public class BigQueryCommonSteps {
   public void configureBqSinkPlugin() {
     String referenceName = "Test" + RandomStringUtils.randomAlphanumeric(10);
     CdfBigQueryPropertiesActions.enterBigQueryReferenceName(referenceName);
-    CdfBigQueryPropertiesActions.enterBigQueryDataset(TestSetupHooks.bqTargetDataset);
+    CdfBigQueryPropertiesActions.enterBigQueryDataset(PluginPropertyUtils.pluginProp("dataset"));
     CdfBigQueryPropertiesActions.enterBigQueryTable(TestSetupHooks.bqTargetTable);
   }

   @When("Configure BigQuery Multi Table sink plugin for Dataset")
   public void configureBqMultiTableSinkPlugin() {
     String referenceName = "Test" + RandomStringUtils.randomAlphanumeric(10);
     CdfBigQueryPropertiesActions.enterBigQueryReferenceName(referenceName);
-    CdfBigQueryPropertiesActions.enterBigQueryDataset(PluginPropertyUtils.pluginProp("bq.target.dataset2"));
+    CdfBigQueryPropertiesActions.enterBigQueryDataset(PluginPropertyUtils.pluginProp("dataset"));
   }

   @Then("Verify count of no of records transferred to the target BigQuery Table")
   public void getCountOfNoOfRecordsTransferredToTargetBigQueryTable() throws IOException, InterruptedException {
-    int countRecords = BigQueryClient.countBqQuery(TestSetupHooks.bqTargetDataset, TestSetupHooks.bqTargetTable);
+    int countRecords = BigQueryClient.countBqQuery(PluginPropertyUtils.pluginProp("dataset"),
+      TestSetupHooks.bqTargetTable);
     Assert.assertEquals("Number of records transferred to BigQuery should be equal to " +
       "records out count displayed on the Source plugin: ",
       countRecords, CdfPipelineRunAction.getCountDisplayedOnSourcePluginAsRecordsOut());

@@ -67,7 +68,7 @@ public void configureBqSourcePlugin() throws IOException, InterruptedException {
     CdfBigQueryPropertiesActions.enterDatasetProjectId(datasetProjectId);
     CdfBigQueryPropertiesActions.enterProjectId(projectId);
     CdfBigQueryPropertiesActions.enterBigQueryReferenceName(referenceName);
-    CdfBigQueryPropertiesActions.enterBigQueryDataset(TestSetupHooks.bqSourceDataset);
+    CdfBigQueryPropertiesActions.enterBigQueryDataset(PluginPropertyUtils.pluginProp("dataset"));
     CdfBigQueryPropertiesActions.enterBigQueryTable(TestSetupHooks.bqSourceTable);
   }
 }
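These step changes converge on `PluginPropertyUtils` from the cdap-e2e-framework: steps now read the dataset from a shared property instead of per-hook static fields. A minimal sketch of that pattern, an illustrative stand-in rather than the framework's actual implementation (class and field names here are hypothetical): load `pluginParameters.properties` from the test classpath and let hooks override entries at runtime, the way `addPluginProp` is used in `TestSetupHooks` below.

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.Map;
import java.util.Properties;
import java.util.concurrent.ConcurrentHashMap;

// Illustrative stand-in for the framework's PluginPropertyUtils: static
// defaults come from pluginParameters.properties on the classpath, and
// setup hooks can override individual entries at runtime.
public final class PropLookup {
    private static final Properties DEFAULTS = new Properties();
    private static final Map<String, String> OVERRIDES = new ConcurrentHashMap<>();

    static {
        try (InputStream in = PropLookup.class
                .getResourceAsStream("/pluginParameters.properties")) {
            if (in != null) {
                DEFAULTS.load(in);  // tolerate a missing file in this sketch
            }
        } catch (IOException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    /** Runtime override, analogous to PluginPropertyUtils.addPluginProp. */
    public static void addPluginProp(String key, String value) {
        OVERRIDES.put(key, value);
    }

    /** Lookup, analogous to PluginPropertyUtils.pluginProp: overrides win. */
    public static String pluginProp(String key) {
        String v = OVERRIDES.get(key);
        return v != null ? v : DEFAULTS.getProperty(key);
    }
}
```

This is why pluginParameters.properties can ship a placeholder like `bqTargetTable=dummy`: a `@Before` hook replaces it with the generated table name, and every later `pluginProp("bqTargetTable")` lookup sees the runtime value.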

src/e2e-test/java/io/cdap/plugin/tests/hooks/TestSetupHooks.java
Lines changed: 2 additions & 0 deletions

@@ -81,6 +81,7 @@ public static void createRecordInReceivingSlipLineTable()
     String recordDetails = "{'number':'" + uniqueId + "'}";
     StringEntity entity = new StringEntity(recordDetails);
     systemId = tableAPIClient.createRecord(TablesInTableMode.RECEIVING_SLIP_LINE.value, entity);
+    BeforeActions.scenario.write("New Record in Receiving Slip Line table: " + systemId + " created successfully");
   }

   @Before(order = 2, value = "@SN_UPDATE_AGENT_ASSIST_RECOMMENDATION")

@@ -234,6 +235,7 @@ public static void updateTempSourceBQTableForServiceOffering() throws IOExceptio
   public static void setTempTargetBQTable() {
     bqTargetTable = "TestSN_table" + RandomStringUtils.randomAlphanumeric(10);
     BeforeActions.scenario.write("BigQuery Target table name: " + bqTargetTable);
+    PluginPropertyUtils.addPluginProp("bqTargetTable", bqTargetTable);
   }

   @Before(order = 1, value = "@CONNECTION")

src/e2e-test/resources/errorMessage.properties
Lines changed: 4 additions & 5 deletions

@@ -2,14 +2,13 @@
 validationSuccessMessage=No errors found.

 #Invalid value
-invalid.property.tablename=Bad Request. Table:
+invalid.property.tablename=ServiceNow API returned an unexpected result or the specified table may not exist. Cause: Http call to ServiceNow instance returned status code 400. Ensure specified table exists in the datasource.
 invalid.property.startdate=Invalid format for Start date. Correct Format: yyyy-MM-dd
 invalid.property.enddate=Invalid format for End date. Correct Format: yyyy-MM-dd
 invalid.property.credentials=Unable to connect to ServiceNow Instance. Ensure properties like Client ID, Client Secret, API Endpoint, User Name, Password are correct.

 #Logs error message
-invalid.tablename.logsmessage=Spark program 'phase-1' failed with error: Errors were encountered during validation. Bad Request. Table:
-invalid.credentials.logsmessage=Spark program 'phase-1' failed with error: Errors were encountered during validation. Unable to connect to ServiceNow Instance.. Please check the system logs for more details.
-invalid.filters.logsmessage=Spark program 'phase-1' failed with error: Errors were encountered during validation. Invalid format for Start date. Correct Format: yyyy-MM-dd. Please check the system logs for more details.
-invalid.tablenames.logsmessage=Spark program 'phase-1' failed with error: Errors were encountered during validation. Bad Request. Table: blahblah is invalid.. Please check the system logs for more details.
+invalid.tablename.logsmessage=ServiceNow API returned an unexpected result or the specified table may not exist.
+invalid.credentials.logsmessage=Errors were encountered during validation. Unable to connect to ServiceNow Instance.. Please check the system logs for more details.
+invalid.filters.logsmessage=Errors were encountered during validation. Invalid format for Start date. Correct Format: yyyy-MM-dd. Please check the system logs for more details.
 invalid.testconnection.logmessage=Unable to connect to ServiceNow Instance. Ensure properties like Client ID, Client Secret, API Endpoint, User Name, Password are correct.

src/e2e-test/resources/pluginParameters.properties
Lines changed: 2 additions & 0 deletions

@@ -40,6 +40,8 @@ projectId=cdf-athena
 datasetprojectId=cdf-athena
 bq.target.dataset=SN_test_automation
 bq.target.dataset2=SN_Test_atm
+dataset=testbq_bqmt
+bqTargetTable=dummy

 ##ServiceNowSink
 INSERT=insert
