Commit 83d1c42
committed: e2e tests Oracle source
1 parent 9f13a64 · commit 83d1c42

File tree: 6 files changed, +219 −6 lines changed

oracle-plugin/src/e2e-test/features/source/OracleDesignTimeValidation.feature

Lines changed: 15 additions & 0 deletions

@@ -199,3 +199,18 @@ Feature: Oracle source- Verify Oracle source plugin design time validation scena
     Then Enter textarea plugin property: "importQuery" with value: "invalidImportQuery"
     Then Click on the Validate button
     Then Verify that the Plugin Property: "user" is displaying an in-line error message: "errorMessageBlankUsername"
+
+  @Oracle_Required
+  Scenario: Verify the validation error message on header with blank database value
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "Oracle" from the plugins list as: "Source"
+    Then Navigate to the properties page of plugin: "Oracle"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
+    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Click plugin property: "switch-useConnection"
+    Then Click on the Validate button
+    Then Verify that the Plugin is displaying an error message: "blank.database.message" on the header

oracle-plugin/src/e2e-test/features/source/OracleRunTime.feature

Lines changed: 107 additions & 0 deletions

@@ -438,3 +438,110 @@ Feature: Oracle - Verify data transfer from Oracle source to BigQuery sink
     Then Verify the pipeline status is "Succeeded"
     Then Close the pipeline logs
     Then Validate the values of records transferred to target Big Query table is equal to the values from source table
+
+  @ORACLE_SOURCE_TEST @BQ_SINK_TEST @Oracle_Required
+  Scenario: To verify data is getting transferred from Oracle source to BigQuery sink successfully with bounding query
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "Oracle" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect plugins: "Oracle" and "BigQuery" to establish connection
+    Then Navigate to the properties page of plugin: "Oracle"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
+    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Select radio button plugin property: "connectionType" with value: "service"
+    Then Select radio button plugin property: "role" with value: "normal"
+    Then Enter input plugin property: "referenceName" with value: "sourceRef"
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
+    Then Enter textarea plugin property: "boundingQuery" with value: "boundingQuery"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "outputSchema"
+    Then Validate "Oracle" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqTargetTable"
+    Then Click plugin property: "truncateTable"
+    Then Click plugin property: "updateTableSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Close the pipeline logs
+    Then Validate the values of records transferred to target Big Query table is equal to the values from source table
+
+  @ORACLE_SOURCE_TEST @BQ_SINK_TEST @CONNECTION @Oracle_Required
+  Scenario: To verify data is getting transferred from Oracle source to BigQuery sink successfully with use connection
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "Oracle" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "BigQuery" from the plugins list as: "Sink"
+    Then Connect plugins: "Oracle" and "BigQuery" to establish connection
+    Then Navigate to the properties page of plugin: "Oracle"
+    And Click plugin property: "switch-useConnection"
+    And Click on the Browse Connections button
+    And Click on the Add Connection button
+    Then Click plugin property: "connector-Oracle"
+    And Enter input plugin property: "name" with value: "connection.name"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
+    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Select radio button plugin property: "connectionType" with value: "service"
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Select radio button plugin property: "role" with value: "normal"
+    Then Click on the Test Connection button
+    And Verify the test connection is successful
+    Then Click on the Create button
+    Then Select connection: "connection.name"
+    Then Enter input plugin property: "referenceName" with value: "sourceRef"
+    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
+    Then Click on the Get Schema button
+    Then Verify the Output Schema matches the Expected Schema: "outputSchema"
+    Then Validate "Oracle" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "BigQuery"
+    Then Replace input plugin property: "project" with value: "projectId"
+    Then Enter input plugin property: "datasetProject" with value: "projectId"
+    Then Enter input plugin property: "referenceName" with value: "BQReferenceName"
+    Then Enter input plugin property: "dataset" with value: "dataset"
+    Then Enter input plugin property: "table" with value: "bqTargetTable"
+    Then Click plugin property: "truncateTable"
+    Then Click plugin property: "updateTableSchema"
+    Then Validate "BigQuery" plugin properties
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Close the pipeline logs
+    Then Validate the values of records transferred to target Big Query table is equal to the values from source table
+

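The bounding-query scenario above exercises the usual split mechanism of JDBC batch sources: the bounding query returns MIN and MAX of the split-by column, that range is divided into numSplits intervals, and each reader's import query gets `$CONDITIONS` replaced by its interval predicate. A rough sketch of that substitution (hypothetical helper, not the plugin's actual split logic):

```java
import java.util.ArrayList;
import java.util.List;

public class SplitSketch {
  // Given MIN/MAX from the bounding query, produce the per-split predicates
  // that replace $CONDITIONS in the import query.
  static List<String> conditions(String column, long min, long max, int numSplits) {
    List<String> out = new ArrayList<>();
    long range = max - min + 1;
    long size = (range + numSplits - 1) / numSplits; // ceiling division
    for (long lo = min; lo <= max; lo += size) {
      long hi = Math.min(lo + size - 1, max);
      out.add(column + " >= " + lo + " AND " + column + " <= " + hi);
    }
    return out;
  }

  public static void main(String[] args) {
    // Shape of the queries used in the scenario (table name is a sample value).
    String importQuery = "select * from HR.SOURCETABLE WHERE $CONDITIONS";
    for (String cond : conditions("ID", 1, 10, 2)) {
      System.out.println(importQuery.replace("$CONDITIONS", cond));
    }
  }
}
```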
oracle-plugin/src/e2e-test/features/source/OracleRunTimeMacro.feature

Lines changed: 61 additions & 0 deletions

@@ -305,3 +305,64 @@ Feature: Oracle - Verify Oracle plugin data transfer with macro arguments
     Then Verify the pipeline status is "Succeeded"
     Then Close the pipeline logs
     Then Validate the values of records transferred to target Big Query table is equal to the values from source table
+
+  @ORACLE_SOURCE_TEST @ORACLE_TARGET_TEST @Oracle_Required
+  Scenario: To verify data is getting transferred from Oracle to Oracle successfully when connection arguments,Isolation level,bounding query are macro enabled
+    Given Open Datafusion Project to configure pipeline
+    When Expand Plugin group in the LHS plugins list: "Source"
+    When Select plugin: "Oracle" from the plugins list as: "Source"
+    When Expand Plugin group in the LHS plugins list: "Sink"
+    When Select plugin: "Oracle" from the plugins list as: "Sink"
+    Then Connect plugins: "Oracle" and "Oracle2" to establish connection
+    Then Navigate to the properties page of plugin: "Oracle"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
+    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Select radio button plugin property: "connectionType" with value: "service"
+    Then Select radio button plugin property: "role" with value: "normal"
+    Then Enter textarea plugin property: "importQuery" with value: "selectQuery"
+    Then Enter input plugin property: "referenceName" with value: "sourceRef"
+    Then Click on the Macro button of Property: "connectionArguments" and set the value to: "connArgumentsSource"
+    Then Click on the Macro button of Property: "transactionIsolationLevel" and set the value to: "defaultTransactionIsolationLevel"
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Click on the Macro button of Property: "boundingQuery" and set the value in textarea: "oracleBoundingQuery"
+    Then Validate "Oracle" plugin properties
+    Then Close the Plugin Properties page
+    Then Navigate to the properties page of plugin: "Oracle2"
+    Then Select dropdown plugin property: "select-jdbcPluginName" with option value: "driverName"
+    Then Replace input plugin property: "host" with value: "host" for Credentials and Authorization related fields
+    Then Replace input plugin property: "port" with value: "port" for Credentials and Authorization related fields
+    Then Replace input plugin property: "database" with value: "databaseName"
+    Then Replace input plugin property: "tableName" with value: "targetTable"
+    Then Replace input plugin property: "dbSchemaName" with value: "schema"
+    Then Replace input plugin property: "user" with value: "username" for Credentials and Authorization related fields
+    Then Replace input plugin property: "password" with value: "password" for Credentials and Authorization related fields
+    Then Enter input plugin property: "referenceName" with value: "targetRef"
+    Then Select radio button plugin property: "connectionType" with value: "service"
+    Then Select radio button plugin property: "role" with value: "normal"
+    Then Validate "Oracle2" plugin properties
+    Then Close the Plugin Properties page
+    Then Save the pipeline
+    Then Preview and run the pipeline
+    Then Enter runtime argument value "connectionArguments" for key "connArgumentsSource"
+    Then Enter runtime argument value "boundingQuery" for key "oracleBoundingQuery"
+    Then Enter runtime argument value "transactionIsolationLevel" for key "defaultTransactionIsolationLevel"
+    Then Run the preview of pipeline with runtime arguments
+    Then Wait till pipeline preview is in running state
+    Then Open and capture pipeline preview logs
+    Then Verify the preview run status of pipeline in the logs is "succeeded"
+    Then Close the pipeline logs
+    Then Close the preview
+    Then Deploy the pipeline
+    Then Run the Pipeline in Runtime
+    Then Enter runtime argument value "connectionArguments" for key "connArgumentsSource"
+    Then Enter runtime argument value "boundingQuery" for key "oracleBoundingQuery"
+    Then Enter runtime argument value "transactionIsolationLevel" for key "defaultTransactionIsolationLevel"
+    Then Run the Pipeline in Runtime with runtime arguments
+    Then Wait till pipeline is in running state
+    Then Open and capture logs
+    Then Verify the pipeline status is "Succeeded"
+    Then Close the pipeline logs
+    Then Validate the values of records transferred to target table is equal to the values from source table

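In the macro scenario above, each macro-enabled property holds a `${key}` placeholder, and the "Enter runtime argument value ... for key ..." steps supply concrete values at preview and deploy time. A toy resolver illustrating the substitution (not CDAP's actual macro evaluator, which also supports nesting and secure macros):

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class MacroSketch {
  // Replace ${key} tokens with values from the runtime arguments map;
  // unknown keys are left as-is.
  static String resolve(String text, Map<String, String> runtimeArgs) {
    Matcher m = Pattern.compile("\\$\\{([^}]+)}").matcher(text);
    StringBuffer sb = new StringBuffer();
    while (m.find()) {
      String value = runtimeArgs.getOrDefault(m.group(1), m.group(0));
      m.appendReplacement(sb, Matcher.quoteReplacement(value));
    }
    m.appendTail(sb);
    return sb.toString();
  }

  public static void main(String[] args) {
    // Sample runtime argument, mirroring the boundingQuery macro in the scenario.
    Map<String, String> runtimeArgs =
        Map.of("oracleBoundingQuery", "select MIN(ID),MAX(ID) from HR.SOURCETABLE");
    System.out.println(resolve("${oracleBoundingQuery}", runtimeArgs));
  }
}
```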
oracle-plugin/src/e2e-test/java/io.cdap.plugin/common.stepsdesign/TestSetupHooks.java

Lines changed: 27 additions & 2 deletions

@@ -17,6 +17,8 @@
 package io.cdap.plugin.common.stepsdesign;
 
 import com.google.cloud.bigquery.BigQueryException;
+import io.cdap.e2e.pages.actions.CdfConnectionActions;
+import io.cdap.e2e.pages.actions.CdfPluginPropertiesActions;
 import io.cdap.e2e.utils.BigQueryClient;
 import io.cdap.e2e.utils.PluginPropertyUtils;
 import io.cdap.plugin.OracleClient;

@@ -48,8 +50,10 @@ public static void setTableName() {
     PluginPropertyUtils.addPluginProp("sourceTable", sourceTableName);
     PluginPropertyUtils.addPluginProp("targetTable", targetTableName);
     String schema = PluginPropertyUtils.pluginProp("schema");
-    PluginPropertyUtils.addPluginProp("selectQuery", String.format("select * from %s.%s", schema,
-                                                                   sourceTableName));
+    PluginPropertyUtils.addPluginProp("selectQuery", String.format("select * from %s.%s "
+      + "WHERE $CONDITIONS", schema, sourceTableName));
+    PluginPropertyUtils.addPluginProp("boundingQuery", String.format("select MIN(ID),MAX(ID)"
+      + " from %s.%s", schema, sourceTableName));
   }
 
   @Before(order = 2, value = "@ORACLE_SOURCE_TEST")

@@ -416,4 +420,25 @@ public static void dropOracleTargetDateTable() throws SQLException, ClassNotFoun
     BeforeActions.scenario.write("Oracle Target Table - " + PluginPropertyUtils.pluginProp("targetTable")
                                    + " deleted successfully");
   }
+
+  @Before(order = 1, value = "@CONNECTION")
+  public static void setNewConnectionName() {
+    String connectionName = "Oracle" + RandomStringUtils.randomAlphanumeric(10);
+    PluginPropertyUtils.addPluginProp("connection.name", connectionName);
+    BeforeActions.scenario.write("New Connection name: " + connectionName);
+  }
+
+  private static void deleteConnection(String connectionType, String connectionName) throws IOException {
+    CdfConnectionActions.openWranglerConnectionsPage();
+    CdfConnectionActions.expandConnections(connectionType);
+    CdfConnectionActions.openConnectionActionMenu(connectionType, connectionName);
+    CdfConnectionActions.selectConnectionAction(connectionType, connectionName, "Delete");
+    CdfPluginPropertiesActions.clickPluginPropertyButton("Delete");
+  }
+
+  @After(order = 1, value = "@CONNECTION")
+  public static void deleteBQConnection() throws IOException {
+    deleteConnection("Oracle", "connection.name");
+    PluginPropertyUtils.removePluginProp("connection.name");
+  }
 }

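The reworked `setTableName()` hook now builds both the import query (with the `WHERE $CONDITIONS` marker the split logic needs) and the matching bounding query. The two formats can be checked in isolation; schema and table below are sample values, the real hook derives them from `PluginPropertyUtils`:

```java
public class QueryFormatSketch {
  // Mirrors the String.format calls added to setTableName() in this commit.
  static String selectQuery(String schema, String table) {
    return String.format("select * from %s.%s WHERE $CONDITIONS", schema, table);
  }

  static String boundingQuery(String schema, String table) {
    return String.format("select MIN(ID),MAX(ID) from %s.%s", schema, table);
  }

  public static void main(String[] args) {
    // Sample values standing in for the generated table names.
    System.out.println(selectQuery("HR", "SOURCETABLE"));   // select * from HR.SOURCETABLE WHERE $CONDITIONS
    System.out.println(boundingQuery("HR", "SOURCETABLE")); // select MIN(ID),MAX(ID) from HR.SOURCETABLE
  }
}
```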
oracle-plugin/src/e2e-test/resources/errorMessage.properties

Lines changed: 1 addition & 0 deletions

@@ -17,3 +17,4 @@ errorMessageInvalidSinkDatabase=Exception while trying to validate schema of dat
 errorMessageInvalidHost=Exception while trying to validate schema of database table '"table"' for connection
 errorLogsMessageInvalidBoundingQuery=Spark program 'phase-1' failed with error: ORA-00936: missing expression . \
   Please check the system logs for more details.
+blank.database.message=Required property 'database' has no value.

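The feature step references the expected error text by its key, `blank.database.message`, which the test framework resolves against errorMessage.properties with a plain `java.util.Properties` lookup. A self-contained sketch of that resolution, with the file's one new entry inlined:

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Properties;

public class ErrorPropsSketch {
  // Resolve a message key; the inline string stands in for the
  // errorMessage.properties entry added in this commit.
  static String message(String key) {
    String file = "blank.database.message=Required property 'database' has no value.\n";
    Properties p = new Properties();
    try {
      p.load(new StringReader(file));
    } catch (IOException e) {
      throw new UncheckedIOException(e);
    }
    return p.getProperty(key);
  }

  public static void main(String[] args) {
    System.out.println(message("blank.database.message")); // Required property 'database' has no value.
  }
}
```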
oracle-plugin/src/e2e-test/resources/pluginParameters.properties

Lines changed: 8 additions & 4 deletions

@@ -7,23 +7,24 @@ host=ORACLE_HOST
 port=ORACLE_PORT
 username=ORACLE_USERNAME
 password=ORACLE_PASSWORD
+connection.name=dummy
 outputSchema=[{"key":"ID","value":"decimal"},{"key":"LASTNAME","value":"string"}]
 datatypeColumns=(ID VARCHAR2(100) PRIMARY KEY, COL1 CHAR, COL2 CHAR(10), COL3 VARCHAR(3), COL4 VARCHAR2(3), \
   COL5 NCHAR, COL6 NCHAR(12), COL7 NVARCHAR2(12), COL8 CLOB, COL9 NCLOB, COL10 LONG, COL11 ROWID, COL12 NUMBER(4), \
   COL13 NUMBER(*), COL14 NUMBER(*,2), COL15 NUMBER(10,-3), COL16 NUMBER, COL17 DECIMAL(4), COL18 DECIMAL(*), \
   COL19 DECIMAL(*,2), COL20 DECIMAL(10,-3), COL21 DECIMAL, COL22 FLOAT, COL23 FLOAT(4), COL24 INTEGER, \
   COL25 DOUBLE PRECISION, COL26 REAL, COL27 SMALLINT, COL28 TIMESTAMP, COL29 TIMESTAMP(9), \
   COL30 TIMESTAMP WITH TIME ZONE, COL31 INTERVAL DAY(6) TO SECOND (5), COL32 INTERVAL YEAR(4) TO MONTH, COL33 DATE, \
-  COL34 BINARY_FLOAT, COL35 BINARY_DOUBLE)
+  COL34 BINARY_FLOAT, COL35 BINARY_DOUBLE, COL36 UROWID)
 datatypeColumnsList=(ID, COL1, COL2, COL3, COL4,COL5,COL6,COL7,COL8,COL9,COL10,COL11,COL12,COL13,COL14,COL15,COL16,\
-  COL17,COL18,COL19,COL20,COL21,COL22,COL23,COL24,COL25,COL26,COL27,COL28,COL29,COL30,COL31,COL32,COL33,COL34,COL35)
+  COL17,COL18,COL19,COL20,COL21,COL22,COL23,COL24,COL25,COL26,COL27,COL28,COL29,COL30,COL31,COL32,COL33,COL34,COL35,COL36)
 datatypeValues=VALUES ('USER1', 'M','ABCDEF','ABC','ABC','ä','你好，这','你好，这',\
   'This is a sample long data.\n','这是一个非常','48656C6C6F','AAAAaoAATAAABrXAAA',1234,1234.56789,\
   1234.56789,1234.56789,1234.56789,1234.56789,1234.56789,1234.56789,1234.56789,1234.56789,1234.5679,1234.5679,\
   1234.56789,1234.5679,1234.5679,1234.56789,TIMESTAMP'2023-01-01 2:00:00',TIMESTAMP'2023-01-01 2:00:00',\
   TIMESTAMP'2023-01-01 2:00:00 -08:00',TIMESTAMP '2001-09-03 12:47:00.000000'- TIMESTAMP '2001-09-03 13:13:00.000000',\
   INTERVAL '5-2' YEAR TO MONTH,TIMESTAMP '2023-01-01 00:00:00.000000',339999992740149460000,\
-  34000000000000000000000000000000000000000)
+  34000000000000000000000000000000000000000,'AAAHJYAAEAAAADyAAA')
 outputDatatypesSchema=[{"key":"ID","value":"string"},{"key":"COL1","value":"string"},{"key":"COL2","value":"string"},\
   {"key":"COL3","value":"string"},{"key":"COL4","value":"string"},{"key":"COL5","value":"string"},\
   {"key":"COL6","value":"string"},{"key":"COL7","value":"string"},{"key":"COL8","value":"string"},\

@@ -35,7 +36,8 @@ outputDatatypesSchema=[{"key":"ID","value":"stri
   {"key":"COL24","value":"decimal"},{"key":"COL25","value":"double"},{"key":"COL26","value":"double"},\
   {"key":"COL27","value":"decimal"},{"key":"COL28","value":"datetime"},{"key":"COL29","value":"datetime"},\
   {"key":"COL30","value":"timestamp"},{"key":"COL31","value":"string"},{"key":"COL32","value":"string"},\
-  {"key":"COL33","value":"datetime"},{"key":"COL34","value":"float"},{"key":"COL35","value":"double"}]
+  {"key":"COL33","value":"datetime"},{"key":"COL34","value":"float"},{"key":"COL35","value":"double"},\
+  {"key":"COL36","value":"string"}]
 
 longColumns=(ID VARCHAR2(100) PRIMARY KEY, COL1 LONG, COL2 RAW(2), COL3 BLOB, COL4 CLOB, COL5 NCLOB, COL6 BFILE)
 longColumnsList=(ID,COL1,COL2,COL3,COL4,COL5,COL6)

@@ -93,6 +95,8 @@ numberOfSplits=2
 zeroValue=0
 splitByColumn=ID
 importQuery=where $CONDITIONS
+connectionArguments=queryTimeout=50
+transactionIsolationLevel=TRANSACTION_READ_COMMITTED
 
 #bq properties
 projectId=cdf-athena

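The new `connectionArguments` value is a list of key=value pairs passed through to the JDBC driver (here a single `queryTimeout=50`). Assuming the usual semicolon-separated key=value convention for multiple entries (an assumption — check the plugin docs for the exact separator), a toy parser looks like:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class ConnArgsSketch {
  // Parse "k1=v1;k2=v2" style connection arguments into a map.
  // The semicolon separator is an assumption, not confirmed by this commit.
  static Map<String, String> parse(String args) {
    Map<String, String> out = new LinkedHashMap<>();
    for (String pair : args.split(";")) {
      if (pair.isEmpty()) {
        continue;
      }
      int eq = pair.indexOf('=');
      out.put(pair.substring(0, eq), pair.substring(eq + 1));
    }
    return out;
  }

  public static void main(String[] args) {
    System.out.println(parse("queryTimeout=50")); // {queryTimeout=50}
  }
}
```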