5b. Run the command for eventhub ```python integration-tests/run-integration-test.py --cloud_provider_name=azure --dbr_version=11.3.x-scala2.12 --source=eventhub --dbfs_path=dbfs:/tmp/DLT-META/ --eventhub_name=iot --eventhub_secrets_scope_name=eventhubs_creds --eventhub_namespace=int_test-standard --eventhub_port=9093 --eventhub_producer_accesskey_name=producer --eventhub_consumer_accesskey_name=consumer```
For eventhub integration tests, the following are the prerequisites:
1. An eventhub instance must be running
2. Using the Databricks CLI, create a Databricks secrets scope for the eventhub keys
3. Using the Databricks CLI, create Databricks secrets to store the producer and consumer keys using the scope created in step 2
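Prerequisites 2 and 3 can be sketched with the legacy Databricks CLI; the scope and key names (`eventhubs_creds`, `producer`, `consumer`) are taken from the example command above and may differ in your setup:

```shell
# Create a secret scope for the eventhub keys (legacy Databricks CLI syntax;
# the scope name follows the example command and is an assumption)
databricks secrets create-scope --scope eventhubs_creds

# Store the producer and consumer access keys in that scope
# (you will be prompted for the secret values)
databricks secrets put --scope eventhubs_creds --key producer
databricks secrets put --scope eventhubs_creds --key consumer
```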
Following are the mandatory arguments for running the EventHubs integration test:
1. Provide your eventhub topic : --eventhub_name
2. Provide eventhub namespace : --eventhub_namespace
3. Provide eventhub port : --eventhub_port
4. Provide databricks secret scope name : --eventhub_secrets_scope_name
5. Provide eventhub producer access key name : --eventhub_producer_accesskey_name
6. Provide eventhub consumer access key name : --eventhub_consumer_accesskey_name
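As a rough sketch, the mandatory EventHubs flags above could be checked with `argparse`; this is a hypothetical illustration using the documented flag names, not the actual `run-integration-test.py` implementation:

```python
# Hypothetical parser for the mandatory EventHubs flags; the flag names match
# the documentation above, but the parser itself is an illustrative assumption.
import argparse

def build_eventhub_parser():
    parser = argparse.ArgumentParser(description="DLT-META EventHubs integration test (sketch)")
    for flag in (
        "--eventhub_name",
        "--eventhub_namespace",
        "--eventhub_port",
        "--eventhub_secrets_scope_name",
        "--eventhub_producer_accesskey_name",
        "--eventhub_consumer_accesskey_name",
    ):
        # Every flag is mandatory, so parsing fails fast if one is missing
        parser.add_argument(flag, required=True)
    return parser

# Values below mirror the example command from step 5b
args = build_eventhub_parser().parse_args([
    "--eventhub_name", "iot",
    "--eventhub_namespace", "int_test-standard",
    "--eventhub_port", "9093",
    "--eventhub_secrets_scope_name", "eventhubs_creds",
    "--eventhub_producer_accesskey_name", "producer",
    "--eventhub_consumer_accesskey_name", "consumer",
])
print(args.eventhub_name)
```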
5c. Run the command for kafka ```python3 integration-tests/run-integration-test.py --cloud_provider_name=aws --dbr_version=11.3.x-scala2.12 --source=kafka --dbfs_path=dbfs:/tmp/DLT-META/ --kafka_topic_name=dlt-meta-integration-test --kafka_broker=host:9092```
For kafka integration tests, the following are the prerequisites:
1. A kafka instance must be running
Following are the mandatory arguments for running the Kafka integration test:
1. Provide your kafka topic name : --kafka_topic_name
2. Provide kafka broker : --kafka_broker
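The `--kafka_broker` value uses `host:port` form (e.g. `host:9092` in the command above); splitting it could be sketched as follows, where the helper name is an assumption for illustration:

```python
# Minimal sketch of splitting a host:port broker string as the integration
# test would need to; parse_broker is a hypothetical helper name.
def parse_broker(broker: str) -> tuple[str, int]:
    # rpartition handles the common host:port case from the right,
    # so a stray colon in the host part does not break the port
    host, _, port = broker.rpartition(":")
    return host, int(port)

host, port = parse_broker("host:9092")
print(host, port)  # host 9092
```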
6. Once finished, the integration test output file will be copied locally to