Commit 4fa723a

Cleanup getting started examples (#71)
1 parent cd9705d commit 4fa723a

File tree

33 files changed: +154 −285 lines changed


finance-credit-card-chatbot/credit-card-analytics/creditcard-kafka/rewardsink.table.sql

Lines changed: 0 additions & 13 deletions
This file was deleted.

getting-started-examples/00_getting_started/README.md

Lines changed: 0 additions & 83 deletions
This file was deleted.

getting-started-examples/01_kafka_to_console/README.md

Lines changed: 4 additions & 6 deletions

````diff
@@ -3,21 +3,19 @@
 This project demonstrates how to use [DataSQRL](https://datasqrl.com) to build a streaming pipeline that:
 
 - Reads data from a kafka topic and prints output to console
-- Kafka is part of datasqrl package.
-
-
+- Kafka is part of the DataSQRL package.
 
 ## 🐳 Running DataSQRL
 
 Run the following command from the project root where your `package.json` and SQRL scripts reside:
 
 ```bash
-docker run -it --rm -p 8888:8888 -p 9092:9092 -v $PWD:/build datasqrl/cmd:dev run -c package.json
+docker run -it --rm -p 8888:8888 -p 9092:9092 -v $PWD:/build datasqrl/cmd:0.7.0 run -c package.json
 ```
 
 ## Generate Data
 
 * Go to `data-generator` folder
 ```bash
-python3 send_kafka_avro_records.py ../kafka-source/contact.avsc data.jsonl contact localhost:9092
-```
+python3 send_kafka_avro_records.py ../kafka-source/contact.avsc data.jsonl contact localhost:9092
+```
````
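The Generate Data step invokes `send_kafka_avro_records.py` with a schema file, a JSONL data file, a topic name, and a bootstrap server. That script is not part of this commit, so the following is only a rough sketch of what such a sender typically does; every function name, library choice (`fastavro`, `kafka-python`), and signature below is an assumption, not the project's actual implementation:

```python
"""Hedged sketch of a JSONL-to-Kafka Avro sender in the spirit of
send_kafka_avro_records.py. The real script is not shown in this diff."""
import io
import json
import sys


def load_jsonl(text: str) -> list[dict]:
    """Parse newline-delimited JSON into a list of records, skipping blank lines."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]


def main(schema_path: str, data_path: str, topic: str, bootstrap: str) -> None:
    # Third-party imports kept local so load_jsonl stays dependency-free.
    # Both libraries are assumptions about what the real script might use.
    from fastavro import parse_schema, schemaless_writer
    from kafka import KafkaProducer

    with open(schema_path) as f:
        schema = parse_schema(json.load(f))
    with open(data_path) as f:
        records = load_jsonl(f.read())

    producer = KafkaProducer(bootstrap_servers=bootstrap)
    for record in records:
        buf = io.BytesIO()
        # Raw Avro body with no schema-registry framing.
        schemaless_writer(buf, schema, record)
        producer.send(topic, buf.getvalue())
    producer.flush()


if __name__ == "__main__" and len(sys.argv) >= 5:
    main(*sys.argv[1:5])
```

Invoked the same way as in the README (`python3 send_kafka_avro_records.py ../kafka-source/contact.avsc data.jsonl contact localhost:9092`), this would publish one Avro-encoded message per JSONL line to the `contact` topic.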
Lines changed: 1 addition & 1 deletion

```diff
@@ -1,4 +1,4 @@
 IMPORT kafka-source.Contact;
 
 -- Export the results to console (print)
-EXPORT Contact TO print.Contact;
+EXPORT Contact TO print.Contact;
```

getting-started-examples/01_kafka_to_console/package.json

Lines changed: 0 additions & 5 deletions

```diff
@@ -13,10 +13,5 @@
   },
   "test-runner": {
     "create-topics": ["contact", "contact"]
-  },
-  "dependencies": {
-    "metrics": {
-      "folder": "kafka-source"
-    }
   }
 }
```

getting-started-examples/02_kafka_to_kafka/README.md

Lines changed: 3 additions & 5 deletions

````diff
@@ -3,16 +3,14 @@
 This project demonstrates how to use [DataSQRL](https://datasqrl.com) to build a streaming pipeline that:
 
 - Reads data from a kafka topic and writes to another kafka topic
-- Kafka is part of datasqrl package.
-
-
+- Kafka is part of the DataSQRL package.
 
 ## 🐳 Running DataSQRL
 
 Run the following command from the project root where your `package.json` and SQRL scripts reside:
 
 ```bash
-docker run -it --rm -p 8888:8888 -p 9092:9092 -v $PWD:/build datasqrl/cmd:dev run -c package.json
+docker run -it --rm -p 8888:8888 -p 9092:9092 -v $PWD:/build datasqrl/cmd:0.7.0 run -c package.json
 ```
 
 ## Generate Data
@@ -25,4 +23,4 @@
 
 ## Output
 
-* Updated records should be generated in contactupdated topic.
+* Updated records should be generated in `contactupdated` topic.
````

getting-started-examples/02_kafka_to_kafka/kafka-test.sqrl

Lines changed: 1 addition & 1 deletion

```diff
@@ -2,4 +2,4 @@ IMPORT kafka-source.Contact as Contacts;
 
 ContactsUpdated := SELECT firstname, lastname, last_updated FROM Contacts;
 
-EXPORT ContactsUpdated TO kafkasink.ContactUpdated;
+EXPORT ContactsUpdated TO kafka-sink.ContactUpdated;
```

getting-started-examples/02_kafka_to_kafka/package.json

Lines changed: 0 additions & 8 deletions

```diff
@@ -13,13 +13,5 @@
   },
   "test-runner": {
     "create-topics": ["contact", "contactupdated"]
-  },
-  "dependencies": {
-    "kafkasource": {
-      "folder": "kafka-source"
-    },
-    "kafkasink": {
-      "folder": "kafka-sink"
-    }
   }
 }
```

getting-started-examples/03_two_streams_kafka_to_kafka/README.md

Lines changed: 2 additions & 2 deletions

````diff
@@ -13,7 +13,7 @@
 Run the following command from the project root where your `package.json` and SQRL scripts reside:
 
 ```bash
-docker run -it --rm -p 8888:8888 -p 9092:9092 -v $PWD:/build datasqrl/cmd:dev run -c package.json
+docker run -it --rm -p 8888:8888 -p 9092:9092 -v $PWD:/build datasqrl/cmd:0.7.0 run -c package.json
 ```
 
 ## Generate Data
@@ -32,4 +32,4 @@
 
 ## Output
 
-* Updated records should be generated in enrichedcontact topic.
+* Updated records should be generated in enrichedcontact topic.
````

getting-started-examples/03_two_streams_kafka_to_kafka/kafka-test.sqrl

Lines changed: 1 addition & 1 deletion

```diff
@@ -7,4 +7,4 @@ EnrichedContacts := SELECT c.id, c.firstname, c.lastname, o.orgname, c.last_upda
 ON c.id = o.userid
 AND c.last_updated BETWEEN o.last_updated - INTERVAL '30' SECOND AND o.last_updated + INTERVAL '30' SECOND;
 
-EXPORT EnrichedContacts TO kafkasink.EnrichedContact;
+EXPORT EnrichedContacts TO kafka-sink.EnrichedContact;
```

0 commit comments
