Commit 5aa2406

Author: Matt Howlett
Added venv info to example README.md (#1099)
* Added venv info to example README.md
* review feedback
* remove python setup.py build
1 parent f7e09e1 commit 5aa2406

File tree

3 files changed (+41, -6 lines)


.gitignore

Lines changed: 2 additions & 0 deletions

@@ -27,3 +27,5 @@ tests/docker/conf/tls/*
 .idea
 tmp-KafkaCluster
 .venv
+venv_test
+venv_examples

examples/README.md

Lines changed: 38 additions & 5 deletions
@@ -1,11 +1,11 @@
 The scripts in this directory provide code examples using Confluent's Python client:
 
-* [adminapi.py](adminapi.py): collection of Kafka Admin API operations
+* [adminapi.py](adminapi.py): Collection of Kafka Admin API operations
 * [asyncio_example.py](asyncio_example.py): AsyncIO webserver with Kafka producer
-* [avro-cli.py](avro-cli.py): produces Avro messages with Confluent Schema Registry and then reads them back again
-* [consumer.py](consumer.py): reads messages from a Kafka topic
-* [producer.py](producer.py): reads lines from stdin and sends them to Kafka
-* [eos-transactions.py](eos-transactions.py): transactional producer with exactly once semantics (EOS)
+* [avro-cli.py](avro-cli.py): Produces Avro messages with Confluent Schema Registry and then reads them back again
+* [consumer.py](consumer.py): Reads messages from a Kafka topic
+* [producer.py](producer.py): Reads lines from stdin and sends them to Kafka
+* [eos-transactions.py](eos-transactions.py): Transactional producer with exactly once semantics (EOS)
 * [avro_producer.py](avro_producer.py): SerializingProducer with AvroSerializer
 * [avro_consumer.py](avro_consumer.py): DeserializingConsumer with AvroDeserializer
 * [json_producer.py](json_producer.py): SerializingProducer with JsonSerializer
@@ -20,3 +20,36 @@ Additional examples for [Confluent Cloud](https://www.confluent.io/confluent-clo
 
 * [confluent_cloud.py](confluent_cloud.py): produces messages to Confluent Cloud and then reads them back again
 * [confluentinc/examples](https://github.com/confluentinc/examples/tree/master/clients/cloud/python): integrates Confluent Cloud and Confluent Cloud Schema Registry
+
+## venv setup
+
+It's usually a good idea to install Python dependencies in a virtual environment to avoid
+conflicts between projects.
+
+To setup a venv with the latest release version of confluent-kafka and dependencies of all examples installed:
+
+```
+$ python3 -m venv venv_examples
+$ source venv_examples/bin/activate
+$ cd examples
+$ pip install -r requirements.txt
+```
+
+To setup a venv that uses the current source tree version of confluent_kafka, you
+need to have a C compiler and librdkafka installed
+([from a package](https://github.com/edenhill/librdkafka#installing-prebuilt-packages), or
+[from source](https://github.com/edenhill/librdkafka#build-from-source)). Then:
+
+```
+$ python3 -m venv venv_examples
+$ source venv_examples/bin/activate
+$ python setup.py develop
+$ cd examples
+$ pip install -r requirements.txt
+```
+
+When you're finished with the venv:
+
+```
+$ deactivate
+```
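
The README lines added above describe producer.py as reading lines from stdin and sending them to Kafka, and consumer.py as reading messages from a topic. As a rough illustration of the client calls those example scripts build on (a sketch, not code from this commit), a minimal produce-and-consume round trip looks roughly like the following; the broker address `localhost:9092`, the topic `example_topic`, and the group id are placeholder assumptions:

```
# Minimal sketch of confluent-kafka usage; broker, topic and group.id are placeholders.
from confluent_kafka import Producer, Consumer

producer = Producer({'bootstrap.servers': 'localhost:9092'})
producer.produce('example_topic', value='hello from the examples venv')
producer.flush()  # block until delivery succeeds or fails

consumer = Consumer({
    'bootstrap.servers': 'localhost:9092',
    'group.id': 'examples-smoke-test',
    'auto.offset.reset': 'earliest',
})
consumer.subscribe(['example_topic'])
msg = consumer.poll(10.0)  # wait up to 10 seconds for a message
if msg is not None and msg.error() is None:
    print(msg.value().decode('utf-8'))
consumer.close()
```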

examples/requirements.txt

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-confluent-kafka[examples]
+confluent-kafka
 fastapi
 pydantic
 uvicorn
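
Once either venv described in the README section above is set up and requirements.txt is installed, a quick sanity check (just a sketch, not part of this commit) is to import the client and print the confluent-kafka and librdkafka versions it reports:

```
# Sanity-check sketch: confirm the confluent-kafka build visible inside the venv.
import confluent_kafka

print("confluent-kafka:", confluent_kafka.version())   # (client version string, int)
print("librdkafka:", confluent_kafka.libversion())     # (librdkafka version string, int)
```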
