# A Practical Look at Stacks, Queues, and Priority Queues in Python

Sample code supplementing the tutorial on [Python queues](https://realpython.com/queue-in-python/) hosted on Real Python.

## Installation

To get started, create and activate a new virtual environment, and then install the required dependencies into it:

```shell
$ python3 -m venv venv/ --prompt=queue
$ source venv/bin/activate
(queue) $ python -m pip install -r requirements.txt -c constraints.txt
```

## Usage

### Queue Implementation

Change directory to `src/` and run the interactive Python interpreter:

```shell
(queue) $ cd src/
(queue) $ python -q
```

Then, import various queue data types from the `queues` module and start using them:

```python
>>> from queues import Queue, Stack, PriorityQueue

>>> fifo, stack, heap = Queue(), Stack(), PriorityQueue()
>>> for priority, element in enumerate(["1st", "2nd", "3rd"]):
...     fifo.enqueue(element)
...     stack.enqueue(element)
...     heap.enqueue_with_priority(priority, element)

>>> for elements in zip(fifo, stack, heap):
...     print(elements)
...
('1st', '3rd', '3rd')
('2nd', '2nd', '2nd')
('3rd', '1st', '1st')
```
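
The `queues` module lives in `src/`, alongside the other sample code. If you're curious how the interface shown above could be put together, here's a minimal sketch built on `collections.deque` and `heapq`. It mirrors the names used in the session (`enqueue()`, `dequeue()`, `enqueue_with_priority()`), but the implementation details are an assumption rather than the bundled module's exact code:

```python
# Minimal sketch of FIFO, LIFO, and priority queues; not the bundled module.
from collections import deque
from heapq import heappop, heappush
from itertools import count


class Queue:
    """FIFO queue: elements come out in insertion order."""

    def __init__(self):
        self._elements = deque()

    def __iter__(self):
        while self._elements:
            yield self.dequeue()

    def enqueue(self, element):
        self._elements.append(element)

    def dequeue(self):
        return self._elements.popleft()


class Stack(Queue):
    """LIFO queue: dequeue from the same end that enqueue() appends to."""

    def dequeue(self):
        return self._elements.pop()


class PriorityQueue:
    """Elements with a higher priority number come out first."""

    def __init__(self):
        self._heap = []
        self._counter = count()  # Tie-breaker preserves insertion order

    def __iter__(self):
        while self._heap:
            yield heappop(self._heap)[-1]

    def enqueue_with_priority(self, priority, element):
        # heapq implements a min-heap, so negate the priority
        # to make the largest priority come out first.
        heappush(self._heap, (-priority, next(self._counter), element))
```

A sketch like this reproduces the output of the interactive session above, but treat the bundled `queues` module as the authoritative version.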

### Graph Algorithms

Change directory to `src/` and run the interactive Python interpreter:

```shell
(queue) $ cd src/
(queue) $ python -q
```

Then, import various `graph` module members and start using them:

```python
>>> from graph import *

>>> nodes, graph = load_graph("roadmap.dot", City.from_dict)

>>> city1 = nodes["london"]
>>> city2 = nodes["edinburgh"]

>>> def distance(weights):
...     return float(weights["distance"])

>>> for city in dijkstra_shortest_path(graph, city1, city2, distance):
...     print(city.name)
...
City of London
St Albans
Coventry
Birmingham
Stoke-on-Trent
Manchester
Salford
Preston
Lancaster
Carlisle
Edinburgh

>>> for city in shortest_path(graph, city1, city2):
...     print(city.name)
...
City of London
Bristol
Newport
St Asaph
Liverpool
Preston
Lancaster
Carlisle
Edinburgh

>>> connected(graph, city1, city2)
True

>>> def is_twentieth_century(city):
...     return city.year and 1901 <= city.year <= 2000

>>> breadth_first_search(graph, city2, is_twentieth_century)
City(
    name='Lancaster',
    country='England',
    year=1937,
    latitude=54.047,
    longitude=-2.801
)

>>> depth_first_search(graph, city2, is_twentieth_century)
City(
    name='Lancaster',
    country='England',
    year=1937,
    latitude=54.047,
    longitude=-2.801
)
```
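
The traversal and shortest-path functions above come from the bundled `graph` module. To show the role a FIFO queue plays in such traversals, here's a simplified, self-contained sketch of breadth-first search over a plain adjacency mapping. It is not the module's actual implementation, and the tiny `roadmap` dictionary below is made up for the example:

```python
# Simplified sketch: BFS over a dict-based adjacency list, not the bundled code.
from collections import deque


def breadth_first_traverse(graph, source):
    """Yield nodes in breadth-first order, starting from the source node."""
    queue = deque([source])
    visited = {source}
    while queue:
        node = queue.popleft()  # FIFO order is what makes this breadth-first
        yield node
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)


def breadth_first_search(graph, source, predicate):
    """Return the first visited node satisfying the predicate, or None."""
    return next(
        (node for node in breadth_first_traverse(graph, source) if predicate(node)),
        None,
    )


if __name__ == "__main__":
    roadmap = {
        "Edinburgh": ["Carlisle", "Perth"],
        "Carlisle": ["Edinburgh", "Lancaster"],
        "Lancaster": ["Carlisle", "Preston"],
    }
    print(breadth_first_search(roadmap, "Edinburgh", lambda city: city.startswith("L")))
```

Popping elements from the right end of the same `deque`, stack-style, would give you a depth-first traversal order instead.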

### Thread-Safe Queues

Change directory to `src/` and run the script with optional parameters. For example:

```shell
(queue) $ cd src/
(queue) $ python thread_safe_queues.py --queue fifo \
                                       --producers 3 \
                                       --consumers 2 \
                                       --producer-speed 1 \
                                       --consumer-speed 1
```

**Parameters:**

| Short Name | Long Name          | Value                  |
|-----------:|-------------------:|------------------------|
| `-q`       | `--queue`          | `fifo`, `lifo`, `heap` |
| `-p`       | `--producers`      | number                 |
| `-c`       | `--consumers`      | number                 |
| `-ps`      | `--producer-speed` | number                 |
| `-cs`      | `--consumer-speed` | number                 |

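The script simulates producer and consumer threads exchanging items through a thread-safe queue, like those in the standard-library `queue` module. Stripped of any presentation logic, that producer-consumer pattern looks roughly like the sketch below; the item names, counts, and sleep times are made up for illustration:

```python
# Bare-bones sketch of the producer-consumer pattern; not the tutorial script.
import threading
import time
from queue import Queue

buffer = Queue(maxsize=10)  # A bounded, thread-safe FIFO queue


def producer(name):
    for number in range(5):
        buffer.put(f"{name}-item-{number}")  # Blocks while the queue is full
        time.sleep(0.1)


def consumer():
    while True:
        item = buffer.get()  # Blocks until an item becomes available
        print(f"Processed {item}")
        buffer.task_done()


for _ in range(2):
    threading.Thread(target=consumer, daemon=True).start()

producers = [threading.Thread(target=producer, args=(name,)) for name in "ABC"]
for thread in producers:
    thread.start()
for thread in producers:
    thread.join()

buffer.join()  # Wait until every produced item has been marked as done
```

Swapping `queue.Queue` for `queue.LifoQueue` or `queue.PriorityQueue` changes the order in which consumers receive items without touching the rest of the code.
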
### Asynchronous Queues

Change directory to `src/` and run the script with a mandatory URL and optional parameters:

```shell
(queue) $ cd src/
(queue) $ python async_queues.py http://localhost:8000/ --max-depth 2 \
                                                        --num-workers 3
```

**Parameters:**

| Short Name | Long Name       | Value  |
|-----------:|----------------:|--------|
| `-d`       | `--max-depth`   | number |
| `-w`       | `--num-workers` | number |

Note that to switch between the available queue types, you'll need to edit the `main()` coroutine function:

```python
# async_queues.py

# ...

async def main(args):
    session = aiohttp.ClientSession()
    try:
        links = Counter()
        queue = asyncio.Queue()
        # queue = asyncio.LifoQueue()
        # queue = asyncio.PriorityQueue()

# ...
```
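
All three classes share the same `put()`, `get()`, `task_done()`, and `join()` interface, which is why swapping them only takes editing that one line. For orientation, the general shape of an asyncio worker pool draining such a queue looks roughly like this schematic sketch (not the actual crawler code; the jobs are placeholders):

```python
# Schematic sketch of an asyncio worker pool; not the crawler in async_queues.py.
import asyncio


async def worker(name, queue):
    while True:
        job = await queue.get()  # Suspends until a job becomes available
        print(f"{name} handling job {job}")
        await asyncio.sleep(0.1)  # Stand-in for real asynchronous work
        queue.task_done()


async def main():
    queue = asyncio.Queue()
    for job in range(10):
        queue.put_nowait(job)

    workers = [
        asyncio.create_task(worker(f"worker-{i}", queue)) for i in range(3)
    ]
    await queue.join()  # Wait until every job has been marked as done

    for task in workers:
        task.cancel()
    await asyncio.gather(*workers, return_exceptions=True)


if __name__ == "__main__":
    asyncio.run(main())
```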

### Multiprocessing Queue

Change directory to `src/` and run the script with a mandatory MD5 hash value and optional parameters:

```shell
(queue) $ cd src/
(queue) $ python multiprocess_queue.py a9d1cbf71942327e98b40cf5ef38a960 -m 6 -w 4
```

**Parameters:**

| Short Name | Long Name       | Value  |
|-----------:|----------------:|--------|
| `-m`       | `--max-length`  | number |
| `-w`       | `--num-workers` | number |

The maximum length caps the number of characters in the text to guess. If you omit the number of workers, then the script creates one worker per detected CPU core.
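
Internally, the script tries to recover the text behind the given MD5 hash by hashing candidate texts until one matches, spreading that work across worker processes. The sketch below illustrates the general idea with multiprocessing queues, restricted to lowercase ASCII letters for brevity; it's far more simplistic than the bundled script, and all names in it are made up:

```python
# Much-simplified sketch of brute-forcing an MD5 hash with worker processes.
import multiprocessing
from hashlib import md5
from itertools import product
from string import ascii_lowercase


def worker(jobs, results, target_hash):
    while True:
        candidate = jobs.get()
        if candidate is None:  # Poison pill: no more work to do
            break
        if md5(candidate.encode()).hexdigest() == target_hash:
            results.put(candidate)
        # For simplicity, workers keep draining the queue even after a match.


def reverse_md5(target_hash, max_length=4, num_workers=4):
    jobs, results = multiprocessing.Queue(), multiprocessing.Queue()
    workers = [
        multiprocessing.Process(target=worker, args=(jobs, results, target_hash))
        for _ in range(num_workers)
    ]
    for process in workers:
        process.start()

    # Enqueue every candidate text up to the maximum length.
    for length in range(1, max_length + 1):
        for letters in product(ascii_lowercase, repeat=length):
            jobs.put("".join(letters))
    for _ in workers:
        jobs.put(None)  # One poison pill per worker

    for process in workers:
        process.join()
    return results.get() if not results.empty() else None


if __name__ == "__main__":
    print(reverse_md5(md5(b"cat").hexdigest(), max_length=3))
```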

### Message Brokers

#### RabbitMQ

Start a RabbitMQ broker with Docker:

```shell
$ docker run -it --rm --name rabbitmq -p 5672:5672 rabbitmq
```

Open separate terminal windows, activate your virtual environment, change directory to `message_brokers/rabbitmq/`, and run your producer and consumer scripts:

```shell
(queue) $ cd message_brokers/rabbitmq/
(queue) $ python producer.py
(queue) $ python consumer.py
```

You can have as many producers and consumers as you like.
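
For orientation, a minimal producer and consumer pair could look like the sketch below. It assumes the `pika` client library and a queue named `mailbox`, both of which are assumptions for this example rather than a description of the bundled scripts:

```python
# producer.py (sketch), assuming the pika client library
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="mailbox")  # Idempotent: creates the queue if needed
channel.basic_publish(exchange="", routing_key="mailbox", body=b"Hello, consumer!")
connection.close()


# consumer.py (sketch)
import pika


def callback(channel, method, properties, body):
    print(f"Got message: {body.decode()}")


connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="mailbox")
channel.basic_consume(queue="mailbox", auto_ack=True, on_message_callback=callback)
channel.start_consuming()  # Blocks and dispatches messages to the callback
```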

#### Redis

Start a Redis server with Docker:

```shell
$ docker run -it --rm --name redis -p 6379:6379 redis
```

Open separate terminal windows, activate your virtual environment, change directory to `message_brokers/redis/`, and run your publisher and subscriber scripts:

```shell
(queue) $ cd message_brokers/redis/
(queue) $ python publisher.py
(queue) $ python subscriber.py
```

You can have as many publishers and subscribers as you like.
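
Redis Pub/Sub is fire-and-forget: a message reaches only the subscribers listening at the moment it's published, so start the subscriber first. A minimal pair could look like this sketch, which assumes the `redis` client library and a channel named `chatroom`; both are assumptions for the example:

```python
# publisher.py (sketch), assuming the redis client library
import redis

client = redis.Redis()  # Connects to localhost:6379 by default
client.publish("chatroom", "Hello, subscribers!")


# subscriber.py (sketch)
import redis

client = redis.Redis()
pubsub = client.pubsub()
pubsub.subscribe("chatroom")
for message in pubsub.listen():
    if message["type"] == "message":  # Skip subscription confirmations
        print(message["data"].decode())
```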

#### Apache Kafka

Change directory to `message_brokers/kafka/` and start an Apache Kafka cluster with Docker Compose:

```shell
$ cd message_brokers/kafka/
$ docker-compose up
```

Open separate terminal windows, activate your virtual environment, change directory to `message_brokers/kafka/`, and run your producer and consumer scripts:

```shell
(queue) $ cd message_brokers/kafka/
(queue) $ python producer.py
(queue) $ python consumer.py
```

You can have as many producers and consumers as you like.
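
A minimal producer and consumer pair could be sketched with the `kafka-python` client as shown below. The client library, the `orders` topic name, and the `localhost:9092` broker address are all assumptions for this example; the bundled scripts may differ:

```python
# producer.py (sketch), assuming the kafka-python client library
from kafka import KafkaProducer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("orders", value=b"Hello, Kafka!")
producer.flush()  # Make sure the message is actually sent before exiting


# consumer.py (sketch)
from kafka import KafkaConsumer

consumer = KafkaConsumer("orders", bootstrap_servers="localhost:9092")
for record in consumer:
    print(record.value.decode())
```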