
Commit a4a601e

Pathway-Dev and Manul from Pathway committed
Daily Pathway examples refresh
Co-authored-by: Manul from Pathway <[email protected]>
GitOrigin-RevId: 83fa0bf184321be57b463dba7aa1e223a1cc86aa
1 parent 5050fbb commit a4a601e

8 files changed, +76 -76 lines changed


examples/notebooks/showcases/live-data-jupyter.ipynb

Lines changed: 2 additions & 2 deletions
@@ -459,7 +459,7 @@
  "source": [
  "## Jupyter Notebooks & Streaming Data in Production\n",
  "\n",
- "Congratulations! You have successfully built a live data streaming pipeline with useful data visualisations and real-time alerts, right from a Jupyter notebook \ud83d\ude04\n",
+ "Congratulations! You have succesfully built a live data streaming pipeline with useful data visualisations and real-time alerts, right from a Jupyter notebook \ud83d\ude04\n",
  "\n",
  "This is just a taste of what is possible. If you're interested in diving deeper and building a production-grade data science pipeline all the way from data exploration to deployment, you may want to check out the full-length [From Jupyter to Deploy](/developers/user-guide/deployment/from-jupyter-to-deploy) tutorial.\n",
  "\n",
@@ -499,4 +499,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 5
- }
+ }

examples/notebooks/showcases/mistral_adaptive_rag_question_answering.ipynb

Lines changed: 2 additions & 2 deletions
@@ -405,7 +405,7 @@
  "lines_to_next_cell": 2
  },
  "source": [
- "#### 4. Local LLM Deployment\n",
+ "#### 4. Local LLM Deployement\n",
  "Due to its size and performance we decided to run the `Mistral 7B` Local Language Model. We deploy it as a service running on GPU, using `Ollama`.\n",
  "\n",
  "In order to run local LLM, refer to these steps:\n",
@@ -626,4 +626,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 5
- }
+ }
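The cell above points at serving Mistral 7B locally with Ollama. As a rough, hypothetical sketch (not part of this commit, and assuming Ollama's default REST endpoint on localhost:11434 with the `mistral` model already pulled), a notebook could smoke-test the local service like this:

import requests

# Hypothetical check: the endpoint, port, and model name are assumptions based on
# Ollama's defaults, not values taken from this commit.
response = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "mistral", "prompt": "Reply with the single word: ready", "stream": False},
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])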

examples/notebooks/tutorials/alert-deduplication.ipynb

Lines changed: 4 additions & 4 deletions
@@ -173,7 +173,7 @@
  "id": "7",
  "metadata": {},
  "source": [
- "To track the maximum value, we could write `input.groupby().reduce(max=pw.reducers.max(input.value))`. Here we want to keep track also *when* this maximum occurred, therefore we use the `argmax_rows` utility function."
+ "To track the maximum value, we could write `input.groupby().reduce(max=pw.reducers.max(input.value))`. Here we want to keep track also *when* this maximum occured, therefore we use the `argmax_rows` utility function."
  ]
  },
  {
@@ -242,7 +242,7 @@
  "id": "14",
  "metadata": {},
  "source": [
- "Now we can send the alerts to e.g. Slack. We can do it similarly as in the [realtime log monitoring tutorial](/developers/templates/etl/realtime-log-monitoring#scenario-2-sending-the-alert-to-slack) by using `pw.io.subscribe`.\n",
+ "Now we can send the alerts to e.g. Slack. We can do it similarily as in the [realtime log monitoring tutorial](/developers/templates/etl/realtime-log-monitoring#scenario-2-sending-the-alert-to-slack) by using `pw.io.subscribe`.\n",
  "\n",
  "Here, for testing purposes, instead of sending an alert, we will store the accepted maxima in the list."
  ]
@@ -279,7 +279,7 @@
  "id": "17",
  "metadata": {},
  "source": [
- "Let's run the program. Since the stream we defined is bounded (and we set high `input_rate` in the `generate_custom_stream`), the call to `pw.run` will finish quickly. However, in most usecases, you will be streaming data (e.g. from kafka) indefinitely."
+ "Let's run the program. Since the stream we defined is bounded (and we set high `input_rate` in the `generate_custom_stream`), the call to `pw.run` will finish quickly. Hovever, in most usecases, you will be streaming data (e.g. from kafka) indefinitely."
  ]
  },
  {
@@ -386,4 +386,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 5
- }
+ }
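The cells above forward accepted maxima through `pw.io.subscribe`. A minimal sketch of that pattern, assuming the usual (key, row, time, is_addition) callback signature; the `alerts` table and the print-instead-of-Slack callback below are stand-ins, not code from this commit:

import pathway as pw

# Stand-in for the deduplicated table of accepted maxima built in the notebook.
alerts = pw.debug.table_from_markdown(
    """
    value | t
    10    | 1
    25    | 2
    """
)

def on_alert(key, row, time, is_addition):
    # Retractions arrive with is_addition=False; only react to fresh rows.
    if is_addition:
        print(f"ALERT at engine time {time}: {row}")

pw.io.subscribe(alerts, on_change=on_alert)
pw.run()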

examples/notebooks/tutorials/asynctransformer.ipynb

Lines changed: 57 additions & 57 deletions
@@ -162,11 +162,11 @@
  "output_type": "stream",
  "text": [
  " | value | ret | __time__ | __diff__\n",
- "^Z3QWT29... | 2 | 3 | 1766986321736 | 1\n",
- "^3CZ78B4... | 2 | 3 | 1766986321738 | 1\n",
- "^YYY4HAB... | 6 | 7 | 1766986322136 | 1\n",
- "^3HN31E1... | 6 | 7 | 1766986322138 | 1\n",
- "^X1MXHYY... | 12 | 13 | 1766986322738 | 1\n"
+ "^Z3QWT29... | 2 | 3 | 1767073083628 | 1\n",
+ "^3CZ78B4... | 2 | 3 | 1767073083630 | 1\n",
+ "^YYY4HAB... | 6 | 7 | 1767073084030 | 1\n",
+ "^3HN31E1... | 6 | 7 | 1767073084032 | 1\n",
+ "^X1MXHYY... | 12 | 13 | 1767073084630 | 1\n"
  ]
  }
  ],
@@ -410,12 +410,12 @@
  "output_type": "stream",
  "text": [
  " | group | value | ret | __time__ | __diff__\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986326318 | 1\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986326518 | -1\n",
- "^Z3QWT29... | 2 | 3 | 4 | 1766986326518 | 1\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986326718 | 1\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986326720 | -1\n",
- "^YYY4HAB... | 1 | 2 | 3 | 1766986326720 | 1\n"
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073088216 | 1\n",
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073088416 | -1\n",
+ "^Z3QWT29... | 2 | 3 | 4 | 1767073088416 | 1\n",
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073088618 | 1\n",
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073088620 | -1\n",
+ "^YYY4HAB... | 1 | 2 | 3 | 1767073088620 | 1\n"
  ]
  }
  ],
@@ -537,18 +537,18 @@
  "output_type": "stream",
  "text": [
  " | group | value | ret | __time__ | __diff__\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986327442 | 1\n",
- "^3HN31E1... | 4 | 3 | 4 | 1766986327442 | 1\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986327542 | -1\n",
- "^3HN31E1... | 4 | 3 | 4 | 1766986327542 | -1\n",
- "^Z3QWT29... | 2 | 4 | 5 | 1766986327542 | 1\n",
- "^3HN31E1... | 4 | 2 | 3 | 1766986327542 | 1\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986327642 | 1\n",
- "^3CZ78B4... | 3 | 1 | 2 | 1766986327642 | 1\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986327644 | -1\n",
- "^3CZ78B4... | 3 | 1 | 2 | 1766986327644 | -1\n",
- "^YYY4HAB... | 1 | 2 | 3 | 1766986327644 | 1\n",
- "^3CZ78B4... | 3 | 2 | 3 | 1766986327644 | 1\n"
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073089336 | 1\n",
+ "^3HN31E1... | 4 | 3 | 4 | 1767073089336 | 1\n",
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073089436 | -1\n",
+ "^3HN31E1... | 4 | 3 | 4 | 1767073089436 | -1\n",
+ "^Z3QWT29... | 2 | 4 | 5 | 1767073089436 | 1\n",
+ "^3HN31E1... | 4 | 2 | 3 | 1767073089436 | 1\n",
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073089534 | 1\n",
+ "^3CZ78B4... | 3 | 1 | 2 | 1767073089534 | 1\n",
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073089536 | -1\n",
+ "^3CZ78B4... | 3 | 1 | 2 | 1767073089536 | -1\n",
+ "^YYY4HAB... | 1 | 2 | 3 | 1767073089536 | 1\n",
+ "^3CZ78B4... | 3 | 2 | 3 | 1767073089536 | 1\n"
  ]
  }
  ],
@@ -581,18 +581,18 @@
  "output_type": "stream",
  "text": [
  " | group | value | ret | __time__ | __diff__\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986327854 | 1\n",
- "^3CZ78B4... | 3 | 1 | 2 | 1766986327856 | 1\n",
- "^3CZ78B4... | 3 | 1 | 2 | 1766986327954 | -1\n",
- "^3CZ78B4... | 3 | 2 | 3 | 1766986327954 | 1\n",
- "^3HN31E1... | 4 | 3 | 4 | 1766986328056 | 1\n",
- "^3HN31E1... | 4 | 3 | 4 | 1766986328058 | -1\n",
- "^3HN31E1... | 4 | 2 | 3 | 1766986328058 | 1\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986328156 | -1\n",
- "^Z3QWT29... | 2 | 4 | 5 | 1766986328156 | 1\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986328254 | 1\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986328256 | -1\n",
- "^YYY4HAB... | 1 | 2 | 3 | 1766986328256 | 1\n"
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073089774 | 1\n",
+ "^3CZ78B4... | 3 | 1 | 2 | 1767073089776 | 1\n",
+ "^3CZ78B4... | 3 | 1 | 2 | 1767073089874 | -1\n",
+ "^3CZ78B4... | 3 | 2 | 3 | 1767073089874 | 1\n",
+ "^3HN31E1... | 4 | 3 | 4 | 1767073089974 | 1\n",
+ "^3HN31E1... | 4 | 3 | 4 | 1767073089976 | -1\n",
+ "^3HN31E1... | 4 | 2 | 3 | 1767073089976 | 1\n",
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073090076 | -1\n",
+ "^Z3QWT29... | 2 | 4 | 5 | 1767073090076 | 1\n",
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073090174 | 1\n",
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073090176 | -1\n",
+ "^YYY4HAB... | 1 | 2 | 3 | 1767073090176 | 1\n"
  ]
  }
  ],
@@ -635,18 +635,18 @@
  "output_type": "stream",
  "text": [
  " | group | value | ret | __time__ | __diff__\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986328892 | 1\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986328892 | 1\n",
- "^3CZ78B4... | 3 | 1 | 2 | 1766986328892 | 1\n",
- "^3HN31E1... | 4 | 3 | 4 | 1766986328892 | 1\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986328894 | -1\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986328894 | -1\n",
- "^3CZ78B4... | 3 | 1 | 2 | 1766986328894 | -1\n",
- "^3HN31E1... | 4 | 3 | 4 | 1766986328894 | -1\n",
- "^YYY4HAB... | 1 | 2 | 3 | 1766986328894 | 1\n",
- "^Z3QWT29... | 2 | 4 | 5 | 1766986328894 | 1\n",
- "^3CZ78B4... | 3 | 2 | 3 | 1766986328894 | 1\n",
- "^3HN31E1... | 4 | 2 | 3 | 1766986328894 | 1\n"
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073090774 | 1\n",
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073090774 | 1\n",
+ "^3CZ78B4... | 3 | 1 | 2 | 1767073090774 | 1\n",
+ "^3HN31E1... | 4 | 3 | 4 | 1767073090774 | 1\n",
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073090776 | -1\n",
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073090776 | -1\n",
+ "^3CZ78B4... | 3 | 1 | 2 | 1767073090776 | -1\n",
+ "^3HN31E1... | 4 | 3 | 4 | 1767073090776 | -1\n",
+ "^YYY4HAB... | 1 | 2 | 3 | 1767073090776 | 1\n",
+ "^Z3QWT29... | 2 | 4 | 5 | 1767073090776 | 1\n",
+ "^3CZ78B4... | 3 | 2 | 3 | 1767073090776 | 1\n",
+ "^3HN31E1... | 4 | 2 | 3 | 1767073090776 | 1\n"
  ]
  }
  ],
@@ -690,16 +690,16 @@
  "output_type": "stream",
  "text": [
  " | group | value | ret | __time__ | __diff__\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986329308 | 1\n",
- "^3HN31E1... | 4 | 3 | 4 | 1766986329308 | 1\n",
- "^Z3QWT29... | 2 | 1 | 2 | 1766986329408 | -1\n",
- "^3HN31E1... | 4 | 3 | 4 | 1766986329408 | -1\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986329508 | 1\n",
- "^3CZ78B4... | 3 | 1 | 2 | 1766986329508 | 1\n",
- "^YYY4HAB... | 1 | 5 | 6 | 1766986329510 | -1\n",
- "^3CZ78B4... | 3 | 1 | 2 | 1766986329510 | -1\n",
- "^YYY4HAB... | 1 | 2 | 3 | 1766986329510 | 1\n",
- "^3CZ78B4... | 3 | 2 | 3 | 1766986329510 | 1\n"
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073091190 | 1\n",
+ "^3HN31E1... | 4 | 3 | 4 | 1767073091190 | 1\n",
+ "^Z3QWT29... | 2 | 1 | 2 | 1767073091290 | -1\n",
+ "^3HN31E1... | 4 | 3 | 4 | 1767073091290 | -1\n",
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073091388 | 1\n",
+ "^3CZ78B4... | 3 | 1 | 2 | 1767073091388 | 1\n",
+ "^YYY4HAB... | 1 | 5 | 6 | 1767073091390 | -1\n",
+ "^3CZ78B4... | 3 | 1 | 2 | 1767073091390 | -1\n",
+ "^YYY4HAB... | 1 | 2 | 3 | 1767073091390 | 1\n",
+ "^3CZ78B4... | 3 | 2 | 3 | 1767073091390 | 1\n"
  ]
  }
  ],

examples/notebooks/tutorials/consistency.ipynb

Lines changed: 1 addition & 1 deletion
@@ -323,7 +323,7 @@
  "name": "stderr",
  "output_type": "stream",
  "text": [
- "INFO:pathway_engine.connectors.monitoring:subscribe-0: Done writing 0 entries, time 1766986342882. Current batch writes took: 0 ms. All writes so far took: 0 ms.\n"
+ "INFO:pathway_engine.connectors.monitoring:subscribe-0: Done writing 0 entries, time 1767073104640. Current batch writes took: 0 ms. All writes so far took: 0 ms.\n"
  ]
  },
  {

examples/notebooks/tutorials/declarative_vs_imperative.ipynb

Lines changed: 3 additions & 3 deletions
@@ -81,7 +81,7 @@
  "```\n",
  "we would expect three \"finished\" chunks: `(0,1,2)`, `(3,4,5,6)`, `(7,8)` and one unfinished chunk `(9,...)`.\n",
  "\n",
- "One way to do this would be imperative style: go through rows one-by-one in order storing current chunk in a state and emitting it whenever `flag` is equal to True, while clearing the state.\n",
+ "One way to do this would be imperative style: go through rows one-by-one in order storing current chunk in a state and emiting it whenever `flag` is equal to True, while clearing the state.\n",
  "Even though, its not recommended approach, let's see how to code it in Pathway."
  ]
  },
@@ -181,7 +181,7 @@
  "source": [
  "Instead of manually managing state and control flow, Pathway allows you to define such logic using declarative constructs like `sort`, `iterate`, `groupby`. The result is a clear and concise pipeline that emits chunks of event times splitting the flag, showcasing the power and readability of declarative data processing.\n",
  "\n",
- "In the following, we tell Pathway to propagate the starting time of each chunk across the rows. This is done by declaring a simple local rule: take the starting time of a chunk from previous row or use current event time. This rule is then iterated until fixed-point, so that the information is spread until all rows know the starting time of their chunk.\n",
+ "In the following, we tell Pathway to propagate the starting time of each chunk across the rows. This is done by declaring a simple local rule: take the starting time of a chunk from previous row or use current event time. This rule is then iterated until fixed-point, so that the information is spreaded until all rows know the starting time of their chunk.\n",
  "\n",
  "Then we can just group rows by starting time of the chunk to get a table of chunks."
  ]
@@ -389,4 +389,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 5
- }
+ }
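The declarative version described in the cells above boils down to grouping rows by each chunk's starting time. As a loose illustration only (the `events` table and its precomputed `chunk_start` column are hypothetical stand-ins for what the notebook derives via `iterate`):

import pathway as pw

# Hypothetical input where each row already carries the start time of its chunk;
# in the notebook this column is the fixed point of the propagation rule.
events = pw.debug.table_from_markdown(
    """
    t | chunk_start
    0 | 0
    1 | 0
    2 | 0
    3 | 3
    4 | 3
    """
)

# One output row per chunk, listing the event times that belong to it.
chunks = events.groupby(pw.this.chunk_start).reduce(
    chunk_start=pw.this.chunk_start,
    times=pw.reducers.sorted_tuple(pw.this.t),
)

pw.debug.compute_and_print(chunks)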

examples/notebooks/tutorials/json_type.ipynb

Lines changed: 5 additions & 5 deletions
@@ -130,7 +130,7 @@
  "text": [
  " | key | data\n",
  "^X1MXHYY... | 1 | {\"author\": {\"id\": 1, \"name\": \"Haruki Murakami\"}, \"books\": [{\"title\": \"Norwegian Wood\", \"year\": 1987}, {\"title\": \"Kafka on the Shore\", \"year\": 2002, \"category\": \"Literary Fiction\"}]}\n",
- "^YYY4HAB... | 2 | {\"author\": {\"id\": 2, \"name\": \"Stanis\\u0142aw Lem\"}, \"books\": [{\"title\": \"Solaris\", \"year\": 1961, \"category\": \"Science Fiction\"}, {\"title\": \"The Cyberiad\", \"year\": 1967, \"category\": \"Science Fiction\"}]}\n",
+ "^YYY4HAB... | 2 | {\"author\": {\"id\": 2, \"name\": \"Stanis\u0142aw Lem\"}, \"books\": [{\"title\": \"Solaris\", \"year\": 1961, \"category\": \"Science Fiction\"}, {\"title\": \"The Cyberiad\", \"year\": 1967, \"category\": \"Science Fiction\"}]}\n",
  "^Z3QWT29... | 3 | {\"author\": {\"id\": 3, \"name\": \"William Shakespeare\"}, \"books\": [{\"title\": \"Hamlet\", \"year\": 1603, \"category\": \"Tragedy\"}, {\"title\": \"Macbeth\", \"year\": 1623, \"category\": \"Tragedy\"}]}\n"
  ]
  }
@@ -190,7 +190,7 @@
  "text": [
  " | author | books\n",
  "^X1MXHYY... | \"Haruki Murakami\" | [{\"title\": \"Norwegian Wood\", \"year\": 1987}, {\"title\": \"Kafka on the Shore\", \"year\": 2002, \"category\": \"Literary Fiction\"}]\n",
- "^YYY4HAB... | \"Stanis\\u0142aw Lem\" | [{\"title\": \"Solaris\", \"year\": 1961, \"category\": \"Science Fiction\"}, {\"title\": \"The Cyberiad\", \"year\": 1967, \"category\": \"Science Fiction\"}]\n",
+ "^YYY4HAB... | \"Stanis\u0142aw Lem\" | [{\"title\": \"Solaris\", \"year\": 1961, \"category\": \"Science Fiction\"}, {\"title\": \"The Cyberiad\", \"year\": 1967, \"category\": \"Science Fiction\"}]\n",
  "^Z3QWT29... | \"William Shakespeare\" | [{\"title\": \"Hamlet\", \"year\": 1603, \"category\": \"Tragedy\"}, {\"title\": \"Macbeth\", \"year\": 1623, \"category\": \"Tragedy\"}]\n"
  ]
  }
@@ -222,7 +222,7 @@
  "text": [
  " | author | title | category\n",
  "^X1MXHYY... | \"Haruki Murakami\" | \"Norwegian Wood\" | \"Uncategorized\"\n",
- "^YYY4HAB... | \"Stanis\\u0142aw Lem\" | \"Solaris\" | \"Science Fiction\"\n",
+ "^YYY4HAB... | \"Stanis\u0142aw Lem\" | \"Solaris\" | \"Science Fiction\"\n",
  "^Z3QWT29... | \"William Shakespeare\" | \"Hamlet\" | \"Tragedy\"\n"
  ]
  }
@@ -292,8 +292,8 @@
  "^X1MQZF8... | {\"title\": \"Kafka on the Shore\", \"year\": 2002, \"category\": \"Literary Fiction\"} | \"Haruki Murakami\"\n",
  "^Z3QHRW2... | {\"title\": \"Macbeth\", \"year\": 1623, \"category\": \"Tragedy\"} | \"William Shakespeare\"\n",
  "^X1MGYPB... | {\"title\": \"Norwegian Wood\", \"year\": 1987} | \"Haruki Murakami\"\n",
- "^YYYA47A... | {\"title\": \"Solaris\", \"year\": 1961, \"category\": \"Science Fiction\"} | \"Stanis\\u0142aw Lem\"\n",
- "^YYY18MS... | {\"title\": \"The Cyberiad\", \"year\": 1967, \"category\": \"Science Fiction\"} | \"Stanis\\u0142aw Lem\"\n"
+ "^YYYA47A... | {\"title\": \"Solaris\", \"year\": 1961, \"category\": \"Science Fiction\"} | \"Stanis\u0142aw Lem\"\n",
+ "^YYY18MS... | {\"title\": \"The Cyberiad\", \"year\": 1967, \"category\": \"Science Fiction\"} | \"Stanis\u0142aw Lem\"\n"
  ]
  }
  ],

examples/notebooks/tutorials/rag-evaluations.ipynb

Lines changed: 2 additions & 2 deletions
@@ -1613,7 +1613,7 @@
  "\n",
  "Always structure your responses in the following format:\n",
  "Relevant contexts: [Write the relevant parts of the context for given question]\n",
- "Answer: [Detailed response to the user's question that is grounded by the facts you listed]\n",
+ "Answer: [Detailed reponse to the user's question that is grounded by the facts you listed]\n",
  "\n",
  "If you don't know the answer, just say that you don't know.\n",
  "\n",
@@ -1757,4 +1757,4 @@
  },
  "nbformat": 4,
  "nbformat_minor": 5
- }
+ }
