README.md: 4 additions & 4 deletions
@@ -22,11 +22,11 @@
---
-Pyper is a comprehensive framework for concurrent and parallel data-processing, based on functional programming patterns. Used for 🌐**Data Collection**, 🔀**ETL Systems**, and general-purpose 🛠️ **Python Scripting**
+Pyper is a flexible framework for concurrent and parallel data-processing, based on functional programming patterns. Used for 🔀**ETL Systems**, ⚙️**Data Microservices**, and 🌐 **Data Collection**
See the [Documentation](https://pyper-dev.github.io/pyper/)
-Key features:
+**Key features:**
* 💡**Intuitive API**: Easy to learn, easy to think about. Implements clean abstractions to seamlessly unify threaded, multiprocessed, and asynchronous work.
* 🚀 **Functional Paradigm**: Python functions are the building blocks of data pipelines. Lets you write clean, reusable code naturally (see the sketch below).
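
As a rough sketch of how those two features combine, the snippet below wraps an async function, a blocking function, and a CPU-bound function into a single pipeline. `pyper.task` and the `|` operator come from the text above; the helper functions and the keyword arguments `workers` and `multiprocess` are assumptions inferred from the feature descriptions, not verified library API.

```python
import asyncio
import time

from pyper import task

async def fetch(url: str) -> str:
    # Asynchronous, I/O-bound work written as an ordinary coroutine
    await asyncio.sleep(0.1)
    return f"response from {url}"

def parse(response: str) -> str:
    # Blocking work; `workers` is assumed to control a per-task thread pool
    time.sleep(0.1)
    return response.upper()

def summarize(text: str) -> int:
    # CPU-bound work; `multiprocess` is assumed to move it onto separate processes
    return len(text)

# One composition style regardless of how each function executes
pipeline = task(fetch) | task(parse, workers=4) | task(summarize, multiprocess=True)
```
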
@@ -99,7 +99,7 @@ if __name__ == "__main__":
    asyncio.run(main())
```
-Pyper provides an elegant abstraction of the execution of each function via `pyper.task`, allowing you to focus on building out the **logical** functions of your program. In the `main` function:
+Pyper provides an elegant abstraction of the execution of each task, allowing you to focus on building out the **logical** functions of your program. In the `main` function (see the sketch after this list):
* `pipeline` defines a function; this takes the parameters of its first task (`get_data`) and yields each output from its last task (`step3`)
* Tasks are piped together using the `|` operator (motivated by Unix's pipe operator) as a syntactic representation of passing inputs/outputs between tasks.
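
To make those two points concrete, here is a rough reconstruction of what the `main` example above might look like end to end. Only `get_data`, `step3`, `pyper.task`, the `|` operator, and the `asyncio.run(main())` entry point come from the text; the intermediate step bodies, the `branches` flag, and the `async for` consumption loop are illustrative assumptions rather than verified library API.

```python
import asyncio

from pyper import task

def get_data(limit: int):
    # First task: the pipeline takes this function's parameters
    for i in range(limit):
        yield i

async def step1(x: int) -> int:
    await asyncio.sleep(0.1)   # stand-in for async I/O
    return x + 1

def step2(x: int) -> int:
    return x * 2               # stand-in for blocking work

def step3(x: int) -> str:
    # Last task: the pipeline yields this function's outputs
    return f"result: {x}"

async def main():
    # `|` pipes each task's outputs into the next task's inputs
    pipeline = (
        task(get_data, branches=True)  # assumed flag for a task yielding many values
        | task(step1)
        | task(step2)
        | task(step3)
    )
    async for output in pipeline(limit=10):  # assumed consumption style for async pipelines
        print(output)

if __name__ == "__main__":
    asyncio.run(main())
```
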
@@ -114,7 +114,7 @@ In the pipeline, we are executing three different types of work:
`task` acts as one intuitive API for unifying the execution of each different type of function.
-Each task submits their outputs to the next task within the pipeline via queue-based data structures, which is the mechanism underpinning how concurrency and parallelism are achieved. See the [docs](https://pyper-dev.github.io/pyper/docs/UserGuide/BasicConcepts) for a breakdown of what a pipeline looks like under the hood.
+Each task has workers that submit outputs to the next task within the pipeline via queue-based data structures; this is the mechanism underpinning how concurrency and parallelism are achieved. See the [docs](https://pyper-dev.github.io/pyper/docs/UserGuide/BasicConcepts) for a breakdown of what a pipeline looks like under the hood.
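
As a way to picture that queue-based hand-off, here is a deliberately simplified, Pyper-free sketch of two worker stages passing values to each other through queues. It illustrates the general mechanism described above, not Pyper's actual internals.

```python
import queue
import threading

def run_stage(fn, inbox: queue.Queue, outbox: queue.Queue) -> None:
    # A worker pulls inputs from the previous stage's queue and pushes
    # results onto the next stage's queue until a shutdown sentinel arrives.
    while True:
        item = inbox.get()
        if item is None:         # sentinel: upstream is finished
            outbox.put(None)     # propagate shutdown downstream
            return
        outbox.put(fn(item))

q_source, q_mid, q_sink = queue.Queue(), queue.Queue(), queue.Queue()

threading.Thread(target=run_stage, args=(lambda x: x + 1, q_source, q_mid)).start()
threading.Thread(target=run_stage, args=(lambda x: x * 2, q_mid, q_sink)).start()

# Feed the first stage, then signal that no more inputs are coming.
for i in range(5):
    q_source.put(i)
q_source.put(None)

# Consume outputs until the shutdown sentinel reaches the end of the chain.
while (result := q_sink.get()) is not None:
    print(result)   # 2, 4, 6, 8, 10
```
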
docs/src/docs/UserGuide/ComposingPipelines.md: 7 additions & 1 deletion
@@ -89,7 +89,13 @@ if __name__ == "__main__":
```
{: .info}
-Pyper comes with fantastic IDE intellisense support which understands these operators, and will always show you which variables are `Pipeline` or `AsyncPipeline` objects; this also preserves type hints from your own functions, showing you the parameter and return type specs for each pipeline or consumer
+Pyper comes with fantastic intellisense support which understands these operators and preserves parameter/return type hints from user-defined functions