Flow in LazyLLM defines the data stream, describing how data is passed from one component to the next:

1. You can easily combine, add, and replace various modules and components; the design of Flow makes adding new features simple and facilitates collaboration between different modules and even projects.
2. Through a standardized interface and data flow mechanism, Flow reduces the repetitive work developers face when handling data transfer and transformation. Developers can focus more on core business logic, thus improving overall development efficiency.
3. Some Flows support asynchronous processing and parallel execution, significantly enhancing response speed and system performance when dealing with large-scale data or complex tasks.
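
The benefits above can be illustrated with a minimal pure-Python sketch of the flow idea — this is not LazyLLM's actual API; the `Pipeline` class and the stage functions are hypothetical stand-ins for modules connected by a standardized interface:

```python
# Minimal sketch of a flow: each stage is a callable, and the pipeline
# passes one stage's output to the next through a uniform interface.
class Pipeline:
    def __init__(self, *stages):
        self.stages = stages

    def __call__(self, data):
        for stage in self.stages:
            data = stage(data)
        return data

# Hypothetical stages; because each one only sees its input and returns
# its output, any stage can be swapped or replaced without touching the rest.
def retrieve(query):
    return f"docs for '{query}'"

def generate(context):
    return f"answer based on [{context}]"

flow = Pipeline(retrieve, generate)
result = flow("what is Flow?")
print(result)  # prints: answer based on [docs for 'what is Flow?']
```

Because the composition logic lives in one place, adding a stage (say, a reranker between `retrieve` and `generate`) is a one-line change — which is the combinability benefit described in point 1.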

## Future Plans

### Timeline

- V0.6: expected to start on September 1st and run for 3 months, with minor releases (e.g., v0.6.1, v0.6.2) along the way
- V0.7: expected to start on December 1st and run for 3 months, with minor releases (e.g., v0.7.1, v0.7.2) along the way

### Feature Modules

#### RAG

- Engineering
  - Integrate LazyRAG capabilities into LazyLLM (V0.6)
  - Allow ServerModule to be published as an MCP service (V0.7)
  - Integrate online sandbox services (V0.7)

#### Model Training and Inference

- Support OpenAI-compatible interface deployment and inference (V0.6)
- Unify fine-tuning and inference prompts (V0.7)
- Provide fine-tuning examples in Examples (V0.7)
- Integrate 2-3 prompt repositories, allowing prompts to be selected directly from them (V0.6)
- Smarter model-type detection and inference-framework selection; refactor and simplify the auto-finetune framework-selection logic (V0.6)
- Full-chain GRPO support (V0.7)
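
"OpenAI interface deployment" conventionally means serving a model behind the OpenAI chat-completions wire format, so existing OpenAI clients can talk to it unchanged. The sketch below builds such a request body; the model name and endpoint path are placeholders, not LazyLLM defaults, and no network call is made:

```python
import json

# A chat-completions request in the OpenAI wire format. An
# OpenAI-compatible server would accept this body at a path like
# /v1/chat/completions (path shown for illustration only).
payload = {
    "model": "my-finetuned-model",  # hypothetical served model name
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Hello!"},
    ],
    "temperature": 0.7,
}

body = json.dumps(payload)
print(body)
```

The value of this compatibility is client reuse: the same `messages` list and response shape work whether the backend is a hosted model or a locally deployed fine-tune.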

#### Documentation

- Complete the API documentation: every public interface documented, documentation parameters consistent with the function parameters, and executable sample code (V0.6)
- Complete the CookBook documentation: grow the case count to 50, with comparisons against LangChain/LlamaIndex (code volume, speed, extensibility) (V0.6)
- Complete the Learn documentation: first how to use large models, then how to build agents, then how to use workflows, and finally how to build RAG (V0.6)

#### Quality

- Reduce CI time to under 10 minutes by mocking most modules (V0.6)
- Add daily builds and move time- and token-intensive tasks into them (V0.6)
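
The mocking idea is standard `unittest.mock` practice: in CI, the expensive module (typically a real model call) is replaced by a stub so only the surrounding logic is exercised. A minimal sketch with hypothetical names — `ChatModule` and `answer` are stand-ins, not LazyLLM classes:

```python
from unittest import mock

# Hypothetical slow component: calling a real LLM endpoint is what makes
# CI slow and token-hungry.
class ChatModule:
    def __call__(self, prompt):
        raise RuntimeError("would call a real model endpoint")

# Pipeline logic under test, written against the callable interface.
def answer(module, question):
    return module(f"Q: {question}")

# In CI, substitute a mock so the logic runs instantly and offline.
fake = mock.MagicMock(return_value="mocked reply")
result = answer(fake, "ping")

assert result == "mocked reply"
fake.assert_called_once_with("Q: ping")
```

The real end-to-end calls then move to the daily build, where wall-clock time and token spend matter less.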

#### Development, Deployment and Release

- Debugging optimizations (V0.7)
- Process monitoring [output + performance] (V0.7)
- Environment isolation and automatic environment setup for dependent training and inference frameworks (V0.6)

#### Ecosystem

- Promote the open-source release of LazyCraft (V0.6)
- Promote the open-source release of LazyRAG (V0.7)
- Publish the code to 2 code-hosting sites other than GitHub and pursue community collaboration (V0.6)