README.md
This repository is a monorepo containing all Instructor's components (required and optional). It hosts all that you need to work with LLMs via Instructor.
Individual components are also distributed as standalone packages that can be used independently.
Links to read-only repositories of the standalone package distributions:
- [instructor-addons](https://github.com/cognesy/instructor-addons) - extra capabilities and solutions to common LLM-related problems
- [instructor-aux](https://github.com/cognesy/instructor-aux) - external tools and integrations, e.g. used by Instructor examples
- [instructor-http-client](https://github.com/cognesy/instructor-http-client) - easily switch between underlying HTTP client libraries (out-of-the-box support for Guzzle, Symfony, Laravel)
- [instructor-hub](https://github.com/cognesy/instructor-hub) - CLI tool for browsing and running Instructor examples
- [instructor-polyglot](https://github.com/cognesy/instructor-polyglot) - use a single API for inference and embeddings across most LLM providers, and switch between them easily (e.g., develop on Ollama, switch to Groq in production)
- [instructor-setup](https://github.com/cognesy/instructor-setup) - CLI tool for publishing Instructor config files in your app
- [instructor-struct](https://github.com/cognesy/instructor-struct) - get developer-friendly structured outputs from LLMs
- [instructor-tell](https://github.com/cognesy/instructor-tell) - CLI tool for executing LLM prompts in your terminal
- [instructor-templates](https://github.com/cognesy/instructor-templates) - text and chat template tools used by Instructor, supporting Twig, Blade and ArrowPipe formats
- [instructor-utils](https://github.com/cognesy/instructor-utils) - common utility classes used by Instructor packages
> NOTE: If you are just starting to use Instructor, I recommend using the `instructor-php` package. It contains all the required components and is the easiest way to get started with the library.
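Installation is done via Composer. The package names below are assumed to mirror the repository names listed above; verify the exact names on Packagist before installing:

```shell
# Install the full bundle (recommended for new users)
composer require cognesy/instructor-php

# Or install only a standalone component, e.g. the inference/embeddings layer
composer require cognesy/instructor-polyglot
```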
- Developer-friendly LLM context caching for reduced costs and faster inference (for Anthropic models)
- Developer-friendly data extraction from images (for OpenAI, Anthropic and Gemini models)
- Generate vector embeddings using APIs of multiple supported LLM providers
### Documentation and examples
- Learn more from the growing documentation and 100+ cookbooks