docs-build/polyglot/overview.mdx (33 additions, 34 deletions)
@@ -14,17 +14,30 @@ The core philosophy behind Polyglot is to create a consistent, provider-agnostic
 Polyglot was developed as part of the Instructor for PHP library, which focuses on structured outputs from LLMs, but can also be used as a standalone library for general LLM interactions.
 
+
+
 ## Key Features
 
-### Unified API
+### Unified LLM API
 
 Polyglot's primary feature is its unified API that works across multiple LLM providers:
 
-- Consistent interface for making inference requests
+- Consistent interface for making inference or embedding requests
 - Common message format across all providers
 - Standardized response handling
 - Unified error handling
 
+### Framework-Agnostic
+
+Polyglot is designed to work with any PHP framework or even in plain PHP applications. It does not depend on any specific framework, making it easy to integrate into existing projects.
+
+- Compatible with Laravel, Symfony, CodeIgniter, and others
+- Can be used in CLI scripts or web applications
+- Lightweight and easy to install
+
 ### Comprehensive Provider Support
 
 Polyglot supports a wide range of LLM providers, including:
@@ -79,7 +92,23 @@ The library is built with extensibility in mind:
 - Event system for request/response monitoring
 - Ability to add custom providers
 
-## Supported LLM Providers
+## Use Cases
+
+Polyglot is a good choice for a variety of use cases:
+
+- **Applications requiring LLM provider flexibility**: Switch between providers based on cost, performance, or feature needs
+- **Multi-environment deployments**: Use different LLM providers in development, staging, and production
+- **Redundancy and fallback**: Implement fallback strategies when a provider is unavailable
+- **Hybrid approaches**: Combine different providers for different tasks based on their strengths
+- **Local + cloud development**: Use local models (via Ollama) for development and cloud providers for production
+
+## Supported Providers
+
+### Inference Providers
 
 Polyglot currently supports the following LLM providers for chat completion:
@@ -102,7 +131,7 @@ Polyglot currently supports the following LLM providers for chat completion:
 - **Together**: Together AI hosted models
 - **xAI**: xAI's Grok models
 
-## Supported Embeddings Providers
+### Embeddings Providers
 
 For embeddings generation, Polyglot supports:
@@ -113,33 +142,3 @@ For embeddings generation, Polyglot supports:
 - **Mistral**: Mistral embedding models
 - **Ollama**: Self-hosted embedding models
 - **OpenAI**: OpenAI embeddings
-
-## Use Cases
-
-Polyglot is ideal for a variety of use cases:
-
-- **Applications requiring LLM provider flexibility**: Switch between providers based on cost, performance, or feature needs
-- **Multi-environment deployments**: Use different LLM providers in development, staging, and production
-- **Redundancy and fallback**: Implement fallback strategies when a provider is unavailable
-- **Hybrid approaches**: Combine different providers for different tasks based on their strengths
-- **Local + cloud development**: Use local models (via Ollama) for development and cloud providers for production
-
-## Getting Started
-
-To start using Polyglot, you'll need to:
-
-1. Install the library via Composer
-2. Configure your LLM provider credentials
-3. Create your first inference request
-
-Basic example:
-
-```php
-<?php
-use Cognesy\Polyglot\LLM\Inference;
-
-// Simple text generation with default provider
-$answer = Inference::text('What is the capital of France?');
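The `Inference::text()` snippet deleted above is the library's minimal usage example. A self-contained version of it might look as follows; the `Inference` class, namespace, and static call come from the snippet itself, while the autoloader `require` (assuming a standard Composer project layout) and the final `echo` are our additions:

```php
<?php
// Assumption: standard Composer project layout.
require __DIR__ . '/vendor/autoload.php';

use Cognesy\Polyglot\LLM\Inference;

// Simple text generation with default provider
$answer = Inference::text('What is the capital of France?');

// Printing the result is our addition; the exact text depends on the model.
echo $answer . PHP_EOL;
```

This requires the package installed and provider credentials configured, so the output is model- and configuration-dependent.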
docs-build/polyglot/setup.mdx (13 additions, 2 deletions)
@@ -5,22 +5,31 @@ description: 'Setup of Polyglot in your PHP project'
 This chapter will guide you through the initial steps of setting up and using Polyglot in your PHP project. We'll cover installation and configuration to get you up and running quickly.
 
+
+
 ## Installation
 
-Polyglot is currently distributed as part of the Instructor PHP package. You can install it using Composer:
+You can install it using Composer:
 
 ```bash
-composer require cognesy/instructor-php
+composer require cognesy/instructor-polyglot
 ```
 
 This will install Polyglot along with its dependencies.
 
+> NOTE: Polyglot is distributed as part of the Instructor PHP package, so if you have it installed, you don't need to install Polyglot separately.
+
 ## Requirements
 
 - PHP 8.2 or higher
 - Composer
 - Valid API keys for at least one supported LLM provider
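Once the requirements above are met, a short script can serve as a smoke test for the installation and credentials. This is a sketch, not part of the official docs: the environment-variable name `OPENAI_API_KEY` is a common provider convention assumed here, and only the `Inference::text()` call is taken from the documentation itself:

```php
<?php
// Assumption: standard Composer project layout.
require __DIR__ . '/vendor/autoload.php';

use Cognesy\Polyglot\LLM\Inference;

// Assumption: the default provider reads its API key from the environment,
// e.g. OPENAI_API_KEY for OpenAI. Adjust for your provider.
if (getenv('OPENAI_API_KEY') === false) {
    fwrite(STDERR, "Set your provider's API key (e.g. OPENAI_API_KEY) first.\n");
    exit(1);
}

// If this prints a city name, the installation and credentials work.
echo Inference::text('What is the capital of France?') . PHP_EOL;
```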