
Commit 6dc967c

Release version 0.12.7
1 parent 76c9fa3 commit 6dc967c

File tree

34 files changed: +421 −142 lines changed

docs-build/cookbook/instructor/advanced/custom_llm.mdx

Lines changed: 1 addition & 1 deletion

```diff
@@ -34,7 +34,7 @@ $config = new LLMConfig(
     model: 'deepseek-chat',
     maxTokens: 128,
     httpClient: 'guzzle',
-    providerType: LLMProviderType::OpenAICompatible,
+    providerType: LLMProviderType::OpenAICompatible->value,
 );

 // Get Instructor with the default client component overridden with your own
```
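As a hedged aside (not part of the commit): the change above passes the enum's backing string rather than the enum case, which suggests the `providerType` parameter now expects a plain `string`. In PHP, string-backed enum cases expose their underlying string via `->value`. A minimal self-contained sketch, where `ProviderType` and `makeConfig` are hypothetical stand-ins for `LLMProviderType` and `LLMConfig` (the backing value shown is an assumption):

```php
<?php
// Hypothetical stand-in for LLMProviderType (a string-backed enum).
enum ProviderType: string {
    case OpenAICompatible = 'openai-compatible'; // backing value is an assumption
}

// Hypothetical stand-in for a config whose providerType is declared as string.
function makeConfig(string $providerType): array {
    return ['providerType' => $providerType];
}

// Passing the enum case itself to a string-typed parameter raises a TypeError;
// `->value` yields the backing string and satisfies the declared type.
var_dump(makeConfig(ProviderType::OpenAICompatible->value));
```

This is why the docs switch from `LLMProviderType::OpenAICompatible` to `LLMProviderType::OpenAICompatible->value` in the `LLMConfig` constructor call.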

docs-build/cookbook/polyglot/llm_advanced/custom_llm.mdx

Lines changed: 1 addition & 1 deletion

```diff
@@ -28,7 +28,7 @@ $config = new LLMConfig(
     model: 'deepseek-chat',
     maxTokens: 128,
     httpClient: 'guzzle',
-    providerType: LLMProviderType::OpenAICompatible,
+    providerType: LLMProviderType::OpenAICompatible->value,
 );

 $answer = (new Inference)
```

docs-build/cookbook/polyglot/llm_extras/prompt_templates.mdx

Lines changed: 1 addition & 1 deletion

```diff
@@ -15,8 +15,8 @@ prompt templates using Twig, Blade or custom 'arrowpipe' template syntax.
 <?php
 require 'examples/boot.php';

-use Cognesy\Addons\Prompt\Template;
 use Cognesy\Polyglot\LLM\Inference;
+use Cognesy\Utils\Template\Template;
 use Cognesy\Utils\Str;

 // EXAMPLE 1: Define prompt template inline (don't use files) and use short syntax
```

docs-build/instructor/advanced/prompts.mdx

Lines changed: 2 additions & 3 deletions

````diff
@@ -116,7 +116,7 @@ To get started, you can create and render a simple prompt defined in the bundled

 ```php
 <?php
-use Cognesy\Addons\Prompt\Template;
+use Cognesy\Utils\Template\Template;

 // Basic example using "using->get->with" syntax
 $prompt = Template::using('demo-twig')->get('hello')->with(['name' => 'World']);
@@ -156,8 +156,7 @@ If you need to customize the configuration or set the template content directly,

 ```php
 <?php
-use Cognesy\Addons\Prompt\Data\TemplateEngineConfig;
-use Cognesy\Addons\Prompt\Enums\TemplateEngineType;
+use Cognesy\Utils\Template\Data\TemplateEngineConfig;use Cognesy\Utils\Template\Enums\TemplateEngineType;

 // Setting custom configuration
 $config = new TemplateEngineConfig(
````
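For user code, the upgrade implied by these diffs is just the import swap; the fluent `using->get->with` call chain shown in the docs is unchanged. A minimal sketch (the `demo-twig` template name and its `hello` template come from the docs example above; whether they exist depends on your installed prompt templates):

```php
<?php
// Old import, removed in this release:
// use Cognesy\Addons\Prompt\Template;

// New import after the move from src-addons to src-utils:
use Cognesy\Utils\Template\Template;

// Same fluent API as before the namespace move (example from the docs above):
$prompt = Template::using('demo-twig')->get('hello')->with(['name' => 'World']);
```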
Lines changed: 89 additions & 1 deletion

````diff
@@ -1 +1,89 @@
-# Work in progress
+---
+title: 'Quickstart'
+description: 'Start working with LLMs in under 5 minutes'
+---
+
+This guide will help you get started with Polyglot in your PHP project in under 5 minutes.
+
+For detailed setup instructions, see [Setup](setup).
+
+
+## Install Polyglot with Composer
+
+Run the following command in your terminal:
+
+```bash
+composer require cognesy/instructor-polyglot
+```
+
+
+## Create and Run an Example
+
+### Step 1: Prepare your OpenAI API Key
+
+In this example, we'll use OpenAI as the LLM provider. You can get an API key from the [OpenAI dashboard](https://platform.openai.com/).
+
+### Step 2: Create a New PHP File
+
+In your project directory, create a new PHP file `test-instructor.php`:
+
+```php
+<?php
+require __DIR__ . '/vendor/autoload.php';
+
+use Cognesy\Instructor\Instructor;
+
+// Set up OpenAI API key
+$apiKey = 'your-openai-api-key';
+putenv("OPENAI_API_KEY=" . $apiKey);
+// WARNING: In a real project you should set up the API key in a .env file.
+
+// Step 1: Define target data structure(s)
+class City {
+    public string $name;
+    public string $country;
+    public int $population;
+}
+
+// Step 2: Use Instructor to run LLM inference
+$city = (new Instructor)->withConnection('openai')->respond(
+    messages: 'What is the capital of France?',
+    responseModel: City::class,
+);
+
+var_dump($city);
+```
+
+<Warning>
+Never put your API keys directly in your project code, to avoid getting them compromised. Set them up in your .env file.
+</Warning>
+
+### Step 3: Run the Example
+
+Now you can run the example:
+
+```bash
+php test-instructor.php
+
+# Output:
+# object(City)#1 (3) {
+#   ["name"]=>
+#   string(5) "Paris"
+#   ["country"]=>
+#   string(6) "France"
+#   ["population"]=>
+#   int(2148000)
+# }
+```
+
+
+## Next Steps
+
+You can start using Instructor in your project right away after installation.
+
+It is recommended, though, to publish configuration files and prompt templates to your project directory, so you can
+customize the library's behavior and use your own prompt templates.
+
+You should also set up LLM provider API keys in your `.env` file instead of putting them directly in your code.
+
+See [Setup Instructions](setup) for more details.
````

docs-build/mint.json

Lines changed: 72 additions & 72 deletions

```diff
@@ -149,6 +149,29 @@
         "cookbook/contributing"
       ]
     },
+    {
+      "group": "API Providers Support",
+      "pages": [
+        "cookbook/instructor/api_support/a21",
+        "cookbook/instructor/api_support/anthropic",
+        "cookbook/instructor/api_support/azure_openai",
+        "cookbook/instructor/api_support/cerebras",
+        "cookbook/instructor/api_support/cohere",
+        "cookbook/instructor/api_support/deepseek",
+        "cookbook/instructor/api_support/fireworks",
+        "cookbook/instructor/api_support/google_gemini",
+        "cookbook/instructor/api_support/groq",
+        "cookbook/instructor/api_support/minimaxi",
+        "cookbook/instructor/api_support/mistralai",
+        "cookbook/instructor/api_support/moonshotai",
+        "cookbook/instructor/api_support/ollama",
+        "cookbook/instructor/api_support/openai",
+        "cookbook/instructor/api_support/openrouter",
+        "cookbook/instructor/api_support/sambanova",
+        "cookbook/instructor/api_support/togetherai",
+        "cookbook/instructor/api_support/xai"
+      ]
+    },
     {
       "group": "Basics",
       "pages": [
@@ -192,73 +215,6 @@
         "cookbook/instructor/troubleshooting/wiretap"
       ]
     },
-    {
-      "group": "API Providers Support",
-      "pages": [
-        "cookbook/instructor/api_support/a21",
-        "cookbook/instructor/api_support/anthropic",
-        "cookbook/instructor/api_support/azure_openai",
-        "cookbook/instructor/api_support/cerebras",
-        "cookbook/instructor/api_support/cohere",
-        "cookbook/instructor/api_support/deepseek",
-        "cookbook/instructor/api_support/fireworks",
-        "cookbook/instructor/api_support/google_gemini",
-        "cookbook/instructor/api_support/groq",
-        "cookbook/instructor/api_support/minimaxi",
-        "cookbook/instructor/api_support/mistralai",
-        "cookbook/instructor/api_support/moonshotai",
-        "cookbook/instructor/api_support/ollama",
-        "cookbook/instructor/api_support/openai",
-        "cookbook/instructor/api_support/openrouter",
-        "cookbook/instructor/api_support/sambanova",
-        "cookbook/instructor/api_support/togetherai",
-        "cookbook/instructor/api_support/xai"
-      ]
-    },
-    {
-      "group": "Extras",
-      "pages": [
-        "cookbook/instructor/extras/complex_extraction",
-        "cookbook/instructor/extras/complex_extraction_claude",
-        "cookbook/instructor/extras/complex_extraction_cohere",
-        "cookbook/instructor/extras/complex_extraction_gemini",
-        "cookbook/instructor/extras/image_car_damage",
-        "cookbook/instructor/extras/image_to_data",
-        "cookbook/instructor/extras/image_to_data_anthropic",
-        "cookbook/instructor/extras/image_to_data_gemini",
-        "cookbook/instructor/extras/schema",
-        "cookbook/instructor/extras/schema_dynamic",
-        "cookbook/instructor/extras/transcription_to_tasks",
-        "cookbook/instructor/extras/translate_ui_fields",
-        "cookbook/instructor/extras/web_to_objects"
-      ]
-    },
-    {
-      "group": "LLM Basics",
-      "pages": [
-        "cookbook/polyglot/llm_basics/llm",
-        "cookbook/polyglot/llm_basics/llm_json",
-        "cookbook/polyglot/llm_basics/llm_json_schema",
-        "cookbook/polyglot/llm_basics/llm_md_json",
-        "cookbook/polyglot/llm_basics/llm_tools"
-      ]
-    },
-    {
-      "group": "LLM Advanced",
-      "pages": [
-        "cookbook/polyglot/llm_advanced/context_cache_llm",
-        "cookbook/polyglot/llm_advanced/custom_llm",
-        "cookbook/polyglot/llm_advanced/embeddings",
-        "cookbook/polyglot/llm_advanced/parallel_calls",
-        "cookbook/polyglot/llm_advanced/reasoning_content"
-      ]
-    },
-    {
-      "group": "LLM Troubleshooting",
-      "pages": [
-        "cookbook/polyglot/llm_troubleshooting/http_debug"
-      ]
-    },
     {
       "group": "LLM API Support",
       "pages": [
@@ -283,12 +239,21 @@
       ]
     },
     {
-      "group": "LLM Extras",
+      "group": "Extras",
       "pages": [
-        "cookbook/polyglot/llm_extras/chat_with_summary",
-        "cookbook/polyglot/llm_extras/prompt_templates",
-        "cookbook/polyglot/llm_extras/summary_with_llm",
-        "cookbook/polyglot/llm_extras/tool_use"
+        "cookbook/instructor/extras/complex_extraction",
+        "cookbook/instructor/extras/complex_extraction_claude",
+        "cookbook/instructor/extras/complex_extraction_cohere",
+        "cookbook/instructor/extras/complex_extraction_gemini",
+        "cookbook/instructor/extras/image_car_damage",
+        "cookbook/instructor/extras/image_to_data",
+        "cookbook/instructor/extras/image_to_data_anthropic",
+        "cookbook/instructor/extras/image_to_data_gemini",
+        "cookbook/instructor/extras/schema",
+        "cookbook/instructor/extras/schema_dynamic",
+        "cookbook/instructor/extras/transcription_to_tasks",
+        "cookbook/instructor/extras/translate_ui_fields",
+        "cookbook/instructor/extras/web_to_objects"
       ]
     },
     {
@@ -336,6 +301,41 @@
         "cookbook/prompting/misc/component_reuse",
         "cookbook/prompting/misc/component_reuse_cot"
       ]
+    },
+    {
+      "group": "LLM Basics",
+      "pages": [
+        "cookbook/polyglot/llm_basics/llm",
+        "cookbook/polyglot/llm_basics/llm_json",
+        "cookbook/polyglot/llm_basics/llm_json_schema",
+        "cookbook/polyglot/llm_basics/llm_md_json",
+        "cookbook/polyglot/llm_basics/llm_tools"
+      ]
+    },
+    {
+      "group": "LLM Advanced",
+      "pages": [
+        "cookbook/polyglot/llm_advanced/context_cache_llm",
+        "cookbook/polyglot/llm_advanced/custom_llm",
+        "cookbook/polyglot/llm_advanced/embeddings",
+        "cookbook/polyglot/llm_advanced/parallel_calls",
+        "cookbook/polyglot/llm_advanced/reasoning_content"
+      ]
+    },
+    {
+      "group": "LLM Troubleshooting",
+      "pages": [
+        "cookbook/polyglot/llm_troubleshooting/http_debug"
+      ]
+    },
+    {
+      "group": "LLM Extras",
+      "pages": [
+        "cookbook/polyglot/llm_extras/chat_with_summary",
+        "cookbook/polyglot/llm_extras/prompt_templates",
+        "cookbook/polyglot/llm_extras/summary_with_llm",
+        "cookbook/polyglot/llm_extras/tool_use"
+      ]
     }
   ],
   "footerSocials": {
```
Lines changed: 1 addition & 0 deletions

```diff
@@ -0,0 +1 @@
+- Moved
```

docs/instructor/advanced/prompts.mdx

Lines changed: 2 additions & 3 deletions

````diff
@@ -116,7 +116,7 @@ To get started, you can create and render a simple prompt defined in the bundled

 ```php
 <?php
-use Cognesy\Addons\Prompt\Template;
+use Cognesy\Utils\Template\Template;

 // Basic example using "using->get->with" syntax
 $prompt = Template::using('demo-twig')->get('hello')->with(['name' => 'World']);
@@ -156,8 +156,7 @@ If you need to customize the configuration or set the template content directly,

 ```php
 <?php
-use Cognesy\Addons\Prompt\Data\TemplateEngineConfig;
-use Cognesy\Addons\Prompt\Enums\TemplateEngineType;
+use Cognesy\Utils\Template\Data\TemplateEngineConfig;use Cognesy\Utils\Template\Enums\TemplateEngineType;

 // Setting custom configuration
 $config = new TemplateEngineConfig(
````

docs/release-notes/v0.12.7.md

Lines changed: 2 additions & 0 deletions

```diff
@@ -0,0 +1,2 @@
+- Moved Cognesy/Addons/Prompt to Cognesy/Utils/Template to clarify purpose and remove cyclic dependency between src-addons and src-utils
+- Added Anthropic thinking traces support in LLM drivers (thinking data is now available directly in LLMResponse and LLMPartialResponse object properties)
```

examples/C05_LLMExtras/PromptTemplate/run.php

Lines changed: 1 addition & 1 deletion

```diff
@@ -15,8 +15,8 @@
 <?php
 require 'examples/boot.php';

-use Cognesy\Addons\Prompt\Template;
 use Cognesy\Polyglot\LLM\Inference;
+use Cognesy\Utils\Template\Template;
 use Cognesy\Utils\Str;

 // EXAMPLE 1: Define prompt template inline (don't use files) and use short syntax
```
