You can use it as a middleware to intercept requests, or to provide a custom fetch implementation, e.g. for testing.
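The section above mentions supplying a custom fetch implementation for testing. As a minimal sketch of what such an implementation can look like (the `withRequestLog` helper, the stub, and the URL are illustrative, not part of the library):

```typescript
type FetchLike = (input: string | URL | Request, init?: RequestInit) => Promise<Response>

// Wrap a fetch implementation so each outgoing request is recorded — handy in tests.
function withRequestLog(base: FetchLike, log: string[]): FetchLike {
  return async (input, init) => {
    log.push(`${init?.method ?? 'GET'} ${input.toString()}`)
    return base(input, init)
  }
}

// Stub base fetch so the demo is self-contained (no network access).
const stub: FetchLike = async () => new Response('{"ok":true}', {status: 200})

const log: string[] = []
const testFetch = withRequestLog(stub, log)
const res = await testFetch('https://example.test/inference', {method: 'POST'})
console.log(log[0], res.status)
```

The same wrapper shape works for interception generally: anything that matches the fetch signature can log, rewrite, or short-circuit requests before delegating to the real implementation.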
## Prompt Management

The `@github/models/prompt` sub-module provides a powerful way to integrate with AI SDK methods like `generateText` and `generateObject` using your `prompt.yml` files.

Prompt YAML files are designed to create reusable artifacts that integrate with GitHub's suite of AI tools. Check out the [Models tab](https://github.com/github/models-ai-sdk/models).

### Example

Create a `.prompt.yml` file:

```yaml
name: teacher
description: An elementary school teacher who explains concepts simply
model: openai/gpt-4o
modelParameters:
  temperature: 0.7
  maxTokens: 500
messages:
  - role: system
    content: You're an elementary school teacher who loves making learning fun.
  - role: user
    content: Explain {{subject}} in exactly {{sentences}} sentences for a 10-year-old.
```
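The `{{subject}}` and `{{sentences}}` placeholders in the messages above are filled in with values at run time. As a minimal sketch of that kind of template substitution (the `fillTemplate` helper is illustrative, not the library's API):

```typescript
// Substitute {{name}} placeholders, as seen in .prompt.yml message content.
function fillTemplate(template: string, vars: Record<string, string | number>): string {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    key in vars ? String(vars[key]) : match, // leave unknown placeholders untouched
  )
}

const message = 'Explain {{subject}} in exactly {{sentences}} sentences for a 10-year-old.'
console.log(fillTemplate(message, {subject: 'photosynthesis', sentences: 3}))
// → Explain photosynthesis in exactly 3 sentences for a 10-year-old.
```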

Use `createTextPrompt` for text-based responses:

```ts
import {readFile} from 'node:fs/promises'
import {parseYAML} from 'confbox/yaml'
import {createTextPrompt} from '@github/models/prompt'
```
The GitHub Models provider uses the [GitHub Models Inference API](https://docs.github.com/en/rest/models/inference?apiVersion=2022-11-28#run-an-inference-request).
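That API speaks an OpenAI-style chat-completions format, so a `.prompt.yml` file maps naturally onto a request payload. As a sketch of that mapping, with placeholders already substituted (field names follow the linked REST docs; this builds the body but does not send it):

```typescript
// Request body corresponding to the teacher prompt above (assumed field mapping:
// model → model, modelParameters.temperature → temperature, maxTokens → max_tokens).
const body = {
  model: 'openai/gpt-4o',
  temperature: 0.7,
  max_tokens: 500,
  messages: [
    {role: 'system', content: "You're an elementary school teacher who loves making learning fun."},
    {role: 'user', content: 'Explain photosynthesis in exactly 3 sentences for a 10-year-old.'},
  ],
}
console.log(JSON.stringify(body, null, 2))
```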