- **Adapter**: TanStack AI's interface to that provider
- **Model**: The specific model (GPT-4, Claude, etc.)

The old `providerOptions` were tied to the _model_, not the provider. Changing from `gpt-4` to `gpt-3.5-turbo` changes those options. So we renamed them:

```ts
chat({
  adapter: openaiText('gpt-4'),
  modelOptions: {
    text: {},
  },
})
```
Settings like `temperature` work across providers. Our other modalities already work this way:

```ts
generateImage({
  adapter,
  numberOfImages: 3,
})
```

So we brought chat in line:

```ts
chat({
  adapter: openaiText('gpt-4'),
  modelOptions: {
    text: {},
  },
  temperature: 0.6,
})
```

**Standard Schema support.** We're dropping the Zod constraint for tools and structured outputs. Bring your own schema validation library.
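
Here's a rough sketch of what that could look like with Valibot standing in for Zod. The `tools` option and its exact shape below are illustrative assumptions, not final API; the point is that any Standard Schema-compliant validator can fill the slot Zod used to occupy:

```ts
// Illustrative sketch only: the `tools` wiring below is a hypothetical shape,
// not confirmed TanStack AI API. The schema itself is real Valibot code.
import * as v from 'valibot'
import { chat } from '@tanstack/ai'
import { openaiText } from '@tanstack/ai-openai'

// A Standard Schema-compliant schema written with Valibot instead of Zod.
const weatherArgs = v.object({
  city: v.string(),
  unit: v.picklist(['celsius', 'fahrenheit']),
})

chat({
  adapter: openaiText('gpt-4'),
  // Hypothetical tool definition; the schema slot accepts any Standard Schema validator.
  tools: [
    {
      name: 'get_weather',
      description: 'Look up the current weather for a city',
      parameters: weatherArgs,
    },
  ],
})
```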
**On the roadmap:**

- Middleware
- Tool hardening
- Headless UI library for AI components

We're confident in this direction. We think you'll like it too.

---

_Curious how we got here? Read [The `ai()` Function That Almost Was](/blog/tanstack-ai-the-ai-function-postmortem)—a post-mortem on the API we loved, built, and had to kill._

---
To match the theme, we called it `aiOptions`. It would constrain everything to the adapter:

```ts
const opts = aiOptions({
  adapter: openaiText('gpt-4'),
})

ai(opts)
```
We used agents to do the implementation work. That hid the struggle from us.

If we'd been writing the code by hand, we would have _felt_ the challenge of wrestling with the types. That probably would have stopped the idea early.

LLMs won't bark when you tell them to do crazy stuff. They won't criticize your designs unless you ask them to. They just try. And try. And eventually produce something that technically works but shouldn't exist.

Before landing on separate functions, we tried one more thing: an adapter with sub-functions.

```ts
const adapter = openai()
adapter.image('model')
adapter.text('model')
```

Looks nicer. Feels more unified. Same problem—still bundles everything.
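
A simplified, hypothetical sketch of that single-factory shape shows why: every modality lives behind one returned object, and a bundler can't prove the unused methods are dead code.

```ts
// Hypothetical, stripped-down sketch of the single-factory adapter shape.
// In a real package each modality would be a large implementation; stubs stand in here.
function textImpl(model: string) {
  return `text completion via ${model}` // stands in for the whole chat pipeline
}

function imageImpl(model: string) {
  return `image generation via ${model}` // stands in for the whole image pipeline
}

function openai() {
  return {
    text: (model: string) => textImpl(model),
    image: (model: string) => imageImpl(model),
  }
}

// An app that only does text still ships imageImpl: the bundler can't prove
// that nothing ever reads `adapter.image` off the returned object.
const adapter = openai()
adapter.text('gpt-4')
```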
Separate functions. `chat()`, `generateImage()`, `generateSpeech()`, `generateTranscription()`.

```ts
import { chat } from '@tanstack/ai'
import { openaiText } from '@tanstack/ai-openai'

chat({
  adapter: openaiText('gpt-4'),
  temperature: 0.6,
})
```
We loved the `ai()` API. We built it. We had to kill it. That's how it goes sometimes.

---

_Ready to try what we shipped instead? Read [TanStack AI Alpha 2: Every Modality, Better APIs, Smaller Bundles](/blog/tanstack-ai-alpha-2)._