@@ -14,11 +14,11 @@ The following examples are for reference only. Prefer docs for the latest inform
## Features

- - 🚀 **Simple and intuitive API** - Get started in minutes
- - 🔄 **Streaming support** - Real-time text generation with typed events
- - 🛠️ **Type safety** - Full type hints for better IDE support
- - 📦 **Minimal dependencies** - Only what you need
- - 🐍 **Python 3.7+** - Support for modern Python versions
+ - **Simple and intuitive API** - Get started in minutes
+ - **Streaming support** - Real-time text generation with typed events
+ - **Type safety** - Full type hints for better IDE support
+ - **Minimal dependencies** - Only what you need
+ - **Python 3.7+** - Support for modern Python versions

## Installation
@@ -53,14 +53,14 @@ langbase_api_key = os.getenv("LANGBASE_API_KEY")
llm_api_key = os.getenv("LLM_API_KEY")

# Initialize the client
- lb = Langbase(api_key=langbase_api_key)
+ langbase = Langbase(api_key=langbase_api_key)
```

### 3. Generate text

```python
# Simple generation
- response = lb.agent.run(
+ response = langbase.agent.run(
    input=[{"role": "user", "content": "Tell me about AI"}],
    model="openai:gpt-4.1-mini",
    api_key=llm_api_key,
@@ -90,7 +90,7 @@ for content in runner.text_generator():
    print(content, end="", flush=True)
```

- ### 5. Stream with typed events (Advanced) 🆕
+ ### 5. Stream with typed events (Advanced)

```python
from langbase import StreamEventType, get_typed_runner
@@ -144,51 +144,51 @@ runner.process()
## Core Features

- ### 🔄 Pipes - AI Pipeline Execution
+ ### Pipes - AI Pipeline Execution

```python
# List all pipes
- pipes = lb.pipes.list()
+ pipes = langbase.pipes.list()

# Run a pipe
- response = lb.pipes.run(
+ response = langbase.pipes.run(
    name="ai-agent",
    messages=[{"role": "user", "content": "Hello!"}],
    variables={"style": "friendly"},  # Optional variables
    stream=True,  # Enable streaming
)
```

- ### 🧠 Memory - Persistent Context Storage
+ ### Memory - Persistent Context Storage

```python
# Create a memory
- memory = lb.memories.create(
+ memory = langbase.memories.create(
    name="product-docs",
    description="Product documentation",
)

# Upload documents
- lb.memories.documents.upload(
+ langbase.memories.documents.upload(
    memory_name="product-docs",
    document_name="guide.pdf",
    document=open("guide.pdf", "rb"),
    content_type="application/pdf",
)

# Retrieve relevant context
- results = lb.memories.retrieve(
+ results = langbase.memories.retrieve(
    query="How do I get started?",
    memory=[{"name": "product-docs"}],
    top_k=3,
)
```
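
The retrieved chunks can feed straight back into an agent call for retrieval-augmented answers. Below is a minimal sketch chaining the two snippets above; it assumes each item returned by `memories.retrieve` exposes a `text` field and reuses the `langbase` client and `llm_api_key` from the quick start, so check the docs for the exact response shape.

```python
# Sketch: retrieval-augmented generation built from the calls shown above.
# Assumption: each retrieved result is a dict with a "text" field.
context = "\n\n".join(result["text"] for result in results)

response = langbase.agent.run(
    input=[
        {"role": "system", "content": f"Answer using only this context:\n{context}"},
        {"role": "user", "content": "How do I get started?"},
    ],
    model="openai:gpt-4.1-mini",
    api_key=llm_api_key,
)
```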

- ### 🤖 Agent - LLM Agent Execution
+ ### Agent - LLM Agent Execution

```python
# Run an agent with tools
- response = lb.agent.run(
+ response = langbase.agent.run(
    model="openai:gpt-4",
    messages=[{"role": "user", "content": "Search for AI news"}],
    tools=[{"type": "function", "function": {...}}],
@@ -198,24 +198,24 @@ response = lb.agent.run(
)
```
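
The `{...}` placeholder above is where the function specification goes. It follows the familiar `{"type": "function", "function": {...}}` shape; the definition below is a hypothetical example (the `get_ai_news` name and its parameters are made up for illustration, so consult the agent docs for the exact fields Langbase accepts):

```python
# Hypothetical tool definition, for illustration only.
news_tool = {
    "type": "function",
    "function": {
        "name": "get_ai_news",  # made-up tool name
        "description": "Search recent AI news articles",
        "parameters": {
            "type": "object",
            "properties": {
                "query": {"type": "string", "description": "Search terms"},
            },
            "required": ["query"],
        },
    },
}
```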

- ### 🔧 Tools - Built-in Utilities
+ ### Tools - Built-in Utilities

```python
# Chunk text for processing
- chunks = lb.chunker(
+ chunks = langbase.chunker(
    content="Long text to split...",
    chunk_max_length=1024,
    chunk_overlap=256,
)

# Generate embeddings
- embeddings = lb.embed(
+ embeddings = langbase.embed(
    chunks=["Text 1", "Text 2"],
    embedding_model="openai:text-embedding-3-small",
)
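
# The two utilities compose: the chunker output can be passed straight to embed.
# Sketch only -- this assumes chunker returns a list of text chunks that embed's
# `chunks` parameter accepts; see the docs for the exact return type.
chunk_embeddings = langbase.embed(
    chunks=chunks,
    embedding_model="openai:text-embedding-3-small",
)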

# Parse documents
- content = lb.parser(
+ content = langbase.parser(
    document=open("document.pdf", "rb"),
    document_name="document.pdf",
    content_type="application/pdf",
@@ -226,10 +226,9 @@ content = lb.parser(
Explore the [examples](./examples) directory for complete working examples:

- - [Generate text](./examples/pipes/pipes.run.py)
- - [Stream text with events](./examples/pipes/pipes.run.typed-stream.py)
- - [Work with memory](./examples/memory/)
- [Agent with tools](./examples/agent/)
+ - [Work with memory](./examples/memory/)
+ - [Generate text](./examples/pipes/pipes.run.py)
- [Document processing](./examples/parser/)
- [Workflow automation](./examples/workflow/)
@@ -243,9 +242,9 @@ We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) f
## Support

- - 📚 [Documentation](https://langbase.com/docs)
- - 💬 [Discord Community](https://langbase.com/discord)
- - 🐛 [Issue Tracker](https://github.com/LangbaseInc/langbase-python-sdk/issues)
+ - [Documentation](https://langbase.com/docs)
+ - [Discord Community](https://langbase.com/discord)
+ - [Issue Tracker](https://github.com/LangbaseInc/langbase-python-sdk/issues)
## License