See [examples/pydantic_usage.py](examples/pydantic_usage.py) for more examples.
### Response Structure Templates for LLM Prompts
TOON provides a powerful feature to generate response structure templates that can be included in LLM prompts. This tells the model exactly what format to return data in, without needing to provide examples with actual data.
```python
from toon import generate_structure

# Define the expected response structure
schema = {
    "name": "name of the person",
    "age": "age of the person",
    "occupation": "job description of the person"
}

# Generate the structure template
structure = generate_structure(schema)
print(structure)
# Output:
# name: <name of the person>
# age: <age of the person>
# occupation: <job description of the person>

# Use in your LLM prompt
prompt = f"""Extract person information from the text and return it in this format:

{structure}
"""
```
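For intuition, the templating step can be sketched in plain Python. This is purely illustrative (the helper name and logic below are not the library's actual implementation): it renders each schema field as a `key: <description>` line, matching the template shape shown above.

```python
def sketch_generate_structure(schema: dict) -> str:
    # Render each field as "key: <description>", one per line,
    # mirroring the output format of the example above.
    return "\n".join(f"{key}: <{desc}>" for key, desc in schema.items())

template = sketch_generate_structure({
    "name": "name of the person",
    "age": "age of the person",
})
print(template)
# name: <name of the person>
# age: <age of the person>
```

The real `generate_structure` is the supported API; the sketch only shows why such templates are cheap to produce, since no example data is needed.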