Instructor is a Gleam library for structured prompting with Large Language Models. It converts LLM text outputs into validated data structures, enabling seamless integration between AI and traditional Gleam applications.
- Structured Prompting: Define response schemas and get validated structured data from LLMs
- Multiple LLM Providers: Support for OpenAI, Anthropic, Gemini, Groq, Ollama, and more
- Validation & Retry Logic: Automatic retry with error feedback when responses don't match schemas
- Streaming Support: Handle partial and array streaming responses
- Type Safe: Full Gleam type safety for LLM interactions
```gleam
import gleam/io
import gleam/option.{None}
import gleam/string
import instructor
import instructor/types

// Create configuration
let config = instructor.default_config()

// Create a simple response model
let response_model =
  instructor.string_response_model(
    "Extract the sentiment as positive, negative, or neutral",
  )

// Make a chat completion
let messages = [instructor.user_message("I love Gleam programming!")]

case instructor.chat_completion(
  config,
  response_model,
  messages,
  None, // model (uses default)
  None, // temperature
  None, // max_tokens
  None, // mode
  None, // max_retries
  None, // validation_context
) {
  types.Success(result) -> io.println("Sentiment: " <> result)
  types.ValidationError(errors) ->
    io.println("Validation failed: " <> string.join(errors, ", "))
  types.AdapterError(error) -> io.println("API error: " <> error)
}
```
Response models define the structure and validation for LLM outputs:
```gleam
// Simple string response
let string_model = instructor.string_response_model("Description of the field")

// Integer response
let int_model = instructor.int_response_model("A number between 1 and 10")

// Boolean response
let bool_model = instructor.bool_response_model("True if positive sentiment")
```
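A response model is then paired with a chat completion call. The following is a minimal sketch reusing the `chat_completion` signature from the quick start, and assuming the integer model yields an `Int` on success:

```gleam
import gleam/int
import gleam/io
import gleam/option.{None}
import instructor
import instructor/types

let config = instructor.default_config()
let rating_model = instructor.int_response_model("A rating between 1 and 10")
let messages = [instructor.user_message("Rate this review: 'Great product!'")]

case instructor.chat_completion(
  config,
  rating_model,
  messages,
  None, // model (uses default)
  None, // temperature
  None, // max_tokens
  None, // mode
  None, // max_retries
  None, // validation_context
) {
  types.Success(rating) -> io.println("Rating: " <> int.to_string(rating))
  types.ValidationError(_errors) -> io.println("Could not parse a rating")
  types.AdapterError(error) -> io.println("API error: " <> error)
}
```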
Create messages for a conversation:
```gleam
let messages = [
  instructor.system_message("You are a helpful assistant."),
  instructor.user_message("What is the capital of France?"),
]
```
Different modes for LLM interaction:
- `Tools`: OpenAI function calling (most reliable)
- `Json`: JSON mode
- `JsonSchema`: Structured outputs with schema
- `MdJson`: JSON in markdown code blocks
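A mode is selected by passing it as the `mode` argument to `chat_completion`. A hedged sketch, assuming the mode constructors (`Tools`, `Json`, `JsonSchema`, `MdJson`) are defined in `instructor/types`:

```gleam
import gleam/option.{None, Some}
import instructor
import instructor/types

let result = instructor.chat_completion(
  instructor.default_config(),
  instructor.string_response_model("A short summary"),
  [instructor.user_message("Summarize: Gleam is a type-safe language.")],
  None,              // model (uses default)
  None,              // temperature
  None,              // max_tokens
  Some(types.Tools), // mode: use OpenAI function calling
  None,              // max_retries
  None,              // validation_context
)
```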
Configure adapters in your application:
```gleam
import instructor
import instructor/types

let config = instructor.InstructorConfig(
  adapter: openai_adapter(),
  default_model: "gpt-4o-mini",
  default_max_retries: 3,
)
```
This is a port of the Elixir Instructor library to Gleam. The current implementation includes:
- ✅ Core types and data structures
- ✅ JSON schema generation
- ✅ Validation using `gleam/dynamic/decode`
- ✅ Adapter pattern for multiple LLMs
- ✅ OpenAI, Anthropic, Gemini, and Ollama adapters
- ✅ HTTP client implementation
- ✅ Basic test suite
- 🚧 Streaming support (basic implementation, needs more testing and features)
Add to your `gleam.toml`:
```toml
[dependencies]
gleam_stdlib = "~> 0.34"
gleam_http = "~> 4.1"
gleam_httpc = "~> 5.0"
gleam_json = "~> 3.0"
```
MIT License - see LICENSE for details.
Contributions welcome! Please see CONTRIBUTING.md for guidelines.