dPrompts enables teams to run distributed, bulk LLM operations locally using Ollama; this is cost-effective and works on most laptops with an integrated GPU.
Run the installer script:

```shell
curl -fsSL https://raw.githubusercontent.com/HexmosTech/dPrompts/main/install.sh | bash
```

This will:

- Download and install the latest `dpr` binary to `/usr/local/bin`
- Copy `.dprompts.toml` to your home directory (if present in the current directory)
- Check/install Ollama and the required model
- Start the Ollama server if not already running
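Once the script completes, a quick sanity check (a sketch, not part of the installer) confirms the `dpr` binary landed on your `PATH`:

```shell
# Sanity check after installation: is the dpr binary reachable?
# A sketch; assumes the installer placed dpr in /usr/local/bin as described above.
if command -v dpr >/dev/null 2>&1; then
  echo "dpr installed at: $(command -v dpr)"
else
  echo "dpr not found on PATH; re-run the installer" >&2
fi
```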
Configuration:

- Place your configuration file as `.dprompts.toml` in your home directory (`$HOME/.dprompts.toml`).
Run the worker with `make worker` or `dpr worker`. Enqueue a job with `make client`, or manually:

```shell
dpr client --args='{"prompt":"Why is the sky blue?"}' --metadata='{"type":"manpage","category":"science"}'
```

To enqueue multiple jobs at once from a JSON file:

```shell
dpr client --bulk-from-file=queue_items.json
```

Each job in the JSON file should follow this structure:
```jsonc
[
  {
    "base_prompt": "<common prompt shared by all subtasks>", // optional
    "sub_tasks": [
      {
        "prompt": "<subtask-specific prompt>",
        "schema": { /* schema for this subtask */ }, // optional
        "metadata": {
          "subtask_name": "<subtask identifier>"
        }
      },
      {
        "prompt": "<another subtask prompt>",
        "schema": { /* schema */ },
        "metadata": {
          "subtask_name": "<subtask identifier>"
        }
      }
      /* more subtasks... */
    ]
  },
  {
    "base_prompt": "<common prompt for another job>", // optional
    "sub_tasks": [
      {
        "prompt": "<subtask prompt>",
        "schema": { /* schema */ }
        /* metadata can be omitted */
      }
      /* more subtasks... */
    ]
  }
]
```
- `base_prompt` (optional): A prompt shared by all subtasks in the job. It provides common context that improves caching and execution speed when running multiple related subtasks together.
- `sub_tasks`: A list of subtasks. Each subtask can include:
  - `prompt`: a prompt specific to this subtask
  - `schema` (optional): a schema defining the expected output for this subtask
  - `metadata` (optional): extra information such as a group name or subtask identifier
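As a concrete illustration of the structure above, the snippet below writes a minimal two-subtask job file and enqueues it. The prompts and `subtask_name` values are invented for the example; only the field names come from the documented format:

```shell
# Write a minimal queue_items.json matching the documented job structure.
# The prompts and subtask_name values are illustrative only.
cat > queue_items.json <<'EOF'
[
  {
    "base_prompt": "You are a concise science explainer.",
    "sub_tasks": [
      { "prompt": "Why is the sky blue?", "metadata": { "subtask_name": "sky-color" } },
      { "prompt": "Why are sunsets red?", "metadata": { "subtask_name": "sunset-color" } }
    ]
  }
]
EOF

# Enqueue every job in the file in one call (skipped here if dpr is not installed).
if command -v dpr >/dev/null 2>&1; then
  dpr client --bulk-from-file=queue_items.json
fi
```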
The queue command provides operations to inspect and manage jobs in dprompts, including viewing, counting, clearing, and inspecting failed or completed jobs.
```shell
dpr queue [command] [flags]
```

| Subcommand | Description |
|---|---|
| `view` | View queued jobs. Use `-n` or `--number` to limit how many jobs to display. |
| `count` | Count the total number of queued jobs. |
| `clear` | Clear all queued jobs. Prompts for confirmation before deleting. |
| `failed-attempts` | View jobs that have failed attempts. Use `-n` or `--number` to limit the display. |
| `completed` | Operations related to completed jobs, with further subcommands: `count`, `first`, `last`. |
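Beyond one-off inspection, `dpr queue count` can drive simple automation. The loop below is a sketch that assumes the command prints a bare integer; adjust the parsing if the real output includes extra text:

```shell
# Poll until the queue is empty, checking every 5 seconds.
# Assumes `dpr queue count` prints a bare integer; the fallback of 0
# keeps the loop from spinning when dpr is unavailable.
while [ "$(dpr queue count 2>/dev/null || echo 0)" -gt 0 ]; do
  sleep 5
done
echo "queue drained"
```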
View the last 10 queued jobs:

```shell
dpr queue view
```

Count the number of queued jobs:

```shell
dpr queue count
```

Clear all queued jobs (with confirmation):

```shell
dpr queue clear
```

View the first 5 completed jobs:

```shell
dpr queue completed first -n 5
```

Count completed jobs:

```shell
dpr queue completed count
```

View jobs with failed attempts (last 20):

```shell
dpr queue failed-attempts -n 20
```

The view command allows you to inspect dprompts results.
```shell
dpr view [flags]
```

| Flag | Description |
|---|---|
| `-h, --help` | Show help for the view command |
| `-n, --number int` | Number of results to display (default: 10) |
The export command allows you to export dprompts results to files. You can control the output directory, format, and which results to include.
```shell
dpr export [flags]
```

| Flag | Description | Default |
|---|---|---|
| `--dry-run` | Show what would be exported without actually writing files | `false` |
| `--from-date string` | Export results created after this date (format: `YYYY-MM-DD`) | 1 day before |
| `--full-export` | Export all results, ignoring `--from-date` | `false` |
| `--out string` | Directory to save exported files | `./dprompts_exports` |
| `--overwrite` | Overwrite existing exported files in the output directory | `false` |
| `-h, --help` | Show help for the export command | - |
Export results created after 2025-12-01:

```shell
dpr export --from-date 2025-12-01
```

Export all results, ignoring date:

```shell
dpr export --full-export
```

Dry-run to see what would be exported:

```shell
dpr export --dry-run
```

Export to a custom folder and overwrite existing files:

```shell
dpr export --out ./my_exports --overwrite
```
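The `--from-date` flag lends itself to scheduled exports. A sketch for a nightly job follows; the date arithmetic tries GNU `date -d` first and falls back to the BSD/macOS `-v` form:

```shell
# Export only results created after yesterday.
# GNU date uses -d 'yesterday'; BSD/macOS date uses -v-1d.
FROM="$(date -d 'yesterday' +%Y-%m-%d 2>/dev/null || date -v-1d +%Y-%m-%d)"
if command -v dpr >/dev/null 2>&1; then
  dpr export --from-date "$FROM" --out ./dprompts_exports --overwrite
fi
```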
- Run the Ollama server:

  ```shell
  ollama serve
  ```

- Pull a model:

  ```shell
  ollama pull gemma2:2b
  ```

- List available models:

  ```shell
  ollama list
  ```

- Test if Ollama is running:

  ```shell
  curl http://localhost:11434/api/chat -d '{ "model": "gemma2:2b", "messages": [ { "role": "user", "content": "Why is the sky blue?" } ], "stream": false }'
  ```

- Stop the Ollama server (foreground): press `Ctrl+C` in the terminal running `ollama serve`.
- Kill an Ollama server running in the background:

  ```shell
  pkill ollama
  ```
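The raw `curl` test above returns a JSON blob. For scripting, it helps to pull out just the reply text; this sketch assumes `jq` is installed and relies on the `message.content` field of Ollama's non-streaming `/api/chat` response:

```shell
# Query Ollama and print only the assistant's reply text.
# Assumes jq is installed and Ollama is listening on the default port.
curl -s http://localhost:11434/api/chat \
  -d '{ "model": "gemma2:2b", "messages": [ { "role": "user", "content": "Why is the sky blue?" } ], "stream": false }' \
  | jq -r '.message.content' || echo "Ollama not reachable on localhost:11434" >&2
```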
- The `.dprompts.toml` file must be placed in your home directory.
- You can customize job arguments and metadata using the `--args` and `--metadata` flags (as JSON).
- The worker will process jobs and store results in the configured PostgreSQL database.
- PostgreSQL storage details: the `dprompt_results` table stores the results of processed jobs.
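Since results land in the `dprompt_results` table, they can be queried directly with standard PostgreSQL tooling. Only the table name comes from the note above; `DATABASE_URL` is a placeholder for whatever connection string your configuration defines:

```shell
# Count stored results directly in PostgreSQL.
# DATABASE_URL is a placeholder; substitute your configured connection string.
if command -v psql >/dev/null 2>&1 && [ -n "${DATABASE_URL:-}" ]; then
  psql "$DATABASE_URL" -c 'SELECT count(*) FROM dprompt_results;'
fi
```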