This is a simple Docker image which enables infinite possibilities for novel workflows by combining Dockerized Tools, Markdown, and the LLM of your choice.
## Markdown is the language
Humans already speak it. So do LLMs. This software allows you to write complex workflows in markdown files, and then run them with your own LLM in your editor, terminal, or any other environment, thanks to Docker.
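As a sketch of the idea (the front-matter keys and heading convention below are illustrative, not taken from this project's documentation), a workflow is just a markdown file that names a model, lists tools, and states the task in plain language:

```markdown
---
# hypothetical front matter — check the project docs for the real keys
model: gpt-4
tools:
  - curl
---

# prompt user

Fetch the latest release notes for my project and summarize them in three bullets.
```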
## Dockerized Tools
OpenAI-API-compatible LLMs already support tool calling. We believe these tools could just be Docker images. Based on our [research](https://www.linkedin.com/newsletters/docker-labs-genai-7204877599427194882/), some of the benefits of using Docker are enabling the LLM to:
- take more complex actions
- get more context with fewer tokens
- work across a wider range of environments
- operate in a sandboxed environment
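To make the sandboxing point concrete, here is a minimal sketch (the image name and mount layout are hypothetical, not this project's actual plumbing) of how a tool call can translate into a locked-down `docker run` invocation:

```python
import shlex

def docker_tool_argv(image, args, project_dir="/tmp/project", network="none"):
    """Build a `docker run` argv that executes a tool image in a sandbox:
    no network by default, project mounted read-only, container removed on exit."""
    return [
        "docker", "run", "--rm",
        "--network", network,                # tool gets no network access
        "-v", f"{project_dir}:/project:ro",  # project is visible but read-only
        image,
        *args,
    ]

# Hypothetical tool image and arguments, purely for illustration:
argv = docker_tool_argv("example/tool:latest", ["--summarize", "/project/README.md"])
print(shlex.join(argv))
```

Because every tool runs in its own throwaway container, a misbehaving tool can neither touch the network nor modify the project.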
## Conversation *Loop*
The conversation loop is the core of each workflow. Tool results, agent responses, and of course the markdown prompts are all passed through the loop. If an agent sees an error, it will try running the tool with different parameters, or even a different tool, until it gets the right result.
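The loop described above can be sketched in a few lines. This is a simplified model, not this project's implementation: the LLM is any callable that returns either a final answer or a tool request, and tool errors are fed back as messages so the model can retry:

```python
def run_loop(llm, tools, prompt, max_turns=5):
    """Minimal conversation loop: the LLM either answers or requests a tool;
    tool results (including errors) are appended so the next turn can react."""
    messages = [{"role": "user", "content": prompt}]
    for _ in range(max_turns):
        reply = llm(messages)
        messages.append(reply)
        call = reply.get("tool_call")
        if call is None:                      # plain answer: the loop is done
            return reply["content"], messages
        tool = tools[call["name"]]
        try:
            result = tool(**call["args"])
        except Exception as exc:              # errors go back into the loop
            result = f"error: {exc}"
        messages.append({"role": "tool", "content": str(result)})
    return None, messages

# Tiny demo with a scripted fake LLM: first turn requests a tool, second answers.
script = iter([
    {"role": "assistant", "content": "",
     "tool_call": {"name": "add", "args": {"a": 1, "b": 2}}},
    {"role": "assistant", "content": "3", "tool_call": None},
])
def fake_llm(messages):
    return next(script)

answer, history = run_loop(fake_llm, {"add": lambda a, b: a + b}, "add 1 and 2")
```

The `max_turns` cap is the one essential safeguard: without it, a model that keeps requesting failing tools would loop forever.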
## Multi-Model Agents
Each prompt can be configured to run with a different LLM model, or even a different model family. This lets you use the best tool for the job. By combining these prompts, you can create multi-agent workflows where each agent runs with the model best suited for its task.
With Docker, it is possible to have frontier models plan, while lightweight local models execute.
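The plan/execute split can be sketched as follows (the two model callables stand in for, say, a hosted frontier model and a small local model; the function names are illustrative, not part of this project):

```python
def plan_and_execute(planner, executor, task):
    """Two-model workflow sketch: a larger model breaks the task into steps,
    then a lighter model carries each step out."""
    steps = planner(f"Break into steps: {task}")        # e.g. a frontier model
    return [executor(f"Do: {step}") for step in steps]  # e.g. a local model

# Demo with stand-in callables instead of real model clients:
results = plan_and_execute(lambda task: ["lint", "test"], str.upper, "ship it")
```

Only the planning call pays frontier-model latency and cost; the per-step work stays cheap and local.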
## Project-First Design
To get help from an assistant in your software development loop, the only context necessary is the project you are working on.
## Prompts as a trackable artifact
Prompts are stored in a git repo and can be versioned, tracked, and shared for anyone to run in their own environment.
# Get Started
We highly recommend using the VSCode extension to get started. It will help you create prompts and run them with your own LLM.
You can install it in one click with our Docker Desktop Extension: