@@ -34,46 +34,8 @@ OpenAI API compatible LLMs already support function calling. This is our workb
- work across a wider range of environments
- operate in a sandboxed environment
-
# Get Started
- ## Basics: Render a Prompt
-
- To generate prompts for a project, clone a repo into `$PWD` and run the
- following command.
-
- ```sh
- docker run --rm \
- --pull=always \
- -v /var/run/docker.sock:/var/run/docker.sock \
- --mount type=volume,source=docker-prompts,target=/prompts \
- vonwig/prompts:latest --host-dir $PWD \
- --user jimclark106 \
- --platform darwin \
- --prompts-dir "github:docker/labs-make-runbook?ref=main&path=prompts/lazy_docker"
- ```
-
- The four arguments are:
- * `project root dir on host`,
- * `docker login username`,
- * `platform`,
- * `github ref` for prompt files.
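-
- In the command above these map onto the flags roughly as follows (a reading of the example, not an exhaustive flag reference):
-
- ```sh
- # --host-dir $PWD    -> project root dir on host
- # --user jimclark106 -> docker login username
- # --platform darwin  -> platform
- # --prompts-dir "github:docker/labs-make-runbook?ref=main&path=prompts/lazy_docker"
- #                    -> github ref for prompt files
- ```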
-
- If you need to test prompts before you push, you can bind a local prompt directory instead of using
- a GitHub ref. For example, to test some local prompts in a directory named `my_prompts`, run:
-
- ```sh
- docker run --rm \
- --pull=always \
- -v /var/run/docker.sock:/var/run/docker.sock \
- --mount type=bind,source=$PWD,target=/app/my_prompts \
- --workdir /app \
- vonwig/prompts:latest --host-dir $PWD \
- --user jimclark106 \
- --platform darwin \
- --prompts-dir my_prompts
- ```
-
## Running a Conversation Loop
Set OpenAI key
@@ -82,51 +44,23 @@ echo $OPENAI_API_KEY > $HOME/.openai-api-key
```
Run
```sh
- docker run --rm \
- -it \
- -v /var/run/docker.sock:/var/run/docker.sock \
- --mount type=volume,source=docker-prompts,target=/prompts \
- --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
- vonwig/prompts:latest \
- run \
- --host-dir $PWD \
- --user $USER \
- --platform "$(uname -o)" \
- --prompts "github:docker/labs-githooks?ref=main&path=prompts/git_hooks"
- ```
-
- ### Running a Conversation Loop with Local Prompts
99
-
100
- If you want to run a conversation loop with local prompts then you need to think about two different directories, the one that the root of your project ($PWD above),
101
- and the one that contains your prompts (let's call that $PROMPTS_DIR). Here's a command line for running the prompts when our $PWD is the project root and we've set the environment variable
102
- $PROMPTS_DIR to point at the directory containing our prompts.
103
-
104
- Set OpenAI key
105
- ``` sh
106
- echo $OPENAI_API_KEY > $HOME /.openai-api-key
107
- ```
108
- Run
109
- ``` sh
- docker run --rm \
- -it \
- -v /var/run/docker.sock:/var/run/docker.sock \
- --mount type=bind,source=$PROMPTS_DIR,target=/app/my_prompts \
- --workdir /app \
- --mount type=volume,source=docker-prompts,target=/prompts \
- --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
- vonwig/prompts:latest \
- run \
- --host-dir $PWD \
- --user $USER \
- --platform "$(uname -o)" \
- --prompts-dir my_prompts
- ```
-
- ## GitHub refs
-
- Prompts can be fetched from a GitHub repository when using the `--prompts` arg. The mandatory part of the ref is `github:{owner}/{repo}`;
- optional `ref` and `path` parameters can be added to pull prompts from a branch and to specify the subdirectory
- where the prompt files are located in the repo.
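-
- As a concrete reading of a ref already used in this README (illustrative only, not new syntax), the ref from the conversation-loop example decomposes as:
-
- ```sh
- # github:{owner}/{repo}?ref={branch}&path={subdirectory}
- "github:docker/labs-githooks?ref=main&path=prompts/git_hooks"
- #   owner/repo: docker/labs-githooks
- #   ref:        main               (branch to pull from)
- #   path:       prompts/git_hooks  (subdirectory containing the prompt files)
- ```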
+ docker run \
+ --rm \
+ --pull=always \
+ -it \
+ -v /var/run/docker.sock:/var/run/docker.sock \
+ --mount type=volume,source=docker-prompts,target=/prompts \
+ --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
+ vonwig/prompts:latest \
+ run \
+ --host-dir $PWD \
+ --user $USER \
+ --platform "$(uname -o)" \
+ --prompts "github:docker/labs-githooks?ref=main&path=prompts/git_hooks"
+ ```
+
+ See [docs](https://vonwig.github.io/prompts.docs/#/page/running%20the%20prompt%20engine) for more details on how to run the conversation loop,
+ and especially how to use it to run local prompts that are not yet in GitHub.
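+
+ As a minimal sketch (distilled from the local-prompts example removed above; the linked docs are the authoritative reference), running local prompts swaps the GitHub ref for a bind-mounted prompts directory:
+
+ ```sh
+ # Assumes $PROMPTS_DIR points at a local directory of prompt files
+ docker run --rm -it \
+ -v /var/run/docker.sock:/var/run/docker.sock \
+ --mount type=bind,source=$PROMPTS_DIR,target=/app/my_prompts \
+ --workdir /app \
+ --mount type=volume,source=docker-prompts,target=/prompts \
+ --mount type=bind,source=$HOME/.openai-api-key,target=/root/.openai-api-key \
+ vonwig/prompts:latest run \
+ --host-dir $PWD --user $USER --platform "$(uname -o)" \
+ --prompts-dir my_prompts
+ ```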
## Function volumes