README.md (14 additions, 19 deletions)
@@ -29,11 +29,11 @@ Thank you to the following people:
 - :speech_balloon: [Copilot Chat](https://github.com/features/copilot) meets [Zed AI](https://zed.dev/blog/zed-ai), in Neovim
 - :electric_plug: Support for Anthropic, Copilot, GitHub Models, DeepSeek, Gemini, Mistral AI, Novita, Ollama, OpenAI, Azure OpenAI, HuggingFace and xAI LLMs (or [bring your own](https://codecompanion.olimorris.dev/extending/adapters.html))
 - :heart_hands: User contributed and supported [adapters](https://codecompanion.olimorris.dev/configuration/adapters#community-adapters)
-- :rocket: Inline transformations, code creation and refactoring
+- :rocket: [Inline transformations](https://codecompanion.olimorris.dev/usage/inline-assistant.html), code creation and refactoring
-- :robot: Variables, Slash Commands, Agents/Tools and Workflows to improve LLM output
+- :robot: [Variables](https://codecompanion.olimorris.dev/usage/chat-buffer/variables.html), [Slash Commands](https://codecompanion.olimorris.dev/usage/chat-buffer/slash-commands.html), [Agents/Tools](https://codecompanion.olimorris.dev/usage/chat-buffer/agents.html) and [Workflows](https://codecompanion.olimorris.dev/usage/workflows.html) to improve LLM output
-- :sparkles: Built in prompt library for common tasks like advice on LSP errors and code explanations
+- :sparkles: Built in [prompt library](https://codecompanion.olimorris.dev/usage/action-palette.html) for common tasks like advice on LSP errors and code explanations
-- :building_construction: Create your own custom prompts, Variables and Slash Commands
+- :building_construction: Create your own [custom prompts](https://codecompanion.olimorris.dev/extending/prompts.html), Variables and Slash Commands
-- :books: Have multiple chats open at the same time
+- :books: Have [multiple chats](https://codecompanion.olimorris.dev/usage/introduction.html#quickly-accessing-a-chat-buffer) open at the same time
-- :muscle: Async execution for fast performance
<!-- panvimdoc-ignore-start -->
@@ -42,19 +42,15 @@ Thank you to the following people:
@@ -63,7 +59,7 @@ Thank you to the following people:
 ## :rocket: Getting Started
 
-Please visit the [docs](https://codecompanion.olimorris.dev) for information on installation, configuration and usage.
+Everything you need to know about CodeCompanion (installation, configuration and usage) is within the [docs](https://codecompanion.olimorris.dev).
 
 ## :toolbox: Troubleshooting
@@ -102,15 +98,14 @@ I am open to contributions but they will be implemented at my discretion. Feel f
 ## :clap: Acknowledgements
 
 - [Steven Arcangeli](https://github.com/stevearc) for his genius creation of the chat buffer and his feedback early on
-- [Manoel Campos](https://github.com/manoelcampos) for the [xml2lua](https://github.com/manoelcampos/xml2lua) library that's used in the tools implementation
+- [Manoel Campos](https://github.com/manoelcampos) for the [xml2lua](https://github.com/manoelcampos/xml2lua) library that's used in the inline assistant implementation
 - [Dante.nvim](https://github.com/S1M0N38/dante.nvim) for the beautifully simple diff implementation
 - [Wtf.nvim](https://github.com/piersolenski/wtf.nvim) for the LSP assistant action
 - [CopilotChat.nvim](https://github.com/CopilotC-Nvim/CopilotChat.nvim) for the rendering and usability of the chat buffer
 - [Aerial.nvim](https://github.com/stevearc/aerial.nvim) for the Tree-sitter parsing which inspired the symbols Slash Command
-- [Saghen](https://github.com/Saghen) for the fantastic docs inspiration from [blink.cmp](https://github.com/Saghen/blink.cmp)
+- [Saghen](https://github.com/Saghen) for the fantastic docs inspiration from [blink.cmp](https://github.com/Saghen/blink.cmp) and continued PRs to the project
-- [Catwell](https://github.com/catwell) for the [queue](https://github.com/catwell/cw-lua/blob/master/deque/deque.lua) inspiration that I use to stack agents and tools
+- [Catwell](https://github.com/catwell) for the [queue](https://github.com/catwell/cw-lua/blob/master/deque/deque.lua) inspiration that I use to stack agents and tools
+- [ravitemer](https://github.com/ravitemer) for the fantastic extensions API
codecompanion-workspace.json (7 additions, 2 deletions)
@@ -57,7 +57,7 @@
 },
 {
   "name": "Tools",
-  "system_prompt": "In the CodeCompanion plugin, tools can be leveraged by an LLM to execute lua functions or shell commands on the users machine. By responding with XML, CodeCompanion will pass the response, call the corresponding tool. This feature has been implemented via the agent/init.lua file, which passes all of the tools and adds them to a queue. Then those tools are run consecutively by the executor/init.lua file.",
+  "system_prompt": "In the CodeCompanion plugin, tools can be leveraged by an LLM to execute lua functions or shell commands on the user's machine. CodeCompanion uses an LLM's native function calling to receive a response in JSON, parse the response and call the corresponding tool. This feature has been implemented via the agent/init.lua file, which passes all of the tools and adds them to a queue. Then those tools are run consecutively by the executor/init.lua file.",
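As a rough illustration of the flow the new system prompt describes (parse a JSON function call, then invoke the matching tool), here is a minimal Lua sketch. The names `tools` and `dispatch` are hypothetical, not the plugin's actual API, and `decode` stands in for a JSON decoder such as `vim.json.decode` inside Neovim:

```lua
-- Sketch only: hypothetical names, not CodeCompanion's real implementation.
-- A registry mapping tool names to Lua functions.
local tools = {
  cmd_runner = function(args) return "ran: " .. args.cmd end,
}

-- An LLM's native function call arrives as structured data, e.g.
-- { name = "cmd_runner", arguments = '{"cmd":"ls"}' }.
local function dispatch(call, decode)
  local args = decode(call.arguments) -- arguments arrive as a JSON string
  local tool = tools[call.name]
  if tool then
    return tool(args)
  end
end
```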
   "description": "The `$(unknown)` file is the entry point for the chat strategy, which is called the `chat buffer`. All methods directly relating to the chat buffer reside here."
 },
+"chat-messages": {
+  "type": "file",
+  "path": "tests/stubs/messages.lua",
+  "description": "This is an example of what the messages table looks like. This is the table which contains all of the user and LLM messages that are sent to the LLM. It also includes output from the tools that the LLM has requested to run. The role is the person that has sent the message. Content is the message itself. Cycle is a way of grouping messages together in a turn, where a turn is defined as a user message and an LLM message (sometimes a tool message too)."
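The description above can be pictured as a Lua table. This is a guess at the shape based solely on the role, content and cycle fields named in the description; it is not a copy of tests/stubs/messages.lua, and any other field names are unconfirmed:

```lua
-- Hypothetical shape inferred from the description alone, not the
-- plugin's actual stub. Each cycle groups one user/LLM exchange,
-- optionally with tool output.
local messages = {
  { role = "user", content = "Explain this LSP error", cycle = 1 },
  { role = "llm", content = "The error means ...", cycle = 1 },
  { role = "tool", content = "output from a tool the LLM ran", cycle = 1 },
  { role = "user", content = "Now suggest a fix", cycle = 2 },
}
```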
-"description": "This is the entry point for the agent. If XML is detected in an LLM's response then this file is triggered which in turns add tools to a queue before calling the executor"
+"description": "This is the entry point for the agent. If an LLM's response includes a function call (or tool call) then this file is triggered, which in turn adds tools to a queue before calling the executor"
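The queue-then-execute behaviour both descriptions refer to (agent/init.lua queues tools, executor/init.lua runs them consecutively) can be sketched as follows; the function names are illustrative, not the plugin's real API:

```lua
-- Minimal sketch of queueing tools and executing them consecutively.
local queue = {}

local function enqueue(tool)
  table.insert(queue, tool)
end

local function execute_all()
  while #queue > 0 do
    local tool = table.remove(queue, 1) -- FIFO: run in arrival order
    tool()
  end
end
```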