
Conversation

@olimorris (Owner) commented Feb 12, 2025

Progress

  • The system prompt has been revamped; it's much more structured and logical
  • LLMs now return XML:
<response>
  <code>print('Hello World')</code>
  <language>python</language>
  <placement>add</placement>
</response>

This makes it much easier to parse the code, and to surface errors when they occur:

<response>
  <error>Unable to process the prompt as it appears to be random characters without any clear instruction or meaning.</error>
</response>
  • If the LLM returns the XML inside a markdown code block (something which was common with small-parameter LLMs and Gemini 2.0), we use Tree-sitter to extract and parse it; see the sketch after this list. This makes the Inline Assistant much more resilient, but it means streaming is no longer possible
  • Improved test coverage
  • Adapter tests now test for inline output
  • Can now pass in the adapter as the first argument to the prompt
  • Prompts can now make use of variables
  • Added two new variables: #chat and #buffer
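
A note on the fence-stripping step above: the whole response has to be buffered before the fence can be located, which is what rules out streaming. Below is a minimal sketch of the idea, not the plugin's actual implementation, assuming Neovim 0.9+ with the markdown Tree-sitter parser installed:

```lua
-- Minimal sketch of stripping a markdown code fence from an LLM response
-- before the XML inside it is parsed. Not the plugin's actual code;
-- assumes Neovim 0.9+ with the markdown Tree-sitter parser installed.
local function extract_fenced_xml(response)
  local parser = vim.treesitter.get_string_parser(response, "markdown")
  local root = parser:parse()[1]:root()

  -- Capture the body of any fenced code block in the response
  local query = vim.treesitter.query.parse(
    "markdown",
    "(fenced_code_block (code_fence_content) @content)"
  )

  for _, node in query:iter_captures(root, response, 0, -1) do
    return vim.treesitter.get_node_text(node, response)
  end

  -- No fence found: treat the response as bare XML
  return response
end
```

The last two items also combine nicely: an invocation such as :CodeCompanion copilot #buffer explain this code (illustrative syntax; see the docs) selects the adapter and shares the current buffer in one go.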

@kjkknkil mentioned this pull request Feb 13, 2025
@cloudflare-workers-and-pages (bot) commented Feb 15, 2025

Deploying codecompanion with Cloudflare Pages

Latest commit: 68a33c1
Status: ✅  Deploy successful!
Preview URL: https://c2c19bb0.codecompanion.pages.dev
Branch Preview URL: https://feat-inline-assistant.codecompanion.pages.dev


@olimorris merged commit 3e4fb61 into main Feb 18, 2025
4 checks passed
@taketwo commented Feb 18, 2025

In one of my custom inline prompts, I'm using placement = 'before|false' to place the response at the beginning of the buffer and auto-accept the diff (suggested here). It seems the auto-acceptance no longer works. Is this accidental, or was the feature purposely removed?

@nshen commented Feb 19, 2025

After the v2 update, my :CodeCompanion commands have almost stopped working. I've tried Copilot, DeepSeek, and Alibaba's qwen-omni-turbo-latest models.

@olimorris (Owner, Author):

@nshen if this is the case, then please raise a support ticket so I can reproduce the issue.

@olimorris (Owner, Author):

> In one of my custom inline prompts, I'm using placement = 'before|false' to place the response at the beginning of the buffer and auto-accept the diff (suggested here). It seems the auto-acceptance no longer works. Is this accidental, or was the feature purposely removed?

The placement|boolean approach was discarded from the plugin many months ago; apologies if my docs have been slow to reflect that. Also, auto-acceptance has never been a feature, so I'm not sure where that came from. Do you mean auto placement? That's still very much supported for prompt library items.
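
For anyone configuring this, here's a sketch of a prompt library item that pins its placement. The field names are assumptions drawn from my reading of the docs and may differ in your version:

```lua
-- Illustrative prompt library entry with an explicit placement. The exact
-- field names (strategy, opts.placement, prompts) are assumptions based on
-- the docs and may have changed; check your version's documentation.
require("codecompanion").setup({
  prompt_library = {
    ["License Header"] = {
      strategy = "inline",
      description = "Insert a license header at the top of the buffer",
      opts = {
        placement = "before", -- pin the placement rather than letting the LLM choose
      },
      prompts = {
        {
          role = "user",
          content = "Write a short MIT license header for this file",
        },
      },
    },
  },
})
```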

@taketwo commented Feb 20, 2025

To be clear: I've never seen placement|boolean documented anywhere. The syntax was suggested by a kind soul in the above-referenced comment.

> The placement|boolean approach was discarded from the plugin many months ago.

Yet it worked well until this last release.

> Also, auto-acceptance has never been a feature, so I'm not sure where that came from.

It was suggested here: #92 (comment).

@olimorris olimorris deleted the feat/inline-assistant branch February 27, 2025 14:32
cleong14 pushed a commit to cleong14/codecompanion.nvim that referenced this pull request May 21, 2025