Replies: 1 comment
-
Without adding any code complexity, the LLM should be able to take instructions through GEMINI.md/AGENTS.md: "when the user says /custom_command, take XYZ action." This is similar to the workaround for apply_patch not (consistently) working in OpenAI's Codex CLI, which was remedied via custom instructions in AGENTS.md. Another option, for something really complex, is to write a custom MCP server (via FastMCP) and plug it in with a custom_command tool.
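As a rough sketch, such an instruction in GEMINI.md/AGENTS.md might look like the following (the wording and the `/custom_command` name are placeholders from the suggestion above, not part of any existing config):

```markdown
## Custom commands

- When the user types `/custom_command <args>`, treat it as a request to
  take XYZ action: perform the action with the given arguments and reply
  with a short summary of what was done.
```

Since instruction files are just context for the model, this approach relies on the model consistently honoring the rule rather than on any guaranteed command dispatch.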
-
It would be great to see the ability to add custom / commands. I know there is the ability to make extensions, but I am not certain whether that is the same thing, or how to use extensions.