Replies: 6 comments
-
I'm not the author, but I guess #68 means you can bring in external models for your own usage? Do you want to create an internal API?
-
Hi @objecti0n, thanks for your interest in our work! As @hsz0403 correctly mentioned (thanks!), you may refer to #68 and interact with those API models using Python. You can then use the models in LeanCopilot through https://github.com/lean-dojo/LeanCopilot?tab=readme-ov-file#bring-your-own-model. We welcome contributions! Please feel free to open a PR adding support for new models. A main entry point should be the
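To make the "bring your own model" idea above concrete, here is a minimal sketch of a wrapper that exposes any text-generation backend behind a uniform tactic-suggestion interface. The class and method names (`ExternalTacticGenerator`, `generate`) are illustrative assumptions, not LeanCopilot's actual API; see the linked README section for the real interface.

```python
# Hedged sketch: a generic wrapper for plugging an external model into a
# tactic-suggestion pipeline. Names are illustrative, not LeanCopilot's API.
from typing import Callable, List, Tuple

Candidate = Tuple[str, float]  # (tactic text, log-probability-style score)

class ExternalTacticGenerator:
    """Wraps any text-generation backend behind a uniform interface."""

    def __init__(self, backend: Callable[[str], List[Candidate]]):
        # `backend` can be anything: a local HF model, an OpenAI-compatible
        # API call, a vLLM server, etc.
        self.backend = backend

    def generate(self, goal_state: str, num_samples: int = 4) -> List[Candidate]:
        candidates = self.backend(goal_state)[:num_samples]
        # Sort best-first by score so callers can take the top suggestion.
        return sorted(candidates, key=lambda c: -c[1])

# Dummy backend standing in for an external API model.
dummy = lambda prompt: [("simp", -0.1), ("rfl", -0.05), ("ring", -0.9)]
gen = ExternalTacticGenerator(dummy)
print(gen.generate("⊢ 1 + 1 = 2", num_samples=2))
```

The point of the wrapper is that swapping backends (remote API vs. local model) requires no change on the caller's side.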
-
I'm also willing to help with a PR for this; there are a lot of decoder-only models for LeanCopilot to choose from, including your great work :) @objecti0n
-
The code in this repo only uses Transformers for generation, which makes it hard to deploy large models. I am considering using the OpenAI API, which would let users use ChatGPT, or deploy any remote or local model via vLLM.
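The appeal of the OpenAI-compatible route described above is that ChatGPT, vLLM, and Ollama all accept the same chat-completions request shape, so one code path covers them all. Below is a hedged sketch of what such a request payload for tactic generation might look like; the function name and prompt wording are illustrative assumptions, not what was eventually merged.

```python
# Hedged sketch: building an OpenAI-style chat-completions payload that asks
# a model for Lean tactic suggestions. Sending it (via the `openai` client or
# plain HTTP to a vLLM/Ollama endpoint) is omitted here for self-containment.

def build_payload(goal_state: str, model: str = "gpt-4o", n: int = 4) -> dict:
    """Build a chat-completions request asking for `n` tactic candidates."""
    return {
        "model": model,
        "n": n,  # number of sampled completions (candidate tactics)
        "messages": [
            {"role": "system",
             "content": "Suggest a single Lean 4 tactic for the given goal."},
            {"role": "user", "content": goal_state},
        ],
    }

payload = build_payload("⊢ 1 + 1 = 2")
print(payload["model"], payload["n"])
```

Because the payload is backend-agnostic, pointing the client at a local vLLM or Ollama server instead of OpenAI only changes the base URL and model name, not this code.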
-
Thank you both for your willingness to contribute! I am converting this issue to a discussion; please feel free to open a PR. Let me know when things are ready for review or if you encounter any problems.
-
Support for this was added from release v1.5.3 on. The relevant PRs are #97 and #122. Thanks to @objecti0n for the interesting proposal and to @hsz0403 for implementing the PR!
-
If it is not possible now, I would like to contribute support for internlm-math-plus-1.8B via the vLLM/Ollama/OpenAI API.