Poll: Should mLLMCelltype Support Local Llama Models? #49
cafferychen777
started this conversation in
Polls
Replies: 0 comments
Background
There have been requests for mLLMCelltype to support local Llama models for cell type annotation. However, many providers (including those accessible through OpenRouter) already offer free access to the latest Llama models.
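For context on the cloud route mentioned above, here is a minimal sketch of how a Llama model could be queried for cluster annotation through OpenRouter's OpenAI-compatible chat endpoint. The model slug, helper names, and prompt wording are illustrative assumptions, not part of mLLMCelltype's actual implementation:

```python
import json
import urllib.request

OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"
# Illustrative slug; check OpenRouter's model list for current (free) Llama variants.
MODEL = "meta-llama/llama-3.3-70b-instruct"

def build_annotation_request(marker_genes, tissue, model=MODEL):
    """Build an OpenAI-compatible chat payload asking a Llama model to
    annotate a cell cluster from its marker genes (hypothetical helper)."""
    prompt = (
        f"Identify the most likely cell type in human {tissue} "
        f"given these marker genes: {', '.join(marker_genes)}."
    )
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def send_request(payload, api_key):
    """POST the payload to OpenRouter; requires a valid API key."""
    req = urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

payload = build_annotation_request(["CD3D", "CD3E", "IL7R"], tissue="blood")
print(payload["model"])
```

A local Llama deployment would replace only `OPENROUTER_URL` (for example, pointing at an OpenAI-compatible server running on localhost), which is part of why the cloud and local routes are so close in practice.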
Current Situation
Question
Given that cloud providers (including free options) already offer access to the latest Llama models, is there still a need for mLLMCelltype to support local Llama model deployment for single-cell annotation?
Options
Please vote and share your thoughts in the comments!