Got the message [multimodal] (one or more times) when I am using #7219
fanningert asked this question in Q&A (Unanswered, 0 replies)
Hi,
I am new to LocalAI.
I am running LocalAI on a Minisforum MS-01 (only for testing) with the Intel Docker image.
Here is my docker compose file.
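Roughly, it has the following shape (a simplified sketch: the image tag, ports, and volume paths are placeholders rather than an exact copy of my file):

```yaml
services:
  localai:
    # Intel GPU build of LocalAI; the exact image tag may differ
    image: localai/localai:latest-gpu-intel
    ports:
      - "8080:8080"
    environment:
      - DEBUG=true
      - MODELS_PATH=/models
    volumes:
      - ./models:/models
    devices:
      # expose the Intel GPU to the container
      - /dev/dri:/dev/dri
    restart: unless-stopped
```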
But when I run Gemma 3 models (for example Gemma-3-4b-it-qat), I sometimes get a full answer, sometimes a partial answer with "[multimodal]" at the end, and many times the answer consists only of one or more "[multimodal]" entries.
Here is the model configuration.
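Its general shape is the following sketch (the GGUF and mmproj file names are placeholders, not my exact files):

```yaml
name: gemma-3-4b-it-qat
backend: llama-cpp            # backend name may differ between LocalAI versions
context_size: 8192
f16: true
parameters:
  model: gemma-3-4b-it-qat-Q4_0.gguf      # placeholder model file
mmproj: mmproj-gemma-3-4b-it-F16.gguf     # placeholder vision projector for the multimodal part
```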
I can't find any information about this message ("[multimodal]"). Does anyone have tips or tricks for me on this point?
One note at the end: MCP is configured, but it is not used in this test.