localai not outputting responses in Continue chat. #9087
Unanswered
drewdrew21b asked this question in Help
Replies: 4 comments 5 replies
API Side:
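For reference, the API side can be exercised directly against LocalAI's OpenAI-compatible endpoint. A minimal request-body sketch, assuming LocalAI on its default port 8080 and a placeholder model name, POSTed to http://localhost:8080/v1/chat/completions:

```json
{
  "model": "your-local-model",
  "messages": [
    { "role": "user", "content": "Say hello" }
  ],
  "stream": true
}
```

Continue streams chat responses, so testing with "stream": true is the closer comparison; if this streams chunks back but the chat panel stays empty, the issue is more likely in how Continue talks to the server than in LocalAI itself.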
1 reply
@uinstinct can you assist in debugging this?
0 replies
@drewdrew21b your config seems alright to me. Can you do a few of the things below and tell me what you get?
2 replies
Getting errors when I apply this.
With the APIKey field removed, I still see the same problems:
On Discord, somebody on the team mentioned this might be a known issue:
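For local OpenAI-compatible servers the key is generally not validated, so another variant worth trying is to keep the field but give it a throwaway value, for example:

```json
{ "apiKey": "sk-local-placeholder" }
```

(shown here as the bare field; it would sit inside the model entry of the Continue config).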
2 replies
Am I missing some basic setting to allow Continue to display responses from my localai service? This is what I have configured:
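For context, a Continue config.json model entry pointing at a LocalAI server typically looks something like the sketch below; the host, port, and model name here are placeholders, and the port assumes LocalAI's default of 8080:

```json
{
  "models": [
    {
      "title": "LocalAI",
      "provider": "openai",
      "model": "your-local-model",
      "apiBase": "http://localhost:8080/v1"
    }
  ]
}
```

The fields that usually matter are "provider" set to "openai", so Continue speaks the OpenAI-compatible protocol LocalAI exposes, and "apiBase" pointing at the server's /v1 route.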
I have confirmed that if I take the same Options JSON shown in the console output and send it to the API directly, it works just fine.
Additional Details: