Replies: 3 comments 1 reply
-
🤖 Hello,

To get the output of a LlamaCpp language model into a string variable for post-processing, you can use the `StrOutputParser` class:

```python
from langchain.llms import LlamaCpp
from langchain.schema.output_parser import StrOutputParser

# Initialize the LlamaCpp model
llm = LlamaCpp(model_path="/path/to/llama/model")

# Call the model with a prompt; _call returns the completion as a plain string
output = llm._call("This is a prompt.")

# StrOutputParser passes generated text through unchanged, so parsing
# here simply yields the string you can post-process
str_parser = StrOutputParser()
parsed_output = str_parser.parse(output)

# Now the parsed_output variable contains the generated text as a string
print(parsed_output)
```

In this example, the model's completion ends up in `parsed_output` as an ordinary Python string. (A `CombiningOutputParser` is designed to merge the outputs of multiple parsers into a single dictionary, so it is unnecessary when all you need is one string.) If you are encountering an error with the `pipe` method, it is likely because your installed LangChain version does not implement the Runnable interface on LLMs.
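Once the generation is in a plain string, post-processing is ordinary Python string handling. A minimal sketch, where the specific cleanup steps are just illustrative assumptions:

```python
# Illustrative post-processing of the generated text
text = parsed_output.strip()

# e.g. keep only the first paragraph of the completion
first_paragraph = text.split("\n\n")[0]

# e.g. collapse internal whitespace
cleaned = " ".join(first_paragraph.split())
print(cleaned)
```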
I hope this helps! If you have any other questions, feel free to ask.
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
-
This is the exact same problem I have.
-
The bot's solution contains a very simple clue: you don't need all of that code. `result = llm._call(prompt)` does the trick.
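For completeness, a minimal sketch of that approach (the model path here is a placeholder, not from the thread):

```python
from langchain.llms import LlamaCpp

# Placeholder path; point this at your local llama model file
llm = LlamaCpp(model_path="/path/to/llama/model")

prompt = "This is a prompt."

# _call is the private method behind llm(prompt); both return a plain str
result = llm._call(prompt)

# The result is an ordinary string, so post-processing is plain Python
print(result.strip())
```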
-
I would like to be able to do the following:

```python
llm = LlamaCpp(
    model_path="model",
    # ....
)
result = llm(prompt)
```

and then post-process the result. I cannot find any relevant code for this; hopefully someone has a suggestion on how to do it. I found something with StrOutputParser, but the error was that llm does not have a pipe method.
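As a side note on the pipe error: the `|` syntax comes from the LangChain Expression Language, which requires a LangChain version where LLMs implement the Runnable interface. On a recent enough version, a sketch like this should work (the model path and prompt text are illustrative assumptions):

```python
from langchain.llms import LlamaCpp
from langchain.prompts import PromptTemplate
from langchain.schema.output_parser import StrOutputParser

llm = LlamaCpp(model_path="/path/to/llama/model")  # placeholder path
prompt = PromptTemplate.from_template("Answer briefly: {question}")

# LCEL chain: prompt -> model -> plain string
chain = prompt | llm | StrOutputParser()

result = chain.invoke({"question": "What is LangChain?"})
print(result.strip())  # result is a str, ready for post-processing
```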