Replies: 1 comment
-
Hi, I am hitting the same error here, and on my server side I also tested with a LangGraph Runnable. Here is the error:
It looks like an error inside the module.
Output: Does anyone know how to solve this in LangServe? Why do we have an endpoint like |
-
I am invoking a RemoteRunnable against a remote LangServe instance that hosts chain_with_guardrails, and it fails with an error. When I call invoke on the chain without the guardrail, it works fine.
Server Side
chain = (
    {"context": reranking_retriever, "question": RunnablePassthrough()}
    | prompt_template
    | llm
    | StrOutputParser()
)
chain_with_guardrails = guardrails | chain
add_routes(
    app,
    chain_with_guardrails,
    path="/conversational_ai_with_rag_text_chat",
)
Client Side
llm = RemoteRunnable("http://127.0.0.1:9000") | StrOutputParser()
llm.invoke("Tell me something about Langchain")
This invoke call errors out when the LangServe route serves chain_with_guardrails, but works fine with the plain chain.
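One plausible cause (an assumption, not confirmed from the error text above) is that prepending the guardrails step changes the chain's input schema, so the payload RemoteRunnable sends for the plain chain no longer matches what chain_with_guardrails expects. The dependency-free sketch below mimics LangChain-style pipe composition to illustrate that effect; the Step class and its schemas are illustrative, not LangServe's actual implementation.

```python
# Illustrative sketch: composing A | B makes the combined chain inherit
# A's input schema, so a client built for B's schema can break.
class Step:
    def __init__(self, fn, input_schema):
        self.fn = fn
        self.input_schema = input_schema  # what .invoke() expects

    def __or__(self, other):
        # self runs first, so the composed chain takes self's input schema.
        return Step(lambda x: other.fn(self.fn(x)), self.input_schema)

    def invoke(self, x):
        return self.fn(x)

# Plain chain: accepts a raw question string.
chain = Step(lambda q: f"answer({q})", input_schema="str")

# Guardrails step: expects a dict payload such as {"input": "..."}.
guardrails = Step(lambda d: d["input"], input_schema="dict")

chain_with_guardrails = guardrails | chain

print(chain.invoke("hi"))                             # plain string works
print(chain_with_guardrails.input_schema)             # now "dict"
print(chain_with_guardrails.invoke({"input": "hi"}))  # client must send a dict
```

If this is what is happening, comparing the two routes' input schemas on the server (LangServe exposes an /input_schema endpoint per route) should show the mismatch.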