Replies: 3 comments 1 reply
-
Here is the relevant code snippet from the `invoke` method (the name of the wrapper class was elided in the original comment; this is the check that fires when `exception_key` is set):

```python
def invoke(
    self, input: Input, config: Optional[RunnableConfig] = None, **kwargs: Any
) -> Output:
    if self.exception_key is not None and not isinstance(input, dict):
        raise ValueError(
            "If 'exception_key' is specified then input must be a dictionary."
            f"However found a type of {type(input)} for input"
        )
    # setup callbacks
    config = ensure_config(config)
    callback_manager = get_callback_manager_for_config(config)
    # start the root run
    run_manager = callback_manager.on_chain_start(
        dumpd(self),
        input,
        name=config.get("run_name"),
        run_id=config.pop("run_id", None),
    )
    ...
```

In contrast, when using just the plain chain, the input goes through `_validate_inputs`, which tolerates a single string as long as the chain expects exactly one input key. Here is the relevant code snippet from `_validate_inputs`:

```python
def _validate_inputs(self, inputs: Dict[str, Any]) -> None:
    """Check that all inputs are present."""
    if not isinstance(inputs, dict):
        _input_keys = set(self.input_keys)
        if self.memory is not None:
            _input_keys = _input_keys.difference(self.memory.memory_variables)
        if len(_input_keys) != 1:
            raise ValueError(
                f"A single string input was passed in, but this chain expects "
                f"multiple inputs ({_input_keys}). When a chain expects "
                f"multiple inputs, please call it by passing in a dictionary, "
                "eg `chain({'foo': 1, 'bar': 2})`"
            )
    missing_keys = set(self.input_keys).difference(inputs)
    if missing_keys:
        raise ValueError(f"Missing some input keys: {missing_keys}")
```

Therefore, the error occurs because the guardrails-wrapped chain requires a dictionary input whenever `exception_key` is set, while the plain chain accepts a single string for a single-input chain.
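To make the string-vs-dict contrast concrete, the check can be reproduced in isolation. This is a minimal sketch with made-up function and parameter names, not the actual langchain implementation:

```python
from typing import Any, Optional, Set


def validate_inputs(inputs: Any, input_keys: Set[str],
                    memory_variables: Optional[Set[str]] = None) -> None:
    """Standalone re-implementation of the check in `_validate_inputs` above."""
    if not isinstance(inputs, dict):
        _input_keys = set(input_keys)
        if memory_variables is not None:
            _input_keys = _input_keys.difference(memory_variables)
        if len(_input_keys) != 1:
            raise ValueError(
                f"A single string input was passed in, but this chain expects "
                f"multiple inputs ({_input_keys})."
            )
        # A single string is acceptable when exactly one key is expected;
        # langchain later wraps it as {key: inputs}.
        return
    missing_keys = set(input_keys).difference(inputs)
    if missing_keys:
        raise ValueError(f"Missing some input keys: {missing_keys}")


# A plain string passes for a single-input chain:
validate_inputs("How easy is it to implement RAG?", {"query"})

# But fails when the chain expects multiple inputs:
try:
    validate_inputs("hello", {"query", "context"})
except ValueError as err:
    print(type(err).__name__)  # ValueError
```

The guardrails wrapper's `invoke`, by comparison, rejects any non-dict input outright once `exception_key` is configured, which is why only the `chain_with_guardrails` route errors.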
-
Is there any example of how `RemoteRunnable` can call `invoke` with a dictionary as input to the remote `chain_with_guardrails`?
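For reference, a minimal client-side sketch of passing a dictionary to a remote chain. The URL, the `/guardrails` route, and the `query` key are assumptions here; they must match whatever path and input schema the server registered via `add_routes`:

```python
from langserve import RemoteRunnable

# Hypothetical URL/route; substitute the path used by add_routes() on the server.
remote_chain = RemoteRunnable("http://localhost:8000/guardrails")

# Pass a dictionary whose keys match the input_keys of the server-side chain.
result = remote_chain.invoke({"query": "How easy is it to implement RAG?"})
print(result)
```

This requires a running LangServe server; `RemoteRunnable` serializes the dict as JSON and posts it to the route's `/invoke` endpoint.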
-
I changed my code to:

```python
output = llm.invoke({'query': 'How easy is it to implement RAG?'})
```

Now I am getting an error on the remote server side:

```
File "/stage/ashish/venv310/lib/python3.10/site-packages/nemoguardrails/rails/llm/utils.py", line 43, in get_history_cache_key
```

I have not made any changes to the server-side code.
-
Example Code
Description
The `invoke` method errors out when the LangServe chain serves `chain_with_guardrails`, but not when it serves just `chain`.
System Info
```
pip freeze | grep -i langchain
langchain==0.1.11
langchain-community==0.0.25
langchain-core==0.2.33
langchain-nvidia-ai-endpoints==0.1.4
langchain-text-splitters==0.2.2
```