Commit ae5695a

Update cohere.py (#795)

When stop tokens are set in the Cohere LLM constructor, they are currently not stripped from the response; they should be.

1 parent cacf409

File tree

1 file changed: +2 −2 lines changed

langchain/llms/cohere.py (2 additions, 2 deletions)

@@ -122,6 +122,6 @@ def _call(self, prompt: str, stop: Optional[List[str]] = None) -> str:
         text = response.generations[0].text
         # If stop tokens are provided, Cohere's endpoint returns them.
         # In order to make this consistent with other endpoints, we strip them.
-        if stop is not None:
-            text = enforce_stop_tokens(text, stop)
+        if stop is not None or self.stop is not None:
+            text = enforce_stop_tokens(text, params["stop_sequences"])
         return text
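For context, the `enforce_stop_tokens` helper called in the diff truncates a generation at the first occurrence of any stop sequence, which is why routing both the `stop` argument and the constructor-level `self.stop` through it makes Cohere's output consistent with other LLM endpoints. Below is a minimal, hedged sketch of what such a helper does; the exact implementation in langchain may differ (for example, in whether stop strings are regex-escaped), so treat this as an illustration rather than the library's code.

```python
import re
from typing import List


def enforce_stop_tokens(text: str, stop: List[str]) -> str:
    """Cut off the text at the first occurrence of any stop sequence.

    Sketch of the behavior relied on by the commit above; ``re.escape``
    is added here so stop strings are matched literally.
    """
    pattern = "|".join(re.escape(s) for s in stop)
    return re.split(pattern, text)[0]
```

With this behavior, a response such as `"The answer is 42.\nEND trailing text"` with stop sequence `"END"` is cut back to `"The answer is 42.\n"`, whether the stop token came from the `_call` argument or the constructor.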
