HTTPSConnectionPool error when using planner.create_openapi_agent #15895
-
Hello, I am trying to use an OpenAPI agent to make calls to a custom API (served over HTTPS), but I keep getting an HTTPSConnectionPool error. I have tried adding `requests_kwargs = {"verify": False}` to the agent, but no luck. Here is the snippet of code I am using:

```python
import yaml

from langchain_openai import AzureChatOpenAI
from langchain_community.agent_toolkits.openapi.spec import reduce_openapi_spec
from langchain_community.agent_toolkits.openapi import planner
from langchain.requests import RequestsWrapper

llm = AzureChatOpenAI(
    azure_deployment="gpt-4-128k",
    model_name="gpt-4",
    temperature=0,
)

with open("./swagger.yaml") as f:
    raw_openai_api_spec = yaml.load(f, Loader=yaml.Loader)
kafka_api_spec = reduce_openapi_spec(raw_openai_api_spec)

requests_wrapper = RequestsWrapper()

agent = planner.create_openapi_agent(
    kafka_api_spec, requests_wrapper, llm, requests_kwargs={"verify": False}
)

user_query = "what is the list of available topics?"
agent.run(input=user_query)
```

Does anyone have an idea on how to specify a certificate and/or say insecure = yes? Thanks
-
🤖 Hello @lguillaud! I'm Dosu, a friendly bot here to help you navigate bugs, answer questions, and guide you on your journey as a contributor while we wait for a human maintainer. Delighted to meet you and looking forward to assisting you!

Based on the information available in the LangChain repository, you can specify a custom SSL certificate for a request by passing the path to the certificate file as the `cert` argument. Here is an example of how you can use a custom SSL certificate with the `RequestsWrapper`:

```python
requests_wrapper = RequestsWrapper()
response = requests_wrapper.get('https://example.com', cert='/path/to/certfile')
```

In this example, the `cert` argument is the path to your certificate file. However, I wasn't able to find information on whether the `requests_kwargs` argument you tried is honored by the agent. If you have the SSL certificate file for your server, you could try passing it to the `get` method as shown above.

I hope this helps! If you have any other questions, feel free to ask.

-Dosu

Sources

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
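Since `RequestsWrapper.get` forwards extra keyword arguments to the underlying `requests.get` call, the same pattern should in principle also allow disabling verification per request. A hedged sketch, not verified against this agent:

```python
# Sketch only: relies on RequestsWrapper.get passing **kwargs through to
# requests.get, so `verify` behaves as it does in the requests library.
response = requests_wrapper.get('https://example.com', verify=False)
```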
-
Thanks to the bot, but it does not answer the question. Does someone have an example of how to use an OpenAPI spec to make calls with this very specific option `verify=False`?
-
Hi @lguillaud,

Can you try to update `RequestsGetToolWithParsing` as follows?

```python
class RequestsGetToolWithParsing(BaseRequestsTool, BaseTool):
    """Requests GET tool with LLM-instructed extraction of truncated responses."""

    name: str = "requests_get"
    """Tool name."""
    description = REQUESTS_GET_TOOL_DESCRIPTION
    """Tool description."""
    response_length: int = MAX_RESPONSE_LENGTH
    """Maximum length of the response to be returned."""
    llm_chain: Any = Field(
        default_factory=_get_default_llm_chain_factory(PARSING_GET_PROMPT)
    )
    """LLMChain used to extract the response."""

    def _run(self, text: str) -> str:
        from langchain.output_parsers.json import parse_json_markdown

        try:
            data = parse_json_markdown(text)
        except json.JSONDecodeError as e:
            raise e
        data_params = data.get("params")
        response: str = cast(
            str,
            self.requests_wrapper.get(data["url"], params=data_params, verify=False),
        )
        response = response[: self.response_length]
        return self.llm_chain.predict(
            response=response, instructions=data["output_instructions"]
        ).strip()
```

I changed line 87:

```python
response: str = cast(
    str,
    self.requests_wrapper.get(data["url"], params=data_params, verify=False),
)
```

Tell me if that fixes the problem. If so, we'll need to create a PR to fix it.
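For anyone who would rather not edit the installed library, a rough equivalent is to wrap `RequestsWrapper` so every GET disables verification before handing it to the agent. This is a minimal sketch, assuming `RequestsWrapper.get` forwards keyword arguments to `requests.get`; `InsecureRequestsWrapper` is a hypothetical name, not part of langchain:

```python
from langchain.requests import RequestsWrapper

class InsecureRequestsWrapper(RequestsWrapper):
    """Hypothetical subclass: forces verify=False on every GET request."""

    def get(self, url: str, **kwargs):
        # `verify` is forwarded to requests.get by the base implementation.
        kwargs.setdefault("verify", False)
        return super().get(url, **kwargs)

# Pass an instance of the subclass to the agent instead of the plain wrapper:
# agent = planner.create_openapi_agent(kafka_api_spec, InsecureRequestsWrapper(), llm)
```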
-
With the latest version of langchain you may set `requests_wrapper.verify = False` to bypass certificate verification.
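A minimal sketch of that approach, reusing the names from the original question (assuming the installed langchain version exposes the `verify` attribute mentioned above):

```python
from langchain.requests import RequestsWrapper

requests_wrapper = RequestsWrapper()
requests_wrapper.verify = False  # skip certificate verification on all calls

agent = planner.create_openapi_agent(kafka_api_spec, requests_wrapper, llm)
agent.run(input="what is the list of available topics?")
```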