Replies: 1 comment
-
Answer generated by a 🤖

I understand that you're trying to use a `save` method on `ChatOpenAI`. To clarify, the `ChatOpenAI` class does not define one. Here's a snippet of the class:

```python
class ChatOpenAI(BaseChatModel):
    """Wrapper around OpenAI Chat large language models.

    To use, you should have the ``openai`` python package installed, and the
    environment variable ``OPENAI_API_KEY`` set with your API key.

    Any parameters that are valid to be passed to the openai.create call can be
    passed in, even if not explicitly saved on this class.

    Example:
        .. code-block:: python

            from langchain.chat_models import ChatOpenAI
            openai = ChatOpenAI(model_name="gpt-3.5-turbo")
    """
```

As you can see, there's no `save` method defined in this class. This is the most likely explanation based on the information you've provided. However, if you believe this is a bug or an oversight that could impact other users, you're welcome to open a pull request to address it. If my explanation contradicts your experience, I would recommend ensuring that you're using the latest version of the LangChain framework; there may have been updates or changes that resolve your issue.

This response is meant to be useful, save you time, and share context. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
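Since `ChatOpenAI` does not expose a `save` method, one workaround is to serialize the constructor parameters yourself and re-instantiate the model later. The snippet below is a minimal, self-contained sketch of that idea; `FakeChatModel` is a hypothetical stand-in for `ChatOpenAI` (the real class requires the `langchain` package and an API key), and the JSON round-trip only preserves keyword parameters, not any runtime state:

```python
import json


class FakeChatModel:
    """Hypothetical stand-in for ChatOpenAI: just holds keyword parameters."""

    def __init__(self, **params):
        self.params = params

    def save(self, path):
        # Persist the constructor parameters as JSON.
        with open(path, "w") as f:
            json.dump(self.params, f)

    @classmethod
    def load(cls, path):
        # Re-instantiate the model from the saved parameters.
        with open(path) as f:
            return cls(**json.load(f))


model = FakeChatModel(model_name="gpt-3.5-turbo", temperature=0.7)
model.save("chat_model.json")
restored = FakeChatModel.load("chat_model.json")
print(restored.params["model_name"])  # -> gpt-3.5-turbo
```

This mirrors what the LLM serialization docs do for `llm.save(...)`: the saved artifact is a description of how to reconstruct the object, not the object itself.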
-
Hi! ✌️ Is there an equivalent of the `.save` method for `ChatOpenAI`?
https://python.langchain.com/docs/modules/model_io/models/llms/how_to/llm_serialization
returns