I searched existing ideas and did not find a similar one
I added a very descriptive title
I've clearly described the feature request and motivation for it
Feature request
Hi LangChain team,
I wanted to ask if there is any plan or ongoing work to support the new OpenAI Prompt Management API (https://community.openai.com/t/enhanced-prompt-management/1290305) in LangChain, specifically the ability to call OpenAI chat completions using saved prompt templates via the prompt_id parameter.
The OpenAI documentation at https://platform.openai.com/docs/guides/text?api-mode=responses#reusable-prompts describes a new feature that allows versioning and managing prompts directly in OpenAI's platform and referencing them via a prompt_id in the API call.
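For reference, calling a saved prompt directly with the OpenAI Python SDK looks roughly like this (a minimal sketch based on the reusable-prompts docs; the prompt ID, version, and variable names are placeholders):

```python
from openai import OpenAI

client = OpenAI()

# Minimal sketch: reference a saved, versioned prompt via the Responses API.
# "pmpt_123", the version, and the variable names are placeholders.
response = client.responses.create(
    model="gpt-4.1",
    prompt={
        "id": "pmpt_123",        # saved prompt ID from the OpenAI dashboard
        "version": "2",          # optional: pin a specific prompt version
        "variables": {"customer_name": "Jane Doe"},
    },
)
print(response.output_text)
```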
Currently, it seems LangChain's OpenAI or ChatOpenAI classes do not support passing a prompt_id to the API, and I wanted to know:
Are there plans to add native support for this feature soon?
Any recommended workarounds in the meantime to integrate OpenAI prompt management with LangChain?
Thanks for the great work on LangChain! Looking forward to your feedback.
Best regards
Motivation
Using the Prompt Management API would allow better prompt version control, centralized management, and easier experimentation with prompt iterations without changing client code. This could help LangChain users maintain high-quality prompts and improve collaboration in teams.
Proposal (If applicable)
A possible approach could be to extend the existing OpenAI or ChatOpenAI class to accept a prompt_id parameter and modify the API call to include it instead of passing raw prompt strings. This could be implemented as a new, backward-compatible parameter, allowing users to opt in to prompt management features.
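Purely as an illustration of what the opt-in surface might look like (this is not existing langchain-openai behavior; prompt_id, prompt_version, and prompt_variables are hypothetical parameter names):

```python
from langchain_openai import ChatOpenAI

# Hypothetical sketch only: none of the prompt_* parameters exist in
# langchain-openai today; the saved-prompt feature lives in the Responses API.
llm = ChatOpenAI(
    model="gpt-4.1",
    use_responses_api=True,
    prompt_id="pmpt_123",        # hypothetical: saved prompt ID
    prompt_version="2",          # hypothetical: pin a specific prompt version
    prompt_variables={"customer_name": "Jane Doe"},  # hypothetical: template variables
)

# Hypothetical behavior: the input is appended as a user turn after the saved prompt.
result = llm.invoke("Summarize the customer's last order.")
print(result.content)
```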