
Commit d9763e4

Merge pull request #123572 from jonathanscholtes/patch-1

Update ai-agents.md

2 parents 43d8ccc + b60acf8

1 file changed: +3 −3 lines changed

articles/cosmos-db/ai-agents.md

Lines changed: 3 additions & 3 deletions
````diff
@@ -396,7 +396,7 @@ from langchain_core.runnables.history import RunnableWithMessageHistory
 from langchain.agents import AgentExecutor, create_openai_tools_agent
 from service import TravelAgentTools as agent_tools
 
-load_dotenv(override=True)
+load_dotenv(override=False)
 
 
 chat : ChatOpenAI | None=None
````
````diff
@@ -438,7 +438,7 @@ def LLM_init():
 LLM_init()
 ```
 
-The **init.py** file commences by initiating the loading of environment variables from a **.env** file utilizing the ```load_dotenv(override=True)``` method. Then, a global variable named ```agent_with_chat_history``` is instantiated for the agent, intended for use by our **TravelAgent.py**. The ```LLM_init()``` method is invoked during module initialization to configure our AI agent for conversation via the API web layer. The OpenAI Chat object is instantiated using the GPT-3.5 model, incorporating specific parameters such as model name and temperature. The chat object, tools list, and prompt template are combined to generate an ```AgentExecutor```, which operates as our AI Travel Agent. Lastly, the agent with history, ```agent_with_chat_history```, is established using ```RunnableWithMessageHistory``` with chat history (MongoDBChatMessageHistory), enabling it to maintain a complete conversation history via Azure Cosmos DB.
+The **init.py** file commences by initiating the loading of environment variables from a **.env** file utilizing the ```load_dotenv(override=False)``` method. Then, a global variable named ```agent_with_chat_history``` is instantiated for the agent, intended for use by our **TravelAgent.py**. The ```LLM_init()``` method is invoked during module initialization to configure our AI agent for conversation via the API web layer. The OpenAI Chat object is instantiated using the GPT-3.5 model, incorporating specific parameters such as model name and temperature. The chat object, tools list, and prompt template are combined to generate an ```AgentExecutor```, which operates as our AI Travel Agent. Lastly, the agent with history, ```agent_with_chat_history```, is established using ```RunnableWithMessageHistory``` with chat history (MongoDBChatMessageHistory), enabling it to maintain a complete conversation history via Azure Cosmos DB.
 
 #### Prompt
````

````diff
@@ -507,7 +507,7 @@ from model.prompt import PromptResponse
 import time
 from dotenv import load_dotenv
 
-load_dotenv(override=True)
+load_dotenv(override=False)
 
 
 def agent_chat(input:str, session_id:str)->str:
````
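The only functional change in this commit is the `override` flag passed to `load_dotenv`. With `override=True`, values read from the **.env** file replace variables already present in the process environment; with `override=False`, existing environment variables (for example, ones injected by a hosting platform) take precedence and only missing keys are filled in from the file. The sketch below illustrates that semantic difference using only the standard library, with a hypothetical `apply_env` stand-in for python-dotenv (the function and variable names are illustrative, not part of the commit):

```python
import os

def apply_env(pairs: dict[str, str], override: bool) -> None:
    """Mimic python-dotenv's override semantics (illustrative stand-in).

    override=True  -> values from the .env file replace existing os.environ entries
    override=False -> existing os.environ entries win; only missing keys are set
    """
    for key, value in pairs.items():
        if override or key not in os.environ:
            os.environ[key] = value

# Simulate a variable already set by the shell or hosting environment
os.environ["OPENAI_API_KEY"] = "key-from-shell"

# Simulate the parsed contents of a .env file
dotenv_pairs = {"OPENAI_API_KEY": "key-from-dotenv-file"}

apply_env(dotenv_pairs, override=False)
print(os.environ["OPENAI_API_KEY"])  # key-from-shell: the deployed setting wins

apply_env(dotenv_pairs, override=True)
print(os.environ["OPENAI_API_KEY"])  # key-from-dotenv-file: the file wins
```

This is why switching to `override=False` is the safer default for deployed services: configuration supplied by the runtime environment is no longer silently clobbered by a checked-in or stale **.env** file.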
