## Weather Bot Agent Example with FunctionTool #119

New file (environment variable template, 4 lines):

```env
OPENWEATHER_API_KEY=replace with yours
GEMINI_API_KEY=replace with yours
OPENAI_API_KEY=replace with yours
LLM_MODEL=gemini/gemini-2.0-flash
```

New file, the example's `README.md` (124 lines):

# Weather Bot

An intelligent weather assistant powered by AI that provides real-time weather information through both CLI and web interfaces.



## Features

- 🤖 AI-powered conversational weather queries
- 🌍 Real-time weather data from the OpenWeather API
- 💬 Dual interface: command line and web UI
- 🔧 Support for multiple LLM providers (Gemini, OpenAI)
- ⚡ Fast and responsive

## Installation

### 1. Clone the Repository

```bash
git clone https://github.com/jentic/standard-agent.git
cd standard-agent
```

### 2. Install Dependencies

```bash
make install
```

### 3. Activate Virtual Environment

```bash
source .venv/bin/activate
```

### 4. Navigate to Weather Bot

```bash
cd examples/weather_bot
```

### 5. Install Weather Bot Requirements

```bash
pip3 install -r requirements.txt
```

## Configuration

### 1. Create Environment File

Create a `.env` file in the `examples/weather_bot` directory:

```bash
touch .env
```

### 2. Add API Keys

Add the following environment variables to your `.env` file:

```env
OPENWEATHER_API_KEY=your_openweather_api_key_here
GEMINI_API_KEY=your_gemini_api_key_here
OPENAI_API_KEY=your_openai_api_key_here
LLM_MODEL=gemini/gemini-2.0-flash
```

### 3. Get API Keys

- **OpenWeather API Key**: [https://home.openweathermap.org/api_keys](https://home.openweathermap.org/api_keys) (a quick sanity check for this key is sketched below)
- **Gemini API Key**: [https://aistudio.google.com/api-keys](https://aistudio.google.com/api-keys)
- **OpenAI API Key** (optional): [https://platform.openai.com/api-keys](https://platform.openai.com/api-keys)
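
If you want to confirm the OpenWeather key works before running the bot, a minimal check is sketched below. It calls OpenWeather's standard current-weather endpoint directly and assumes the `requests` package is available (install it with `pip3 install requests` if it is not already pulled in by `requirements.txt`).

```python
# Sanity check for OPENWEATHER_API_KEY (illustrative only, not part of the bot).
import os

import requests
from dotenv import load_dotenv

load_dotenv()  # reads OPENWEATHER_API_KEY from the .env file in the current directory

resp = requests.get(
    "https://api.openweathermap.org/data/2.5/weather",
    params={"q": "London", "appid": os.getenv("OPENWEATHER_API_KEY"), "units": "metric"},
    timeout=10,
)
print(resp.status_code)  # 200 means the key is valid; 401 means it is missing or wrong
```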

## Usage

### Command Line Interface (CLI)

Run the weather bot in your terminal:

```bash
python -m app.cli_bot
```

**Review comment (lines +81 to +83):** Remove the leading whitespace from the command. The indentation before `python -m app.cli_bot` should be dropped so the command can be copied and run as-is.

You can then interact with the bot by typing natural language queries like:

- "What's the weather in London?"
- "Will it rain in Tokyo tomorrow?"

### Web Interface

Start the web server:

```bash
fastapi run app/bot.py
```

**Review comment (lines +93 to +94):** Same issue as above: remove the leading whitespace before `fastapi run app/bot.py` so the command can be copied directly.

The web interface will be available at:

- **Local**: http://localhost:8000/chat
- **API Docs**: http://localhost:8000/docs

Open your browser and navigate to the local URL to interact with the weather bot through a user-friendly web interface.
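
Once the server is up, a quick smoke test from Python is sketched below. It only hits the auto-generated FastAPI docs endpoint; the exact request and response shape of the `/chat` route depends on `app/bot.py`, which is not shown here.

```python
# Smoke-test the running Weather Bot server (illustrative only).
import requests

resp = requests.get("http://localhost:8000/docs", timeout=5)
print(resp.status_code)  # 200 means the FastAPI app is up and serving its API docs
```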

## Supported LLM Models

You can configure different LLM models by changing the `LLM_MODEL` variable in your `.env` file (a minimal sketch of reading this variable follows the list):

- `gemini/gemini-2.0-flash`
- `gemini/gemini-1.5-pro`
- `openai/gpt-4`
- `openai/gpt-3.5-turbo`
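
How the model string is consumed depends on the `LiteLLM` wrapper in `agents/llm/litellm.py`, which is not part of this diff; the sketch below only shows reading the variable from `.env` with a fallback default.

```python
# Illustrative only: resolve the configured model name from the environment.
import os

from dotenv import load_dotenv

load_dotenv()
model_name = os.getenv("LLM_MODEL", "gemini/gemini-2.0-flash")  # fall back to the default model
print(f"Using LLM model: {model_name}")
```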

## Project Structure

```
examples/weather_bot/
├── app/
│   ├── agent.py          # CLI agent implementation
│   └── ...
├── requirements.txt      # Python dependencies
├── .env                  # Environment variables (create this)
└── README.md             # This file
```


|
There was a problem hiding this comment. Choose a reason for hiding this commentThe reason will be displayed to describe this comment to others. Learn more. Same typo in image path. This image reference also uses "screenshorts" instead of "screenshots". -
+📝 Committable suggestion
Suggested change
🤖 Prompt for AI Agents |
||||||||

New file, the agent setup module (27 lines):

```python
from dotenv import load_dotenv
from agents.standard_agent import StandardAgent
from agents.llm.litellm import LiteLLM
from agents.memory.dict_memory import DictMemory
from agents.reasoner.react import ReACTReasoner
from .weather_tools import func_tools

load_dotenv()

# Step 1: Configure the LLM.
# Try changing to your own preferred model and add the API key in the .env file / environment variable.
llm = LiteLLM(max_tokens=50)

tools = func_tools
memory = DictMemory()

# Step 2: Pick a reasoner profile (single-file implementation)
custom_reasoner = ReACTReasoner(llm=llm, tools=tools, memory=memory, max_turns=5)

# Step 3: Wire everything together in the StandardAgent
weather_agent = StandardAgent(
    llm=llm,
    tools=tools,
    memory=memory,
    reasoner=custom_reasoner,
)
```

**Review comment (lines +12 to +13):** A 50-token limit will severely truncate most LLM responses; weather descriptions, forecasts, and conversational replies typically require 200-500+ tokens, so answers will come back incomplete or cut off. The comment also mentions changing the model, yet no model is actually passed to `LiteLLM` (the `LLM_MODEL` value from `.env` is never read). Proposed fix: import `os` and construct the LLM as `llm = LiteLLM(model=os.getenv("LLM_MODEL"), max_tokens=1024)`.
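
Since `weather_tools.py` is not part of this diff, the sketch below shows the kind of plain function that `func_tools` would expose as a FunctionTool. The function name, return format, and the way it gets registered are illustrative assumptions; the actual implementation lives in the example's `weather_tools.py` module.

```python
# Illustrative sketch only: the kind of callable weather_tools.py might wrap as a FunctionTool.
# The name `get_current_weather` and the return shape are assumptions, not the actual code.
import os

import requests


def get_current_weather(city: str) -> str:
    """Return a short human-readable summary of the current weather for `city`."""
    resp = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={"q": city, "appid": os.environ["OPENWEATHER_API_KEY"], "units": "metric"},
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    description = data["weather"][0]["description"]
    temp_c = data["main"]["temp"]
    return f"{city}: {description}, {temp_c}°C"
```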