This agent is an AI-powered conversational tool for accessing trends on X. Built with LangGraph for orchestration and the Agent Stack SDK, it specializes in identifying trending topics by country, researching the context behind each trend, and providing friendly, engaging summaries with emojis and source citations.
The agent is designed to run entirely locally using Ollama and the IBM Granite 4 model.
Below is a visual representation of the agent's logic flow, orchestrated by LangGraph:

```mermaid
graph TD
    Start((Start)) --> GetTrends[analyze_and_get_trends]
    GetTrends --> CountryDetect{Detect Country}
    CountryDetect -->|Country Found| BuildURL[trends24.in/country/]
    CountryDetect -->|No Country| MainURL[trends24.in/]
    BuildURL --> ExtractTrends[Extract Top 5]
    MainURL --> ExtractTrends
    ExtractTrends --> ResearchContext[research_trends_context]
    ResearchContext --> Loop[Search Reason for each Trend]
    Loop --> Synthesize[synthesize_report]
    Synthesize --> End((End))
```
- Python: Version 3.11 or higher.
- Dependency Management: `uv` is recommended for managing Python packages.
- Ollama: Required for running the local LLM (`granite4:tiny-h`).
Main packages used:

- `langchain` & `langgraph`: For orchestrating the agent's reasoning flow.
- `langchain-ollama`: To interact with the local LLM.
- `agentstack-sdk`: For the A2A protocol and server management.
- `ddgs`: Web search via DuckDuckGo.
- Clone the repository:

  ```bash
  git clone <repository-url>
  cd x_trends_agent
  ```

- Install dependencies:

  ```bash
  uv sync
  ```

- Install Ollama and pull the model:

  ```bash
  # Pull the required model
  ollama pull granite4:tiny-h
  ```

- Start the agent:

  ```bash
  uv run server
  ```

  The agent will start and be ready on `http://127.0.0.1:8002` (configurable via the `PORT` env var).
The agent uses a StateGraph (LangGraph) to process queries through a dynamic pipeline:
- Country Detection: Uses the LLM to identify if the user is asking about a specific region.
- Trend Retrieval: Scrapes `trends24.in` for the target country (or the global page) to get the top 5 trending topics.
- Context Research: For each of the top 5 trends, performs targeted web searches to find the "why" behind the trend.
- Friendly Synthesis: Generates a coherent report with emojis (💡, 📰, 🚀) and, most importantly, includes the source URLs of the news found.
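The trend-retrieval step boils down to pulling the first few list entries out of the scraped page. The sketch below shows that extraction against a simplified, assumed version of the `trends24.in` markup; the real page structure may differ, so treat the sample HTML and the regex as illustrative only.

```python
import re

# Assumed, simplified trends24.in-style markup; the live page's
# structure may differ, so this sample is illustrative only.
SAMPLE_HTML = """
<ol class="trend-card__list">
  <li><a href="#">TopicOne</a></li>
  <li><a href="#">TopicTwo</a></li>
  <li><a href="#">TopicThree</a></li>
  <li><a href="#">TopicFour</a></li>
  <li><a href="#">TopicFive</a></li>
  <li><a href="#">TopicSix</a></li>
</ol>
"""

def extract_top_trends(html: str, limit: int = 5) -> list[str]:
    # Grab the anchor text of each list item, then keep the first `limit`.
    topics = re.findall(r"<li><a[^>]*>([^<]+)</a></li>", html)
    return topics[:limit]

print(extract_top_trends(SAMPLE_HTML))
# ['TopicOne', 'TopicTwo', 'TopicThree', 'TopicFour', 'TopicFive']
```

A production scraper would use a real HTML parser rather than a regex, but the "top 5" truncation works the same way either way.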
- "What's trending in Mexico?"
- "Show me global trends on X today"
- "What's happening in Spain?"
- `src/langgraph_agents/agent.py`: Main logic of the agent, including the LangGraph definition and the A2A server.
- `pyproject.toml`: Dependency and script definitions.
This agent is built for demonstration purposes. Ensure you have Ollama running locally for the LLM to function properly.