feat: Enhance n8dex with multi-LLM, local search, and UI improvements
This commit introduces several key enhancements to the n8dex project:
1. **Multi-LLM Compatibility:**
* The backend now supports configurable LLM providers (Gemini, OpenRouter, DeepSeek) via environment variables (`LLM_PROVIDER`, `LLM_API_KEY`, etc.).
* The frontend includes a dropdown to select the desired LLM provider.
2. **Local Network Search:**
* Added functionality to search local network HTML content.
* Configuration via environment variables (`ENABLE_LOCAL_SEARCH`, `LOCAL_SEARCH_DOMAINS`, `SEARCH_MODE`).
* The frontend provides a "Search Scope" dropdown to control search behavior (Internet only, Local only, combined modes).
3. **LangSmith Monitoring Toggle:**
* Backend respects `LANGSMITH_ENABLED` environment variable for global control.
* The frontend UI includes a toggle so users can set their LangSmith tracing preference, which is passed to the backend.
4. **Frontend UI Enhancements:**
* Updated the overall theme to a brighter, more enterprise-friendly light theme.
* Added UI elements for selecting LLM provider, LangSmith preference, and search scope.
* Improved styling of chat messages and input forms.
5. **Backend Refinements & Testing:**
* Refactored backend configuration and graph logic to support new features.
* Added a suite of unit tests for backend components (configuration, graph logic, local search tool) to ensure stability.
6. **Documentation:**
* Updated `README.md` extensively to cover all new features, environment variables, and UI options.
Note: Integration of specific "Finance" and "HR" frontend sections is deferred pending example code.
---

**File changed: `README.md`** (83 additions, 13 deletions)
This project demonstrates a fullstack application using a React frontend and a LangGraph backend.

- 💬 Fullstack application with a React frontend and LangGraph backend.
- 🧠 Powered by a LangGraph agent for advanced research and conversational AI.
- 💡 **Multi-LLM Support:** Flexibility to use different LLM providers (Gemini, OpenRouter, DeepSeek).
- 🔍 Dynamic search query generation using the configured LLM.
- 🌐 Integrated web research via the Google Search API.
- 🏠 **Local Network Search:** Optional capability to search within configured local domains.
- 🔄 **Flexible Search Modes:** Control whether to search the internet, the local network, or both, and in which order.
- 🤔 Reflective reasoning to identify knowledge gaps and refine searches.
- 📄 Generates answers with citations from gathered sources.
- 🎨 **Updated UI Theme:** Modern, light theme for improved readability and a professional look.
- 🛠️ **Configurable Tracing:** LangSmith tracing can be enabled or disabled.
- 🔄 Hot-reloading for both frontend and backend during development.

### Upcoming Features

- Dedicated "Finance" and "HR" sections for specialized research tasks.

## Project Structure

The project is divided into two main directories:
Follow these steps to get the application running locally for development and testing.

**1. Prerequisites:**

- Node.js and npm (or yarn/pnpm)
- Python 3.8+
- **API Keys & Configuration:** The backend agent requires API keys depending on the chosen LLM provider and other features. See the "Configuration" section below for details on setting up your `.env` file in the `backend/` directory.

**2. Install Dependencies:**

**Backend:**

```bash
cd backend
pip install .
```

*Note: If you plan to use the Local Network Search feature, ensure you install its dependencies:*

```bash
pip install ".[local_search]"
```

*(Or `pip install requests beautifulsoup4` if you manage dependencies manually.)*

**Frontend:**

```bash
npm install
```

**3. Run Development Servers:**

```bash
make dev
```

This will run the backend and frontend development servers. Open your browser and navigate to the frontend development server URL (e.g., `http://localhost:5173/app`).

_Alternatively, you can run the backend and frontend development servers separately. For the backend, open a terminal in the `backend/` directory and run `langgraph dev`. The backend API will be available at `http://127.0.0.1:2024`. It will also open a browser window to the LangGraph UI. For the frontend, open a terminal in the `frontend/` directory and run `npm run dev`. The frontend will be available at `http://localhost:5173`._
## Configuration

Create a `.env` file in the `backend/` directory by copying `backend/.env.example`. Below are the available environment variables:

### Core Agent & LLM Configuration

- `GEMINI_API_KEY`: Your Google Gemini API key. Required if using "gemini" as the LLM provider for any task, or for Google Search functionality.
- `LLM_PROVIDER`: Specifies the primary LLM provider for core agent tasks (query generation, reflection, answer synthesis).
- `LLM_API_KEY`: The API key for the selected `LLM_PROVIDER`.
  - Example: If `LLM_PROVIDER="openrouter"`, this should be your OpenRouter API key.
- `OPENROUTER_MODEL_NAME`: The full model string if using OpenRouter (e.g., `"anthropic/claude-3-haiku"`). Used by the agent when task-specific models are not set.
- `DEEPSEEK_MODEL_NAME`: The model name if using DeepSeek (e.g., `"deepseek-chat"`). Used by the agent when task-specific models are not set.
- `QUERY_GENERATOR_MODEL`: Model used for generating search queries. Interpreted based on `LLM_PROVIDER`.
  - Default for Gemini: `"gemini-1.5-flash"`
- `REFLECTION_MODEL`: Model used for reflection and knowledge gap analysis. Interpreted based on `LLM_PROVIDER`.
  - Default for Gemini: `"gemini-1.5-flash"`
- `ANSWER_MODEL`: Model used for synthesizing the final answer. Interpreted based on `LLM_PROVIDER`.
  - Default for Gemini: `"gemini-1.5-pro"`
- `NUMBER_OF_INITIAL_QUERIES`: Number of initial search queries to generate. Default: `3`.
- `MAX_RESEARCH_LOOPS`: Maximum number of research refinement loops. Default: `2`.

### LangSmith Tracing

- `LANGSMITH_ENABLED`: Master switch to enable (`true`) or disable (`false`) LangSmith tracing for the backend. Default: `true`.
  - If `true`, the LangSmith environment variables below should also be set.
  - If `false`, tracing is globally disabled for the application process, and the UI toggle cannot override this.
- `LANGCHAIN_API_KEY`: Your LangSmith API key. Required if `LANGSMITH_ENABLED` is `true`.
- `LANGCHAIN_TRACING_V2`: Set to `"true"` to use the V2 tracing protocol. Usually managed by the `LANGSMITH_ENABLED` setting.
- `LANGCHAIN_ENDPOINT`: LangSmith API endpoint. Defaults to `"https://api.smith.langchain.com"`.
- `LANGCHAIN_PROJECT`: Name of the project in LangSmith.

### Local Network Search

- `ENABLE_LOCAL_SEARCH`: Set to `true` to enable searching within local network domains. Default: `false`.
- `LOCAL_SEARCH_DOMAINS`: A comma-separated list of base URLs or domains for local search.
- `SEARCH_MODE`: Defines the search behavior when both internet and local search capabilities are active.
  - `"internet_only"` (default): Searches only the public internet.
  - `"local_only"`: Searches only the configured local domains (requires `ENABLE_LOCAL_SEARCH=true` and `LOCAL_SEARCH_DOMAINS` to be set).
  - `"internet_then_local"`: Performs internet search first, then local search if enabled.
  - `"local_then_internet"`: Performs local search first if enabled, then internet search.
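Putting the variables above together, a minimal `backend/.env` for a Gemini setup with local search enabled might look like the following (all keys, the project name, and the domains are placeholders, not real values):

```bash
# LLM configuration
LLM_PROVIDER="gemini"
LLM_API_KEY="YOUR_PROVIDER_API_KEY"
GEMINI_API_KEY="YOUR_GEMINI_API_KEY"   # also required for Google Search functionality
QUERY_GENERATOR_MODEL="gemini-1.5-flash"
REFLECTION_MODEL="gemini-1.5-flash"
ANSWER_MODEL="gemini-1.5-pro"

# LangSmith tracing
LANGSMITH_ENABLED="true"
LANGCHAIN_API_KEY="YOUR_LANGSMITH_API_KEY"
LANGCHAIN_PROJECT="my-n8dex-project"   # placeholder project name

# Local network search
ENABLE_LOCAL_SEARCH="true"
LOCAL_SEARCH_DOMAINS="http://intranet.mycompany.com,http://docs.internal"
SEARCH_MODE="internet_then_local"
```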
## Frontend UI Settings

The user interface provides several controls to customize the agent's behavior for each query:

- **Effort Level:** (Low, Medium, High) - Adjusts the number of initial queries and maximum research loops.
- **Reasoning Model:** (Flash/Fast, Pro/Advanced) - Selects a class of model for reasoning tasks (reflection, answer synthesis). The actual model used depends on the selected LLM Provider.
- **LLM Provider:** (Gemini, OpenRouter, DeepSeek) - Chooses the primary LLM provider for the current query. Requires the corresponding API keys to be configured on the backend.
- **LangSmith Monitoring:** (Toggle Switch) - If LangSmith is enabled globally on the backend, this allows users to toggle tracing for their specific session/query.
- **Search Scope:** (Internet Only, Local Only, Internet then Local, Local then Internet) - Defines where the agent should search for information. "Local" options require backend configuration for local search.
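Conceptually, these UI controls travel with each query as per-request overrides of the backend defaults. The sketch below is purely illustrative: the field names and the effort-to-parameter mapping are hypothetical (only the "medium" values match the documented defaults of 3 initial queries and 2 loops), not the actual request schema.

```python
# Hypothetical per-request override payload; field names are illustrative only.
request_overrides = {
    "llm_provider": "openrouter",          # LLM Provider dropdown
    "reasoning_model": "pro",              # Reasoning Model: flash/fast vs. pro/advanced
    "effort": "high",                      # Effort Level: low/medium/high
    "langsmith_enabled": False,            # LangSmith toggle (honored only if enabled globally)
    "search_mode": "local_then_internet",  # Search Scope dropdown
}

# A plausible effort-to-parameter mapping: (initial queries, max research loops).
# "medium" mirrors the documented backend defaults; the other rows are guesses.
EFFORT_PRESETS = {"low": (1, 1), "medium": (3, 2), "high": (5, 3)}
initial_queries, max_loops = EFFORT_PRESETS[request_overrides["effort"]]
```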
## How the Backend Agent Works (High-Level)

The core of the backend is a LangGraph agent defined in `backend/src/agent/graph.py`. It follows these steps:

1. **Configure:** Reads settings from environment variables and per-request UI selections.
2. **Generate Initial Queries:** Based on your input and the configured model, it generates initial search queries.
3. **Web/Local Research:** Depending on the `SEARCH_MODE`:
   * Performs searches using the Google Search API (for internet results).
   * Performs searches using the custom `LocalSearchTool` against configured domains (for local results).
   * Combines results if applicable.
4. **Reflection & Knowledge Gap Analysis:** The agent analyzes the search results to determine if the information is sufficient or if there are knowledge gaps.
5. **Iterative Refinement:** If gaps are found, it generates follow-up queries and repeats the research and reflection steps.
6. **Finalize Answer:** Once research is sufficient, the agent synthesizes the information into a coherent answer with citations, using the configured answer model.
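The steps above can be sketched as a plain-Python loop. This is not the actual LangGraph graph code: the function names are illustrative, and the stubs stand in for the real search and LLM nodes. It does show how `SEARCH_MODE` controls which backends run, and in what order.

```python
def gather(query, search_mode, internet_search, local_search):
    """Dispatch one query according to SEARCH_MODE; order of the results matters."""
    if search_mode == "internet_only":
        return internet_search(query)
    if search_mode == "local_only":
        return local_search(query)
    if search_mode == "internet_then_local":
        return internet_search(query) + local_search(query)
    if search_mode == "local_then_internet":
        return local_search(query) + internet_search(query)
    raise ValueError(f"unknown SEARCH_MODE: {search_mode}")


def run_agent(question, search_mode, max_loops, *, generate_queries,
              internet_search, local_search, reflect, synthesize):
    """Illustrative research loop: generate -> search -> reflect -> refine -> answer."""
    queries = generate_queries(question)              # 2. initial queries
    results = []
    for _ in range(max_loops):
        for q in queries:                             # 3. web/local research
            results.extend(gather(q, search_mode, internet_search, local_search))
        sufficient, follow_ups = reflect(question, results)   # 4. reflection
        if sufficient:
            break
        queries = follow_ups                          # 5. iterative refinement
    return synthesize(question, results)              # 6. finalize answer
```

With stub functions wired in, `run_agent("q", "internet_then_local", 2, ...)` returns the combined internet-then-local results after a single loop once reflection reports the information is sufficient.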
## Deployment

**2. Run the Production Server:**

Adjust the `docker-compose.yml` or your deployment environment to include all necessary environment variables, as described in the "Configuration" section. Example:

```bash
# Ensure your .env file (if used by docker-compose) or environment variables are set,
# e.g., GEMINI_API_KEY, LLM_PROVIDER, LLM_API_KEY, LANGSMITH_API_KEY (if LangSmith is enabled), etc.
docker-compose up
```

Open your browser and navigate to `http://localhost:8123/app/` to see the application. The API will be available at `http://localhost:8123`.
## Technologies Used

- [Tailwind CSS](https://tailwindcss.com/) - For styling.
- [Shadcn UI](https://ui.shadcn.com/) - For components.
- [LangGraph](https://github.com/langchain-ai/langgraph) - For building the backend research agent.
- LLMs: [Google Gemini](https://ai.google.dev/models/gemini), adaptable to others like [OpenRouter](https://openrouter.ai/) and [DeepSeek](https://www.deepseek.com/).
- Search: Google Search API, custom Local Network Search (Python `requests` & `BeautifulSoup`).
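The local search tool amounts to fetching pages from the configured domains and extracting their visible text for matching. The project itself uses `requests` and `BeautifulSoup`; the dependency-free sketch below covers only the text-extraction half with the standard library's `html.parser`, and the `matches` helper is a naive stand-in for real relevance scoring (the actual `LocalSearchTool` API may differ).

```python
from html.parser import HTMLParser


class TextExtractor(HTMLParser):
    """Collect visible text, skipping the contents of <script> and <style>."""

    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.parts.append(data.strip())


def page_text(html: str) -> str:
    """Return the visible text of an HTML page as a single string."""
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(parser.parts)


def matches(html: str, query: str) -> bool:
    """Naive case-insensitive keyword match against the page's visible text."""
    return query.lower() in page_text(html).lower()
```

In the real tool, each configured domain's pages would be fetched with `requests` before being run through an extractor like this.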
"description": "The LLM provider to use (e.g., 'gemini', 'openrouter', 'deepseek'). Environment variable: LLM_PROVIDER"
15
+
},
16
+
)
17
+
18
+
llm_api_key: Optional[str] =Field(
19
+
default=None,
20
+
metadata={
21
+
"description": "The API key for the selected LLM provider. Environment variable: LLM_API_KEY"
22
+
},
23
+
)
24
+
25
+
openrouter_model_name: Optional[str] =Field(
26
+
default=None,
27
+
metadata={
28
+
"description": "The specific OpenRouter model string (e.g., 'anthropic/claude-3-haiku'). Environment variable: OPENROUTER_MODEL_NAME"
29
+
},
30
+
)
31
+
32
+
deepseek_model_name: Optional[str] =Field(
33
+
default=None,
34
+
metadata={
35
+
"description": "The specific DeepSeek model (e.g., 'deepseek-chat'). Environment variable: DEEPSEEK_MODEL_NAME"
36
+
},
37
+
)
38
+
11
39
query_generator_model: str=Field(
12
-
default="gemini-2.0-flash",
40
+
default="gemini-1.5-flash",
13
41
metadata={
14
-
"description": "The name of the language model to use for the agent's query generation."
42
+
"description": "The name of the language model to use for the agent's query generation. Interpreted based on llm_provider (e.g., 'gemini-1.5-flash' for Gemini, part of model string for OpenRouter). Environment variable: QUERY_GENERATOR_MODEL"
15
43
},
16
44
)
17
45
18
46
reflection_model: str=Field(
19
-
default="gemini-2.5-flash-preview-04-17",
47
+
default="gemini-1.5-flash",
20
48
metadata={
21
-
"description": "The name of the language model to use for the agent's reflection."
49
+
"description": "The name of the language model to use for the agent's reflection. Interpreted based on llm_provider. Environment variable: REFLECTION_MODEL"
22
50
},
23
51
)
24
52
25
53
answer_model: str=Field(
26
-
default="gemini-2.5-pro-preview-05-06",
54
+
default="gemini-1.5-pro",
27
55
metadata={
28
-
"description": "The name of the language model to use for the agent's answer."
56
+
"description": "The name of the language model to use for the agent's answer. Interpreted based on llm_provider. Environment variable: ANSWER_MODEL"
29
57
},
30
58
)
31
59
@@ -39,6 +67,44 @@ class Configuration(BaseModel):
39
67
metadata={"description": "The maximum number of research loops to perform."},
40
68
)
41
69
70
+
langsmith_enabled: bool=Field(
71
+
default=True,
72
+
metadata={
73
+
"description": "Controls LangSmith tracing. Set to false to disable. If true, ensure LANGCHAIN_API_KEY and other relevant LangSmith environment variables (LANGCHAIN_TRACING_V2, LANGCHAIN_ENDPOINT, LANGCHAIN_PROJECT) are set. Environment variable: LANGSMITH_ENABLED"
74
+
},
75
+
)
76
+
77
+
enable_local_search: bool=Field(
78
+
default=False,
79
+
metadata={
80
+
"description": "Enable or disable local network search functionality. Environment variable: ENABLE_LOCAL_SEARCH"
81
+
},
82
+
)
83
+
84
+
local_search_domains: List[str] =Field(
85
+
default_factory=list, # Use default_factory for mutable types like list
86
+
metadata={
87
+
"description": "Comma-separated list of base URLs or domains for local network search (e.g., 'http://intranet.mycompany.com,http://docs.internal'). Environment variable: LOCAL_SEARCH_DOMAINS"
0 commit comments