WebMind is an intelligent search assistant that provides direct answers to user queries by combining live web content with AI analysis. Unlike traditional search engines that return a list of links, it scrapes the content of the top search results and uses a locally hosted Large Language Model (LLM), served through Ollama, to extract and present the most relevant answer along with its source URL. Because the model runs locally, the analysis stays private and you keep control over which model is used. The main components are:
- User Interface: Built with Streamlit, allowing users to input queries and view results.
- Web Search: Utilizes DuckDuckGo to retrieve a list of relevant URLs.
- Web Scraping: Employs Scrapy and Selenium to fetch and render web page content.
- AI Analysis: Uses Ollama to host and query the LLM for content analysis.
- Orchestration: Managed by an agent that coordinates the entire pipeline and provides real-time status updates (a sketch of the end-to-end flow follows this list).
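The rough flow is: search for candidate URLs, fetch and render each page, then ask the local model whether the page answers the query. The sketch below illustrates that flow; the function names and the prompt are hypothetical, not the project's actual API, and it assumes the duckduckgo_search, selenium, and ollama Python packages with an Ollama server running locally.

```python
# Illustrative end-to-end flow; search_web, fetch_page, and ask_llm are
# assumptions for this sketch, not the project's actual functions.
from duckduckgo_search import DDGS                    # web search
from selenium import webdriver                        # JavaScript-capable page fetching
from selenium.webdriver.chrome.service import Service
import ollama                                         # client for the local Ollama server


def search_web(query, max_results=5):
    """Return the URLs of the top DuckDuckGo results."""
    with DDGS() as ddgs:
        return [r["href"] for r in ddgs.text(query, max_results=max_results)]


def fetch_page(url, driver_path):
    """Render a page with headless Chrome and return its HTML."""
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")
    driver = webdriver.Chrome(service=Service(driver_path), options=options)
    try:
        driver.get(url)
        return driver.page_source
    finally:
        driver.quit()


def ask_llm(query, page_text, model="gemma3:4b"):
    """Ask the locally hosted model to answer the query from the page content."""
    response = ollama.chat(model=model, messages=[{
        "role": "user",
        "content": (f"Question: {query}\n\nPage content:\n{page_text[:4000]}\n\n"
                    "Answer using only this page, or reply NOT FOUND."),
    }])
    return response["message"]["content"]
```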
Prerequisites:

- Python 3.8+
- Dependencies: see `requirements.txt`
- A running Ollama service with the `gemma3:4b` model (configurable)
- Chrome browser and a matching ChromeDriver
To install the Python dependencies:

```bash
pip install -r requirements.txt
```

To use a different Ollama model, update `CHAT_MODEL` in `config.py`. For integrating another LLM, modify the `llm_service.py` module accordingly.
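As an illustration, a `config.py` along these lines could pick up both settings from the environment (a hypothetical sketch assuming python-dotenv; the actual module may differ):

```python
# Hypothetical sketch of config.py -- the real module may look different.
import os

from dotenv import load_dotenv  # assumes python-dotenv is available

load_dotenv()  # load variables from the .env file in the project root

CHAT_MODEL = os.getenv("CHAT_MODEL", "gemma3:4b")
SELENIUM_DRIVER_PATH = os.getenv("SELENIUM_DRIVER_PATH", "chromedriver")
```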
To set up the project:

1. Clone the repository:

   ```bash
   git clone https://github.com/Burhanuddin-2001/WebMind.git
   cd WebMind
   ```

2. Create and activate a virtual environment:

   ```bash
   python -m venv venv
   source venv/bin/activate  # On Windows: venv\Scripts\activate
   ```
3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```
4. Set up ChromeDriver:
   - Download the ChromeDriver version that matches your Chrome browser from the ChromeDriver downloads page.
   - Place the downloaded `chromedriver` executable in a directory of your choice and update `SELENIUM_DRIVER_PATH` in the `.env` file to point to its location (a quick version check is sketched below).
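To confirm the two match, you can compare version numbers from a terminal (the `google-chrome` command is the Linux name; on other platforms check the browser's About page):

```bash
# The major version numbers printed by these two commands should match.
google-chrome --version
path/to/chromedriver --version
```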
5. Configure environment variables:
   - Create a `.env` file in the root directory:

     ```
     SELENIUM_DRIVER_PATH=path/to/chromedriver
     CHAT_MODEL=gemma3:4b
     ```

   - Ensure the Ollama service is running and accessible (commands for pulling the model and starting the server are sketched below).
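With the standard Ollama CLI, pulling the model and starting the server typically looks like this (adjust the model name if you changed `CHAT_MODEL`):

```bash
# Download the default model and start the local server
# (it listens on http://localhost:11434 by default).
ollama pull gemma3:4b
ollama serve
```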
To use the tool:

1. Launch the Streamlit app:

   ```bash
   streamlit run streamlit_app.py
   ```

2. Enter your query in the web interface and click "Search".
3. View the results, including real-time status updates and, if an answer is found, the final answer together with its source URL.
Contributions are welcome! To contribute:
- Fork the repository.
- Create a new branch for your feature or bugfix.
- Commit your changes and push to your fork.
- Open a pull request to the main repository.
Please follow the project's contribution guidelines, if provided; a typical Git workflow is sketched below.
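For example (the fork URL and branch name are placeholders, not part of the project):

```bash
# Illustrative contribution workflow; replace <your-username> and the branch name.
git clone https://github.com/<your-username>/WebMind.git
cd WebMind
git checkout -b feature/my-change
# ...make and test your changes...
git add .
git commit -m "Describe your change"
git push origin feature/my-change
# Then open a pull request against Burhanuddin-2001/WebMind on GitHub.
```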
This project is licensed under the MIT License.
- Author: Burhanuddin
- LinkedIn: www.linkedin.com/in/burhanuddin-cyber