
Revert "rag" #8

Merged
david20571015 merged 1 commit into main from revert-7-rag
Mar 10, 2025

Conversation

@david20571015
Contributor

Reverts #7

This reverts commit 9c703bd.
Copilot AI review requested due to automatic review settings March 10, 2025 05:38
@david20571015 david20571015 merged commit 3889924 into main Mar 10, 2025
1 check passed
@david20571015 david20571015 deleted the revert-7-rag branch March 10, 2025 05:38

@gemini-code-assist gemini-code-assist bot left a comment


Hello @david20571015, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

Summary of Changes

This pull request reverts the changes introduced in #7, effectively removing the 'rag' service and associated configurations. The changes include removing the RAG-related files, reverting modifications to existing files, and updating the configuration to reflect the removal of the RAG service. The configuration now includes search and summarize services.

Highlights

  • Revert RAG Service: The primary focus is to remove the RAG (Retrieval-Augmented Generation) service from the llm-backend.
  • Configuration Changes: The configuration files are updated to remove RAG-specific settings and introduce search and summarize services.
  • Dockerfile and Docker Compose Updates: The Dockerfile and docker-compose.yaml files are modified to reflect the removal of RAG dependencies and configurations.

Changelog

  • .dockerignore
    • Removed entries related to RAG, such as configs/, llm_backend/protos/, and other related files/directories.
  • .gitignore
    • Added configs/* to ignore all files in the configs directory except for configs/example.toml.
  • Dockerfile
    • Removed the entire Dockerfile, effectively reverting to a previous state without RAG dependencies.
  • README.md
    • Updated the usage section to refer to configs/example.toml for configuration options and removed the client example.
  • configs/config.toml
    • Removed the entire config.toml file, which contained RAG-specific configurations.
  • configs/example.toml
    • Added a new example configuration file with settings for search and summarize services.
  • docker-compose.yaml
    • Removed the entire docker-compose.yaml file, reverting to a previous state without RAG-specific services.
  • llm_backend/__init__.py
    • Removed RAG-related imports and service setup, and added setup for search and summarize services.
    • Introduced ServiceConfig to manage configurations for search and summarize services.
  • llm_backend/rag/__init__.py
    • Removed all content related to the RAG service.
  • llm_backend/rag/content_formatters.py
    • Removed all content related to the RAG service.
  • llm_backend/rag/service.py
    • Removed all content related to the RAG service.
  • llm_backend/rag/workflow.py
    • Removed all content related to the RAG service.
  • llm_backend/search/__init__.py
    • Added search service components.
  • llm_backend/search/config.py
    • Added configuration settings for the search service.
  • llm_backend/search/service.py
    • Added the search service implementation.
  • llm_backend/summarize/__init__.py
    • Added summarize service components.
  • llm_backend/summarize/config.py
    • Added configuration settings for the summarize service.
  • llm_backend/summarize/service.py
    • Added the summarize service implementation.
  • llm_backend/utils.py
    • Added utility function contains_placeholder for validating template strings.
  • protos
    • Subproject commit updated to a6ed1e50003357b55e503ed639e77a15f296da6e.
  • pyproject.toml
    • Removed llama-index-core dependency.
  • scripts/client.py
    • Removed the client script.
  • scripts/serve.py
    • Updated the default config path to configs/example.toml and replaced RAG service setup with search and summarize service setup.
  • uv.lock
    • Removed revision = 1 and llama-index-core dependency.
Using Gemini Code Assist

The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.

Invoking Gemini

You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.

  • Code Review — /gemini review: Performs a code review for the current pull request in its current state.
  • Pull Request Summary — /gemini summary: Provides a summary of the current pull request in its current state.
  • Comment — @gemini-code-assist: Responds in comments when explicitly tagged, both in issue comments and review comments.
  • Help — /gemini help: Displays a list of available commands.

Customization

To customize the Gemini Code Assist for GitHub experience, repository maintainers can provide a configuration file and/or a custom code review style guide (such as PEP-8 for Python) by adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.

Limitations & Feedback

Gemini Code Assist is currently in preview and may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments to provide feedback.

You can also get AI-powered code generation, chat, as well as code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.


Did you know?

TOML, the configuration file format used here, stands for Tom's Obvious, Minimal Language. It is designed to be easy to read due to its simple semantics.

Footnotes

  1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double check it and use code with caution.

@gemini-code-assist

Warning

Gemini encountered an error creating the review. You can try again by commenting /gemini review.

Contributor

Copilot AI left a comment


Copilot encountered an error and was unable to review this pull request. You can try again by re-requesting a review.


2 participants