= RAGStack Python packages

RAGStack comes with a set of Python packages that provide the necessary tools to implement the RAG pattern in your applications.

. `ragstack-ai`: All-in-one package that contains all components supported by RAGStack. While this is the most convenient package to use, it may be heavier than necessary for some use cases.
. `ragstack-ai-langchain`: This package is meant for users who want to use RAGStack with the LangChain framework.
. `ragstack-ai-llamaindex`: This package is meant for users who want to use RAGStack with the LlamaIndex framework.
. `ragstack-ai-colbert`: This package contains the implementation of ColBERT retrieval.
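
Each of the packages above is published on PyPI, so installation follows the usual pip workflow (a minimal sketch):

```shell
# Install the all-in-one package (most convenient, but heaviest)
pip install ragstack-ai

# Or install only the framework-specific package you need
pip install ragstack-ai-langchain
pip install ragstack-ai-llamaindex
pip install ragstack-ai-colbert
```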


== Supported integrations for `ragstack-ai-langchain`
The `ragstack-ai-langchain` package includes the minimum set of dependencies for using LangChain with {astra_db}.
LLMs, embeddings, and third-party providers are not included in this package by default, except for OpenAI and Azure OpenAI.

To use LLMs, embeddings, or third-party providers, you can leverage `ragstack-ai-langchain` extras:

. `ragstack-ai-langchain[google]` lets you work with https://python.langchain.com/docs/integrations/platforms/google[Google Vertex AI and Google Gemini API]{external-link-icon}.
. `ragstack-ai-langchain[nvidia]` lets you work with https://python.langchain.com/docs/integrations/providers/nvidia/[NVIDIA hosted API endpoints for NVIDIA AI Foundation Models]{external-link-icon}.

Additional LangChain packages should work out of the box, although you need to manage the packages and their dependencies yourself.
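
The extras above map onto pip's optional-dependency bracket syntax, for example:

```shell
# Install ragstack-ai-langchain together with the Google integrations
pip install "ragstack-ai-langchain[google]"

# Multiple extras can be combined in a single install
pip install "ragstack-ai-langchain[google,nvidia]"
```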


== Supported integrations for `ragstack-ai-llamaindex`

The `ragstack-ai-llamaindex` package includes the minimum set of dependencies for using LlamaIndex with {astra_db}.
LLMs, embeddings, and third-party providers are not included in this package by default, except for OpenAI.

To use LLMs, embeddings, or third-party providers, you can leverage `ragstack-ai-llamaindex` extras:

. `ragstack-ai-llamaindex[google]` lets you work with https://docs.llamaindex.ai/en/stable/examples/llm/vertex/[Google Vertex AI]{external-link-icon} and https://docs.llamaindex.ai/en/stable/examples/llm/gemini/[Google Gemini API]{external-link-icon}.
. `ragstack-ai-llamaindex[azure]` lets you work with https://docs.llamaindex.ai/en/stable/examples/llm/azure_openai/[Azure OpenAI]{external-link-icon}.
. `ragstack-ai-llamaindex[bedrock]` lets you work with https://docs.llamaindex.ai/en/stable/examples/llm/bedrock/[AWS Bedrock]{external-link-icon}.

Additional LlamaIndex packages should work out of the box, although you need to manage the packages and their dependencies yourself.
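
As with the LangChain package, these extras use pip's bracket syntax, e.g.:

```shell
# Install ragstack-ai-llamaindex with one or more optional providers
pip install "ragstack-ai-llamaindex[google]"
pip install "ragstack-ai-llamaindex[azure,bedrock]"
```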


== ColBERT with `ragstack-ai-langchain` and `ragstack-ai-llamaindex`

The `colbert` module provides a vanilla implementation of ColBERT retrieval. It is not tied to any specific framework and can be used with any of the RAGStack packages.

If you want to use ColBERT with LangChain or LlamaIndex, you can use the following extras:

. `ragstack-ai-langchain[colbert]`
. `ragstack-ai-llamaindex[colbert]`
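
For example, to pull in the ColBERT module alongside the LangChain package:

```shell
pip install "ragstack-ai-langchain[colbert]"
```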