35 changes: 35 additions & 0 deletions ai/generative-ai-service/oci_langflow/LICENSE.txt
@@ -0,0 +1,35 @@
Copyright (c) 2024 Oracle and/or its affiliates.

The Universal Permissive License (UPL), Version 1.0

Subject to the condition set forth below, permission is hereby granted to any
person obtaining a copy of this software, associated documentation and/or data
(collectively the "Software"), free of charge and under any and all copyright
rights in the Software, and any and all patent rights owned or freely
licensable by each licensor hereunder covering either (i) the unmodified
Software as contributed to or provided by such licensor, or (ii) the Larger
Works (as defined below), to deal in both

(a) the Software, and
(b) any piece of software and/or hardware listed in the lrgrwrks.txt file if
one is included with the Software (each a "Larger Work" to which the Software
is contributed by such licensors),

without restriction, including without limitation the rights to copy, create
derivative works of, display, perform, and distribute the Software and make,
use, sell, offer for sale, import, export, have made, and have sold the
Software and the Larger Work(s), and to sublicense the foregoing rights on
either these or other terms.

This license is subject to the following condition:
The above copyright notice and either this complete permission notice or at
a minimum a reference to the UPL must be included in all copies or
substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
25 changes: 15 additions & 10 deletions ai/generative-ai-service/oci_langflow/README.md
@@ -1,16 +1,21 @@
# Integration of OCI Generative AI in Langflow

This repository contains all the code for a prototype of the integration of OCI Generative AI in Langflow
[![License: UPL](https://img.shields.io/badge/license-UPL-green)](https://img.shields.io/badge/license-UPL-green) [![Quality gate](https://sonarcloud.io/api/project_badges/quality_gate?project=oracle-devrel_test)](https://sonarcloud.io/dashboard?id=oracle-devrel_test)

Reviewed: 23.01.2025

# **Link to code**
[Code](https://github.com/luigisaetta/oci_langflow/tree/main)
## Introduction
This repository contains the code for a prototype integration of OCI Generative AI with Langflow.

# License

Reviewed: 25.06.2025

## Security
Please consult the [security guide](./SECURITY.md) for our responsible security
vulnerability disclosure process.

## License
Copyright (c) 2024 Oracle and/or its affiliates.

Licensed under the Universal Permissive License (UPL), Version 1.0.

See [LICENSE](https://github.com/oracle-devrel/technology-engineering/blob/main/LICENSE) for more details.

See [LICENSE](LICENSE.txt) for more details.

ORACLE AND ITS AFFILIATES DO NOT PROVIDE ANY WARRANTY WHATSOEVER, EXPRESS OR IMPLIED, FOR ANY SOFTWARE, MATERIAL OR CONTENT OF ANY KIND CONTAINED OR PRODUCED WITHIN THIS REPOSITORY, AND IN PARTICULAR SPECIFICALLY DISCLAIM ANY AND ALL IMPLIED WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE. FURTHERMORE, ORACLE AND ITS AFFILIATES DO NOT REPRESENT THAT ANY CUSTOMARY SECURITY REVIEW HAS BEEN PERFORMED WITH RESPECT TO ANY SOFTWARE, MATERIAL OR CONTENT CONTAINED OR PRODUCED WITHIN THIS REPOSITORY. IN ADDITION, AND WITHOUT LIMITING THE FOREGOING, THIRD PARTIES MAY HAVE POSTED SOFTWARE, MATERIAL OR CONTENT TO THIS REPOSITORY WITHOUT ANY REVIEW. USE AT YOUR OWN RISK.
@@ -0,0 +1,82 @@
"""
Custom integration with Langflow and OCI Embeddings Model

Author: L. Saetta (Oracle)

"""

from langchain_community.embeddings import OCIGenAIEmbeddings

from langflow.base.models.model import LCModelComponent
from langflow.io import DropdownInput, StrInput, Output, SecretStrInput
from langflow.field_typing import Embeddings

class OCIEmbeddingsComponent(LCModelComponent):
"""
This class integrates the OCI Embeddings Model with Langflow.

Notes:
        * Security: only API_KEY auth is wired in for now; put your key pair in $HOME/.oci

"""

display_name = "OCI Cohere Embeddings"
description = "Generate Embeddings using OCI Cohere models."

inputs = [
# example of dropdown
DropdownInput(
name="auth_type",
display_name="auth_type",
info="The type of auth_type to use for the chat model",
advanced=True,
options=[
"API_KEY",
"RESOURCE_PRINCIPAL",
],
value="API_KEY",
),
DropdownInput(
name="model",
display_name="Model",
advanced=True,
options=[
"cohere.embed-english-v3.0",
"cohere.embed-multilingual-v3.0",
],
value="cohere.embed-english-v3.0",
),
StrInput(
name="service_endpoint",
display_name="Service Endpoint",
info="OCI Service Endpoint URL",
required=True,
),
SecretStrInput(
name="compartment_id",
display_name="Compartment ID",
info="OCI Compartment OCID",
),
]

outputs = [
Output(display_name="Embeddings", name="embeddings", method="build_embeddings"),
]

def build_model(self) -> Embeddings:
"""
build the embeddings model
"""
return self.build_embeddings()

def build_embeddings(self) -> Embeddings:
"""
build the embeddings model
"""
# default truncate strategy is END
return OCIGenAIEmbeddings(
auth_type=self.auth_type,
model_id=self.model,
service_endpoint=self.service_endpoint,
compartment_id=self.compartment_id,
)
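For reference, here is a minimal standalone sketch of what `build_embeddings()` produces, calling the same `OCIGenAIEmbeddings` class directly outside Langflow. API_KEY auth with a key pair in `$HOME/.oci` is assumed, and the compartment OCID below is a placeholder.

```python
from langchain_community.embeddings import OCIGenAIEmbeddings

# same class the component wraps; the compartment OCID is a placeholder
embed_model = OCIGenAIEmbeddings(
    auth_type="API_KEY",
    model_id="cohere.embed-english-v3.0",
    service_endpoint="https://inference.generativeai.eu-frankfurt-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..example",
)

# embed a single query and a small batch of documents
query_vec = embed_model.embed_query("What does this flow do?")
doc_vecs = embed_model.embed_documents(["Langflow is a visual builder for LLM flows."])
print(len(query_vec), len(doc_vecs))
```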
@@ -0,0 +1,77 @@
"""
Custom integration with Langflow and OCI Chat Model

Author: L. Saetta (Oracle)

This is demo code; in a real project you need to customise it to fit your needs:
* temperature
* max_tokens
* models list
"""

from langflow.base.models.model import LCModelComponent
from langflow.inputs import StrInput, DropdownInput, SecretStrInput
from langflow.field_typing import LanguageModel
from langchain_community.chat_models.oci_generative_ai import ChatOCIGenAI


class OCIChatComponent(LCModelComponent):
"""
This class integrates the OCI Chat Model with Langflow.

Notes:
        * Security: only API_KEY auth is wired in for now; put your key pair in $HOME/.oci

"""

display_name = "OCI Chat Model"
description = "OCI's Generative AI Chat Model."

inputs = [
*LCModelComponent._base_inputs,
DropdownInput(
name="auth_type",
display_name="auth_type",
info="The type of auth_type to use for the chat model",
advanced=True,
options=[
"API_KEY",
"RESOURCE_PRINCIPAL",
],
value="API_KEY",
),
DropdownInput(
name="model_id",
display_name="model_id",
info="The model_id to use for the chat model",
advanced=True,
options=[
"meta.llama-3.3-70b-instruct",
"cohere.command-r-plus-08-2024",
"meta.llama-3.1-405b-instruct",
],
value="meta.llama-3.1-70b-instruct",
),
StrInput(
name="service_endpoint",
display_name="Service Endpoint",
info="OCI Service Endpoint URL",
value="https://inference.generativeai.eu-frankfurt-1.oci.oraclecloud.com",
),
SecretStrInput(
name="compartment_id",
display_name="Compartment ID",
info="OCI Compartment OCID",
),
]

    def build_model(self) -> LanguageModel:
        """
        Build the OCI chat model with fixed temperature and max_tokens.
        """
        chat_model = ChatOCIGenAI(
auth_type=self.auth_type,
model_id=self.model_id,
service_endpoint=self.service_endpoint,
compartment_id=self.compartment_id,
model_kwargs={"temperature": 0.1, "max_tokens": 1024},
)

return chat_model
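As a quick check of the component's output, here is a minimal sketch that builds the same `ChatOCIGenAI` model directly and sends it one message. The endpoint matches the component's Frankfurt default; the compartment OCID is a placeholder.

```python
from langchain_core.messages import HumanMessage
from langchain_community.chat_models.oci_generative_ai import ChatOCIGenAI

# mirrors build_model(); the compartment OCID is a placeholder
llm = ChatOCIGenAI(
    auth_type="API_KEY",
    model_id="meta.llama-3.3-70b-instruct",
    service_endpoint="https://inference.generativeai.eu-frankfurt-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..example",
    model_kwargs={"temperature": 0.1, "max_tokens": 1024},
)

response = llm.invoke([HumanMessage(content="Say hello in one short sentence.")])
print(response.content)
```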
@@ -0,0 +1,111 @@
"""
Custom integration with Langflow and OCI Vector Store

Author: L. Saetta (Oracle)

"""

import oracledb
from langchain_community.vectorstores.oraclevs import OracleVS
from langchain_community.vectorstores.utils import DistanceStrategy

from langflow.base.vectorstores.model import (
LCVectorStoreComponent,
check_cached_vector_store,
)
from langflow.helpers.data import docs_to_data
from langflow.io import (
HandleInput,
IntInput,
StrInput,
SecretStrInput,
MessageTextInput,
)
from langflow.schema import Data


class OCIVectorStoreComponent(LCVectorStoreComponent):
"""
Wrapper for the OCI Vector Store
"""

display_name = "OCIVectorStore"
description = "OCI Vector Store based on 23AI"
name = "ocivector"

inputs = [
SecretStrInput(name="db_user", required=True),
SecretStrInput(name="db_pwd", required=True),
SecretStrInput(name="dsn", required=True),
SecretStrInput(name="wallet_dir", required=True),
SecretStrInput(name="wallet_pwd", required=True),
StrInput(name="collection_name", display_name="Table", required=True),
        # expose the search query as an input so it can be connected in the flow
MessageTextInput(
name="search_query",
display_name="search_query",
info="Enter the search query.",
),
*LCVectorStoreComponent.inputs,
HandleInput(
name="embedding", display_name="Embedding", input_types=["Embeddings"]
),
IntInput(
name="number_of_results",
display_name="Number of Results",
info="Number of results to return.",
value=4,
advanced=True,
),
]

def handle_credentials(self) -> dict:
"""
this function organizes the parameters to connect
to DB
"""
_connect_args_vector = {
"user": self.db_user,
"password": self.db_pwd,
"dsn": self.dsn,
"config_dir": self.wallet_dir,
"wallet_location": self.wallet_dir,
"wallet_password": self.wallet_pwd,
}
return _connect_args_vector

@check_cached_vector_store
    def build_vector_store(self) -> OracleVS:
        """
        Open the database connection and build the OracleVS vector store.
        """
        connect_args_vector = self.handle_credentials()

conn = oracledb.connect(**connect_args_vector)

v_store = OracleVS(
client=conn,
table_name=self.collection_name,
distance_strategy=DistanceStrategy.COSINE,
embedding_function=self.embedding,
)
return v_store

def search_documents(self) -> list[Data]:
"""
no changes needed here
"""
vector_store = self.build_vector_store()

if (
self.search_query
and isinstance(self.search_query, str)
and self.search_query.strip()
):
docs = vector_store.similarity_search(
query=self.search_query,
k=self.number_of_results,
)

data = docs_to_data(docs)
self.status = data
return data
return []
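For context, here is a minimal standalone sketch of the retrieval path the component implements: connect to Oracle Database 23ai with python-oracledb through a wallet, wrap the connection in `OracleVS`, and run a cosine similarity search. All connection values, the table name, and the compartment OCID are placeholders.

```python
import oracledb
from langchain_community.embeddings import OCIGenAIEmbeddings
from langchain_community.vectorstores.oraclevs import OracleVS
from langchain_community.vectorstores.utils import DistanceStrategy

# wallet-based connection; every value here is a placeholder
conn = oracledb.connect(
    user="vector_user",
    password="db_password",
    dsn="mydb_high",
    config_dir="/path/to/wallet",
    wallet_location="/path/to/wallet",
    wallet_password="wallet_password",
)

# the embedding model must match the one used to populate the table
embed_model = OCIGenAIEmbeddings(
    auth_type="API_KEY",
    model_id="cohere.embed-english-v3.0",
    service_endpoint="https://inference.generativeai.eu-frankfurt-1.oci.oraclecloud.com",
    compartment_id="ocid1.compartment.oc1..example",
)

v_store = OracleVS(
    client=conn,
    table_name="MY_DOCS",
    distance_strategy=DistanceStrategy.COSINE,
    embedding_function=embed_model,
)

# top-4 most similar chunks, as in search_documents()
docs = v_store.similarity_search(query="What is the Universal Permissive License?", k=4)
for doc in docs:
    print(doc.page_content[:80])
```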