# MCP Oracle OCI integrations

This repository contains code and examples to help with the following tasks:

* **Develop** MCP servers in **Python**
* **Run** MCP servers on **Oracle OCI**
* **Integrate** MCP servers with **AI Agents**
* **Integrate** MCP servers with other **OCI resources** (ADB, Select AI, ...)
* **Integrate** MCP servers running on OCI with AI Assistants such as **ChatGPT**, Claude.ai, and MS Copilot
* **Integrate** MCP servers with OCI **APM** for **Observability**
## What is MCP?

**MCP (Model Context Protocol)** is an **open standard** that lets AI models (e.g., LLMs or agents) connect bidirectionally with external tools, data sources, and services through a unified interface.

It replaces the "N×M" integration problem (where each AI × data-source pair requires custom code) with one standard protocol.

MCP supports **dynamic discovery** of available tools and context, enabling:

* AI Assistants to access relevant information available in an enterprise knowledge base.
* Agents to reason and chain actions across disparate systems.

It is quickly gaining traction: major players such as OpenAI, Google DeepMind, and Oracle are adopting it to make AI systems more composable and interoperable.

In today's landscape of agentic AI, MCP is critical because it allows models to act meaningfully on real-world systems rather than remaining isolated black boxes.

## Develop MCP Servers in Python

The easiest way is to use the [FastMCP](https://gofastmcp.com/getting-started/welcome) library.

**Examples**:

* In [Minimal MCP Server](./minimal_mcp_server.py) you'll find a **good, minimal example** of a server exposing two tools, with the option to protect them using [JWT](https://www.jwt.io/introduction#what-is-json-web-token).

If you want to start with **something simpler**, have a look at [how to start developing MCP](./how_to_start_mcp.md). It is a simpler example, with no support for JWT tokens.

## How to test

If you want to quickly test the MCP server you developed (or the minimal example provided here), you can use the [Streamlit UI](./ui_mcp_agent.py).

In the Streamlit application, you can:

* Specify the URL of the MCP server (the default is in [mcp_servers_config.py](./mcp_servers_config.py))
* Select one of the models available in OCI Generative AI
* Ask questions that are answered using the tools exposed by the MCP server.

In [llm_with_mcp.py](./llm_with_mcp.py) you'll find the complete implementation of the **tool-calling** loop.

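The essence of a tool-calling loop can be sketched as follows. This is a simplified, self-contained sketch, not the repository's actual code: `fake_llm` stands in for the real model call, and the tool registry stands in for the MCP client:

```python
# Sketch of a generic tool-calling loop (illustrative; see llm_with_mcp.py
# for the real implementation, which talks to OCI Generative AI and MCP).

TOOLS = {
    "add": lambda args: args["a"] + args["b"],  # stand-in for an MCP tool call
}

def fake_llm(messages):
    """Stub model: requests the 'add' tool once, then produces a final answer."""
    if not any(m["role"] == "tool" for m in messages):
        return {"tool_calls": [{"name": "add", "args": {"a": 2, "b": 3}}]}
    result = [m for m in messages if m["role"] == "tool"][-1]["content"]
    return {"content": f"The answer is {result}"}

def run_agent(question, llm=fake_llm, max_turns=5):
    messages = [{"role": "user", "content": question}]
    for _ in range(max_turns):
        reply = llm(messages)
        if "tool_calls" not in reply:      # model produced a final answer
            return reply["content"]
        for call in reply["tool_calls"]:   # execute each requested tool
            result = TOOLS[call["name"]](call["args"])
            messages.append({"role": "tool", "content": result})
    raise RuntimeError("too many turns without a final answer")
```

The loop alternates between asking the model for the next step and executing the tools it requests, until the model returns a plain answer instead of tool calls.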
## Semantic Search

This repository contains a **complete implementation of an MCP server** providing **Semantic Search** on top of **Oracle 23AI**.

To use it, you only need to:

* Load the documents into the Oracle DB
* Put the right configuration, to connect to the DB, in config_private.py.

The code is available [here](./mcp_semantic_search_with_iam.py).

Access to Oracle 23AI Vector Search goes through the **new** [langchain-oci integration library](https://github.com/oracle/langchain-oracle).

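As a hint of what the connection settings in config_private.py could look like, here is a hypothetical fragment; the variable names are assumptions, so check the repository for the actual expected names:

```python
# Hypothetical config_private.py sketch: variable names are illustrative,
# check the repository for the names it actually expects.
# Keep this file out of version control; never commit real secrets.
DB_USER = "vector_app"
DB_PWD = "<your-db-password>"
# Typical ADB service name (from the wallet's tnsnames.ora):
DSN = "myadb_high"
WALLET_DIR = "/path/to/wallet"
WALLET_PWD = "<your-wallet-password>"
```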
## Adding security

If you want to put your **MCP** server in production, you need to add security at several levels.

Just to mention a few important points:

* You don't want to expose the MCP server directly on the Internet
* Communication with the MCP server must be encrypted (i.e., using TLS)
* You want to authenticate and authorize the clients

Using **OCI services**, there are several things you can do to get the right level of security:

* You can put an **OCI API Gateway** in front, using it as the TLS termination
* You can enable authentication using **JWT** tokens
* You can use **OCI IAM** to generate **JWT** tokens
* You can use OCI network security

More details in a dedicated page.

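To illustrate the JWT step above, the snippet below validates a token's signature, issuer, audience, and expiry with the PyJWT library. It uses an HS256 shared secret purely for the sketch; with OCI IAM you would instead verify RS256 tokens against the IAM domain's JWKS endpoint. Issuer, audience, and secret values are all illustrative:

```python
# Illustrative JWT validation with PyJWT (HS256 shared secret for the sketch;
# an OCI IAM setup would verify RS256 signatures via the domain's JWKS endpoint).
import time
import jwt  # PyJWT

SECRET = "demo-secret"                                    # illustrative only
ISSUER = "https://idcs-example.identity.oraclecloud.com"  # hypothetical issuer
AUDIENCE = "mcp-server"

def issue_demo_token() -> str:
    """Create a short-lived token, as an identity provider would."""
    claims = {
        "sub": "client-app-1",
        "iss": ISSUER,
        "aud": AUDIENCE,
        "exp": int(time.time()) + 300,
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")

def validate_token(token: str) -> dict:
    """Raise if signature, issuer, audience, or expiry is wrong; return claims."""
    return jwt.decode(
        token, SECRET, algorithms=["HS256"], issuer=ISSUER, audience=AUDIENCE
    )

claims = validate_token(issue_demo_token())
```

The MCP server would run `validate_token` on the bearer token of every incoming request, rejecting requests whose claims do not check out.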
## Integrate MCP Semantic Search with ChatGPT

If you deploy the [MCP Semantic Search](./mcp_semantic_search_with_iam.py) server, you can test the integration with **ChatGPT** in **Developer Mode**. It provides a **search** tool, compliant with the **OpenAI** specs.

Soon, we'll add a server fully compliant with the **OpenAI** specifications, which can be integrated into **Deep Research**. That server must implement two tools (**search** and **fetch**) with a different behaviour, strictly following the OpenAI specs.

An initial implementation is available [here](./mcp_deep_research_with_iam.py).

Details are available [here](./integrate_chatgpt.md).

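To illustrate how the **search**/**fetch** pair interact, here is a plain-Python sketch over an in-memory corpus. The result fields (id, title, url, text) follow the pattern OpenAI documents for Deep Research connectors, but treat the exact schema as an assumption and check the current OpenAI specs:

```python
# Sketch of the search/fetch pair over an in-memory corpus (illustrative;
# the real server queries Oracle 23AI and exposes these as MCP tools).
DOCS = {
    "doc-1": {"title": "OCI API Gateway", "text": "TLS termination and auth...",
              "url": "https://example.com/doc-1"},
    "doc-2": {"title": "Select AI", "text": "Text2SQL over ADB...",
              "url": "https://example.com/doc-2"},
}

def search(query: str) -> dict:
    """Return lightweight hits; the assistant later calls fetch() on the ids."""
    hits = [
        {"id": doc_id, "title": d["title"], "url": d["url"]}
        for doc_id, d in DOCS.items()
        if query.lower() in (d["title"] + " " + d["text"]).lower()
    ]
    return {"results": hits}

def fetch(doc_id: str) -> dict:
    """Return the full document for one id returned by search()."""
    d = DOCS[doc_id]
    return {"id": doc_id, "title": d["title"], "text": d["text"], "url": d["url"]}
```

The split matters because Deep Research first scans many lightweight search results, then fetches the full text only for the documents it decides are relevant.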
## Integrate OCI ADB Select AI

Another option is to use an MCP server to integrate OCI **Select AI** into ChatGPT or other assistants supporting MCP.
This way you get full **Text2SQL** search over your database schema. Then, the AI assistant can process the retrieved data.

An example is available [here](./mcp_selectai.py).

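For context on what such a server does under the hood: Select AI is driven with SQL statements of the form `SELECT AI <action> <prompt>`. The helper below just composes that statement; executing it would need an ADB connection (e.g., via python-oracledb), which is omitted here. The action names come from Oracle's Select AI documentation; everything else is illustrative:

```python
# Illustrative helper that builds a Select AI statement for ADB.
# Executing it would require a python-oracledb connection (not shown).
VALID_ACTIONS = {"runsql", "showsql", "explainsql", "narrate", "chat"}

def build_select_ai(prompt: str, action: str = "runsql") -> str:
    """Compose a 'SELECT AI <action> <prompt>' statement."""
    if action not in VALID_ACTIONS:
        raise ValueError(f"unknown Select AI action: {action}")
    return f"SELECT AI {action} {prompt}"
```

For example, `build_select_ai("how many customers exist", "showsql")` produces a statement asking Select AI to show the generated SQL instead of running it.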