This repository contains code and examples to help in the following tasks:
- Develop MCP servers in Python
- Run MCP servers on Oracle OCI
- Integrate MCP servers with AI Agents
- Integrate MCP servers with other OCI resources (ADB, Select AI, ...)
- Integrate MCP Servers running on OCI with AI Assistants like ChatGPT, Claude.ai, MS Copilot
- Integrate MCP Servers with OCI APM for Observability
- Create a Docker image for your MCP server
Author: L. Saetta
Reviewed: 27.10.2025
MCP (Model Context Protocol) is an open-source standard that lets AI models (e.g. LLMs or agents) connect bidirectionally with external tools, data sources, and services via a unified interface.
It replaces the “N×M” integration problem (where each AI × data source requires custom code) with one standard protocol.
MCP supports dynamic discovery of available tools and context, enabling:
- AI Assistants to access relevant information available in an Enterprise Knowledge Base.
- Agents to reason and chain actions across disparate systems.
It’s quickly gaining traction: major players like OpenAI, Google DeepMind, and Oracle are adopting it to make AI systems more composable and interoperable.
In today’s landscape of agentic AI, MCP is critical because it allows models to act meaningfully in real-world systems rather than remaining isolated black boxes.
The easiest way to develop an MCP server in Python is to use the FastMCP library.
Examples:
- in Minimal MCP Server you'll find a good, minimal example of a server exposing two tools, with the option to protect them using JWT.
If you want to start with something even simpler, have a look at how to start developing MCP; it omits support for JWT tokens.
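As an illustration, a minimal FastMCP server can look like the sketch below. The server name, tools, and port are illustrative (not the ones used in this repository), and JWT protection is left out; check the FastMCP documentation for the exact transport options of the version you install.

```python
from fastmcp import FastMCP

# Illustrative server name and tools; see the Minimal MCP Server
# example in this repository for the real ones (including JWT).
mcp = FastMCP("demo-server")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers."""
    return a + b

@mcp.tool()
def greet(name: str) -> str:
    """Return a greeting for the given name."""
    return f"Hello, {name}!"

if __name__ == "__main__":
    # HTTP transport, so remote MCP clients can connect.
    mcp.run(transport="http", host="0.0.0.0", port=8000)
```

Running this script starts an MCP server that any MCP-capable client can connect to over HTTP and call the two tools.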
If you want to quickly test the MCP server you developed (or the minimal example provided here), you can use the Streamlit UI.
In the Streamlit application, you can:
- Specify the URL of the MCP server (default is in mcp_servers_config.py)
- Select one of the models available in OCI Generative AI
- Ask questions that are answered using the tools exposed by the MCP server.
In llm_with_mcp.py there is the complete implementation of the tool-calling loop.
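The core of such a tool-calling loop can be sketched in plain Python. The stub model and tool registry below are illustrative stand-ins for the OCI Generative AI model and the MCP client actually used in llm_with_mcp.py:

```python
import json

# Illustrative tool registry; in the real code, tools are discovered
# from the MCP server and invoked over the MCP protocol.
TOOLS = {
    "add": lambda args: args["a"] + args["b"],
}

def tool_calling_loop(model, user_question, max_turns=5):
    """Keep calling the model until it returns a final answer instead of a tool call."""
    messages = [{"role": "user", "content": user_question}]
    for _ in range(max_turns):
        reply = model(messages)
        if "tool_call" not in reply:
            return reply["content"]  # final answer
        call = reply["tool_call"]
        result = TOOLS[call["name"]](call["arguments"])
        # Feed the tool result back to the model and iterate.
        messages.append({"role": "assistant", "tool_call": call})
        messages.append({"role": "tool", "content": json.dumps(result)})
    raise RuntimeError("no final answer within max_turns")

# Stub model: requests the 'add' tool once, then answers with its result.
def stub_model(messages):
    if messages[-1]["role"] == "tool":
        return {"content": f"The sum is {json.loads(messages[-1]['content'])}."}
    return {"tool_call": {"name": "add", "arguments": {"a": 2, "b": 3}}}

print(tool_calling_loop(stub_model, "What is 2 + 3?"))  # The sum is 5.
```

The loop terminates as soon as the model produces a plain answer; `max_turns` bounds the number of tool round-trips.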
In this repository there is a complete implementation of an MCP server providing Semantic Search on top of Oracle 23AI. To use it, you only need to:
- Load the documents into the Oracle DB
- Set the configuration needed to connect to the DB in config_private.py.
The code is available here.
Access to Oracle 23AI Vector Search goes through the new langchain-oci integration library.
If you want to put your MCP server in production, you need to add security, at several levels.
Just to mention a few important points:
- You don't want to expose the MCP server directly over the Internet
- The communication with the MCP server must be encrypted (e.g., using TLS)
- You want to authenticate and authorize the clients
Using OCI services, there are several things you can do to achieve the right level of security:
- You can put an OCI API Gateway in front, using it as the TLS termination point
- You can enable authentication using JWT tokens
- You can use OCI IAM to generate JWT tokens
- You can use OCI network security
More details in a dedicated page.
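To make concrete what JWT validation involves, here is a standard-library sketch of HS256 signature and expiry checking. It is for illustration only: in production you would use a proper JWT library and the keys, issuer, and audience configured in OCI IAM, and typically let the API Gateway do this check for you.

```python
import base64
import hashlib
import hmac
import json
import time

def _b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def _b64url_decode(s: str) -> bytes:
    return base64.urlsafe_b64decode(s + "=" * (-len(s) % 4))

def make_jwt(payload: dict, secret: str) -> str:
    """Create an HS256-signed token (only to drive the demo below)."""
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    sig = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{_b64url(sig)}"

def verify_jwt(token: str, secret: str) -> dict:
    """Check signature and expiry; raise on failure, return claims on success."""
    header, body, sig = token.split(".")
    expected = hmac.new(secret.encode(), f"{header}.{body}".encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(expected, _b64url_decode(sig)):
        raise ValueError("invalid signature")
    claims = json.loads(_b64url_decode(body))
    if claims.get("exp", 0) < time.time():
        raise ValueError("token expired")
    return claims

token = make_jwt({"sub": "client-1", "exp": time.time() + 3600}, "my-secret")
print(verify_jwt(token, "my-secret")["sub"])  # client-1
```

A token signed with the wrong secret, or with an `exp` claim in the past, is rejected before any tool is invoked.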
If you deploy the MCP Semantic Search server you can test the integration with ChatGPT in Developer Mode. It provides a search tool, compliant with OpenAI specs.
Soon, we'll add a server fully compliant with the OpenAI specifications, so that it can be integrated in Deep Research. The server must implement two methods (search and fetch) with different behaviour, strictly following the OpenAI specs.
An initial implementation is available here
Details available here
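As a rough sketch of the search/fetch contract: search returns lightweight results and fetch returns the full document for one result id. The in-memory corpus and the exact result fields below are illustrative assumptions; always check the current OpenAI MCP documentation for the required schema.

```python
# Hypothetical in-memory corpus standing in for Oracle 23AI Vector Search.
DOCS = {
    "doc-1": {"title": "MCP on OCI", "text": "Running MCP servers behind OCI API Gateway.",
              "url": "https://example.com/doc-1"},
    "doc-2": {"title": "Select AI", "text": "Text2SQL over your database schema.",
              "url": "https://example.com/doc-2"},
}

def search(query: str) -> dict:
    """Return lightweight results (id/title/url) matching the query."""
    hits = [{"id": doc_id, "title": d["title"], "url": d["url"]}
            for doc_id, d in DOCS.items() if query.lower() in d["text"].lower()]
    return {"results": hits}

def fetch(doc_id: str) -> dict:
    """Return the full document for one result id."""
    d = DOCS[doc_id]
    return {"id": doc_id, "title": d["title"], "text": d["text"], "url": d["url"]}

ids = [r["id"] for r in search("gateway")["results"]]
print(ids)                     # ['doc-1']
print(fetch(ids[0])["title"])  # MCP on OCI
```

The split matters because Deep Research first calls search to shortlist candidates, then calls fetch only on the documents it decides to read in full.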
Another option is to use an MCP server to integrate OCI Select AI with ChatGPT or other assistants supporting MCP. This gives you full Text2SQL search over your database schema; the AI assistant can then process the retrieved data.
An example is here
For Select AI configuration, see here
Another use case demonstrated in this set of demos is leveraging an AI Assistant powered by MCP servers to analyze your OCI tenancy's consumption in a natural, interactive way.
Using the MCP Consumption Server, you can explore various dimensions of consumption and ask questions such as:
- List the top 10 services by total amount for a given period (start_date, end_date).
- List the top 10 compartments by total consumption.
- For a specific service (or list of services), show the consumption breakdown across the top 5 compartments.
The key advantage of this approach is that you don’t need to export or replicate data into a Data Warehouse (DWH) — all information is retrieved directly from the OCI Usage API in real time.
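The kind of aggregation behind these questions can be sketched over a few illustrative usage records. In the real server, the rows come live from the OCI Usage API rather than a local list:

```python
from collections import defaultdict

# Illustrative records; the real server retrieves these from the OCI Usage API.
usage = [
    {"service": "Compute",        "compartment": "dev",  "amount": 120.0},
    {"service": "Compute",        "compartment": "prod", "amount": 480.0},
    {"service": "Object Storage", "compartment": "prod", "amount": 90.0},
    {"service": "Database",       "compartment": "prod", "amount": 300.0},
]

def top_by(records, key, n=10):
    """Top-n values of `key` by total amount (e.g. top services or compartments)."""
    totals = defaultdict(float)
    for r in records:
        totals[r[key]] += r["amount"]
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

print(top_by(usage, "service", n=3))
# [('Compute', 600.0), ('Database', 300.0), ('Object Storage', 90.0)]
print(top_by(usage, "compartment", n=2))
# [('prod', 870.0), ('dev', 120.0)]
```

The same helper answers both the "top services" and "top compartments" questions; filtering `records` by service first gives the per-service compartment breakdown.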
How to Use:
- Configure your OCI credentials.
- Start the MCP Consumption Server.
- Launch the AI Assistant.
- Point the Assistant to the MCP URL (or to your MCP Aggregator).
For more information, see here
