2 changes: 1 addition & 1 deletion docs/decisions/0003-xapi-event-tracking.rst
@@ -16,7 +16,7 @@ Different AI workflows exhibit different interaction patterns (one-shot vs. conv
Decision
********

-AI workflow executions will emit xAPI-compliant events using Open edX’s existing eventtracking and event-routing-backends infrastructure.
+AI workflow executions will emit xAPI-compliant events using the Open edX platform’s existing eventtracking and event-routing-backends infrastructure.

* Events are emitted via `eventtracking.tracker.emit()` and transformed to xAPI by ERB when Aspects is installed.
* Direct dependencies on `eventtracking` and `event-routing-backends` are used (no `edxapp_wrapper` abstraction).
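Note: the first bullet above references ``eventtracking.tracker.emit()``. For orientation only, the sketch below shows the general shape of such a call; the event name and payload are hypothetical placeholders, not the plugin's actual event schema (which is not visible in this hunk).

.. code-block:: python

   # Illustrative sketch -- the event name and payload are hypothetical.
   from eventtracking import tracker

   def emit_workflow_completed(workflow_id, user_id):
       """Emit a tracking event; event-routing-backends can transform it to xAPI."""
       tracker.emit(
           "openedx_ai_extensions.workflow.completed",  # hypothetical event name
           {"workflow_id": workflow_id, "user_id": user_id},
       )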
12 changes: 6 additions & 6 deletions docs/how-tos/mcp_example_server.rst
@@ -131,7 +131,7 @@ Create a file named ``run_server.py``:
Step 3: Expose the Server with ngrok
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

-Since MCP clients (including OpenEdx AI Extensions) need to access the server via a public URL, you must expose your local server using ngrok:
+Since MCP clients (including Open edX AI Extensions) need to access the server via a public URL, you must expose your local server using ngrok:

.. code-block:: bash

@@ -157,7 +157,7 @@ Create a file named ``client_example.py`` to test your server:
Example MCP client using LiteLLM

This demonstrates how to connect to an MCP server and use its tools
-from a language model. This is similar to how OpenEdx AI Extensions
+from a language model. This is similar to how Open edX AI Extensions
will interact with your MCP servers.
"""
import asyncio
@@ -224,10 +224,10 @@ Follow these steps to test the complete MCP workflow:

python client_example.py
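Note: the full ``client_example.py`` described in this guide (which uses LiteLLM) is collapsed out of the diff above. Purely for orientation, a minimal client that exercises the same kind of server could look like the sketch below; it assumes the official ``mcp`` Python SDK and an SSE endpoint at ``/sse`` (that path is an assumption), and the URL and tool name are placeholders.

.. code-block:: python

   # Minimal MCP client sketch (not the LiteLLM-based client_example.py from this guide).
   import asyncio

   from mcp import ClientSession
   from mcp.client.sse import sse_client

   async def main():
       url = "https://YOUR-NGROK-ID.ngrok.app/sse"  # placeholder public URL
       async with sse_client(url) as (read_stream, write_stream):
           async with ClientSession(read_stream, write_stream) as session:
               await session.initialize()
               tools = await session.list_tools()
               print("Available tools:", [tool.name for tool in tools.tools])
               # "echo" is a placeholder; call a tool your server actually exposes.
               result = await session.call_tool("echo", arguments={"text": "hello"})
               print("Result:", result.content)

   asyncio.run(main())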

-Testing with OpenEdx AI Extensions
-~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+Testing with Open edX AI Extensions
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

-To integrate this example server with OpenEdx AI Extensions:
+To integrate this example server with Open edX AI Extensions:

1. **Configure the MCP server** in your Django settings:
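Note: the settings snippet for this step is collapsed out of the diff, and the real setting name and schema live in the ``mcp_integration`` guide rather than here. The fragment below only illustrates the general pattern; ``MCP_SERVERS`` and its keys are hypothetical names, not the plugin's documented configuration.

.. code-block:: python

   # settings.py -- illustrative only; consult the MCP integration guide
   # for the plugin's actual setting name and schema.
   MCP_SERVERS = {  # hypothetical setting name
       "example_server": {
           "url": "https://YOUR-NGROK-ID.ngrok.app/mcp",  # public ngrok URL
           "transport": "streamable-http",                # hypothetical key/value
       },
   }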

@@ -317,7 +317,7 @@ Remember:
Further Reading
---------------

-- :doc:`mcp_integration` - Main MCP integration guide for OpenEdx AI Extensions
+- :doc:`mcp_integration` - Main MCP integration guide for Open edX AI Extensions
- `FastMCP Documentation <https://github.com/jlowin/fastmcp>`_ - Complete FastMCP framework documentation
- `MCP Specification <https://modelcontextprotocol.io/>`_ - Official Model Context Protocol specification

8 changes: 4 additions & 4 deletions docs/how-tos/mcp_integration.rst
@@ -2,7 +2,7 @@ Model Context Protocol (MCP) Integration
=========================================

.. warning::
-**Important: OpenEdx AI Extensions acts as an MCP CLIENT only**
+**Important: Open edX AI Extensions acts as an MCP CLIENT only**

This application **does not run or host MCP servers**. It only connects to external MCP servers as a client.

@@ -13,13 +13,13 @@ Model Context Protocol (MCP) Integration
Overview
--------

-The Model Context Protocol (MCP) is an open standard that enables AI assistants to securely interact with external tools and data sources. OpenEdx AI Extensions integrates with MCP by acting as a **client** that can connect to external MCP servers, allowing your AI workflows to leverage custom tools and capabilities.
+The Model Context Protocol (MCP) is an open standard that enables AI assistants to securely interact with external tools and data sources. Open edX AI Extensions integrates with MCP by acting as a **client** that can connect to external MCP servers, allowing your AI workflows to leverage custom tools and capabilities.

Key Concepts
~~~~~~~~~~~~

- **MCP Server**: An external service that exposes tools and resources via the MCP protocol. You must deploy and manage these servers independently.
-- **MCP Client**: OpenEdx AI Extensions acts as a client, connecting to your MCP servers and making their tools available to AI workflows.
+- **MCP Client**: Open edX AI Extensions acts as a client, connecting to your MCP servers and making their tools available to AI workflows.
- **MCP Tools**: Functions exposed by MCP servers that the AI model can call to perform specific operations (e.g., data retrieval, computations, integrations).
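Note: to make the **MCP Tools** bullet concrete, the sketch below defines a single tool on an external FastMCP server (the framework used in the example-server how-to). The tool itself is a placeholder, and the server runs on your own infrastructure; this plugin only connects to it as a client.

.. code-block:: python

   # Runs on your own infrastructure, not inside Open edX AI Extensions.
   from fastmcp import FastMCP

   mcp = FastMCP("example-tools")  # placeholder server name

   @mcp.tool()
   def word_count(text: str) -> int:
       """Count the words in a piece of text (placeholder tool)."""
       return len(text.split())

   if __name__ == "__main__":
       mcp.run()  # defaults to stdio; see the FastMCP docs for HTTP transports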

Architecture
@@ -28,7 +28,7 @@ Architecture
.. code-block:: text

┌────────────────────────────────────────────┐
-│ OpenEdX AI Extensions │
+│ Open edX AI Extensions │
│ (MCP Client / Orchestrator) │
│ │
│ ┌──────────────────────┐ │
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -6,7 +6,7 @@
openedx_ai_extensions
=====================

-A experimental plugin for Open edX designed to explore AI extensibility
+An experimental plugin for the Open edX platform designed to explore AI extensibility

Contents:
