135 changes: 135 additions & 0 deletions backend/openedx_ai_extensions/mcp/README.md
@@ -0,0 +1,135 @@
# Model Context Protocol (MCP) - Proof of Concept
> **Review comment** (Member): best to write this as rst because that is what we use for read the docs

This directory contains a **proof of concept (MVP)** demonstrating how the Model Context Protocol (MCP) works. This is **NOT** the full OpenEdx AI Extensions integration yet, but rather a working example for understanding MCP concepts and testing the infrastructure.

## Overview

This MVP demonstrates the MCP architecture and workflow using a simple dice-rolling example. The implementation consists of three main components:

1. **`server.py`** - A FastMCP server with an example tool (`roll_dice`)
2. **`run_server.py`** - Script to run the MCP server in HTTP mode
3. **`client_example.py`** - Example client showing how to interact with the MCP server

**Note:** The actual OpenEdx-specific tools and integration will be implemented in future iterations. This MVP focuses on validating the MCP infrastructure and communication patterns.
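
For orientation, the files added in this directory are:

```
mcp/
├── __init__.py
├── server.py          # FastMCP server with the example tool
├── run_server.py      # Entry point (streamable HTTP transport)
├── client_example.py  # LiteLLM client example
├── private.pem        # RSA key pair used for JWT auth (PoC only)
└── public.pem
```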

## Architecture

The current implementation uses:
- **FastMCP** - A framework for building MCP servers
- **Streamable HTTP** transport - Allows the server to be exposed via HTTP
- **LiteLLM** - For integrating the MCP server with language models
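
Roughly, the pieces fit together like this (the hosted model calls back into the locally running server through the ngrok tunnel described below):

```
client_example.py ──(LiteLLM responses API)──▶ hosted model provider
                                                      │
                                     MCP tool calls over HTTPS
                                                      │
                                                      ▼
local server (127.0.0.1:9001) ◀── ngrok tunnel ── public ngrok URL
```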

### Example Tool: `roll_dice`

The server currently implements a simple example tool that rolls dice. **This is a demonstration tool only**, included to show how MCP tools work; the same pattern will later be extended to OpenEdx-specific operations such as course management, user administration, and content creation.
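
For reference, the tool is declared in `server.py` with FastMCP's decorator API; stripped of the JWT auth and middleware wiring in the committed version, its core looks like this:

```python
import random

from fastmcp import FastMCP

mcp = FastMCP(name="dice_server", port=9001)


@mcp.tool()
def roll_dice(n_dice: int) -> list[int]:
    """Roll `n_dice` 6-sided dice and return the results."""
    return [random.randint(1, 6) for _ in range(n_dice)]
```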

## Setup

### Prerequisites

Install the required dependencies:

```bash
pip install fastmcp litellm openai
```
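
To confirm the installs worked, a quick import check (using the standard library, so it works regardless of package internals):

```bash
python -c "import importlib.metadata as m; print(m.version('fastmcp'), m.version('litellm'))"
```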

### Environment Variables

Set your OpenAI API key:

```bash
export OPENAI_API_KEY="your_openai_api_key_here"
```

Or update it directly in `client_example.py`.
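
The client and server also read an RSA key pair (`private.pem` and `public.pem`) from the working directory to sign and verify JWT bearer tokens. A pair is committed with this PoC; to generate your own, standard OpenSSL commands work:

```bash
openssl genpkey -algorithm RSA -pkeyopt rsa_keygen_bits:2048 -out private.pem
openssl rsa -in private.pem -pubout -out public.pem
```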

## Running the Server

### Step 1: Start the MCP Server

Run the server locally on port 9001:

```bash
python run_server.py
```

The server will start and listen on `http://127.0.0.1:9001/mcp`.
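
As an optional sanity check, you can hit the endpoint with curl; because the server requires a bearer token, an HTTP error response still confirms it is up, whereas a connection error means it is not:

```bash
# Any HTTP response (e.g. 401, since auth is required) means the server is up
curl -i http://127.0.0.1:9001/mcp
```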

### Step 2: Expose the Server with ngrok

With the `"type": "mcp"` tool in the responses API, it is the model provider's servers (not your local machine) that connect to the MCP server, so the server must be reachable from the public internet. Expose your local server using ngrok:

```bash
# Install ngrok if you haven't already
# Visit https://ngrok.com/ to download and set up

# Expose port 9001
ngrok http 9001
```

ngrok will provide you with a public URL like:
```
https://abc123.ngrok-free.app
```

**Important**: Copy the ngrok URL (including the subdomain) as you'll need it for the client configuration.

### Step 3: Update the Client Configuration

Edit `client_example.py` and update the `server_url` with your ngrok URL:

```python
tools=[
{
"type": "mcp",
"server_label": "dice_server",
"server_url": "https://<your_ngrok_subdomain>.ngrok-free.app/mcp/",
"require_approval": "never",
},
],
```

Replace `<your_ngrok_subdomain>` with the subdomain from your actual ngrok URL (e.g., `abc123`, giving `https://abc123.ngrok-free.app/mcp/`).
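
Note that `client_example.py` as committed also passes a JWT in an `authorization` field alongside the fields above, because the server validates bearer tokens. The token is created from the key pair in this directory, and its issuer must match the issuer configured in `server.py`:

```python
token = key_pair.create_token(
    subject="<YOUR_TEST_USER_ID>",  # the server reads this back as client_id
    issuer="https://<your_ngrok_subdomain>.ngrok-free.app",  # must match server.py
    audience="dice_server",
    scopes=["read", "write"],
)
```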

### Step 4: Run the Client Example

In a new terminal (while the server and ngrok are still running):

```bash
python client_example.py
```

## Testing Workflow

Here's the complete workflow for testing:

1. **Terminal 1** - Start the MCP server:

   ```bash
   cd backend/openedx_ai_extensions/mcp
   python run_server.py
   ```

2. **Terminal 2** - Expose with ngrok:

   ```bash
   ngrok http 9001
   ```

   Copy the ngrok URL from the output.

3. **Terminal 3** - Run the client:

   ```bash
   # Update client_example.py with your ngrok URL first
   python client_example.py
   ```

## Expected Output

When running the client, you should see:

1. List of available tools from the MCP server
2. The AI model response after using the `roll_dice` tool

Example:
```
Available resources: ['roll_dice']
Response: <LiteLLM response object with dice roll results>
```
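
The exact shape of the response object depends on your LiteLLM version; as a sketch, the final text can usually be pulled out along these lines (the `output_text` accessor follows the OpenAI responses shape and is an assumption here, so verify it against your installed version):

```python
# Hypothetical accessor mirroring the OpenAI responses API; check your
# LiteLLM version before relying on this field name.
print(response.output_text)
```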
3 changes: 3 additions & 0 deletions backend/openedx_ai_extensions/mcp/__init__.py
@@ -0,0 +1,3 @@
"""
Model Context Protocol (MCP) integration for OpenEdx AI Extensions.
"""
57 changes: 57 additions & 0 deletions backend/openedx_ai_extensions/mcp/client_example.py
@@ -0,0 +1,57 @@
"""
Example client showing how to call the MCP server through LiteLLM's
responses API, authenticating with a JWT signed by the local key pair.
"""
import os

from fastmcp.server.auth.providers.bearer import RSAKeyPair
from litellm import responses
from pydantic import SecretStr

# Read the RSA key pair used to sign the JWT bearer token
with open("private.pem", "r") as private_key_file:
    private_key_content = private_key_file.read()

with open("public.pem", "r") as public_key_file:
    public_key_content = public_key_file.read()

# Used only if the key is not already set in the environment
os.environ.setdefault("OPENAI_API_KEY", "your_openai_api_key_here")

key_pair = RSAKeyPair(
    private_key=SecretStr(private_key_content),
    public_key=public_key_content,
)


def main():
    # Generate a JWT token; issuer and audience must match server.py
    token = key_pair.create_token(
        subject="<YOUR_TEST_USER_ID>",  # server.py reads this back as client_id
        issuer="https://<your_ngrok_subdomain>.ngrok-free.app",
        audience="dice_server",
        scopes=["read", "write"],
    )

    response = responses(
        model="gpt-4.1-nano",
        tools=[
            {
                "type": "mcp",
                "server_label": "dice_server",
                "server_url": "https://<your_ngrok_subdomain>.ngrok-free.app/mcp/",
                "require_approval": "never",
                "authorization": token,
            },
        ],
        input="Roll a dice for me.",
    )

    print("Response:", response)


if __name__ == "__main__":
    main()
28 changes: 28 additions & 0 deletions backend/openedx_ai_extensions/mcp/private.pem
@@ -0,0 +1,28 @@
-----BEGIN PRIVATE KEY-----
MIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQD1V0eiLh2hTTdU
NBdb8B/w4X2CVyDxeSYX2+Dpr4zQ1O8E+wQZcpw058S6WY34D2D0wq063pUeV3Mc
aw5jM/N2X/zYcc4Tq/0OeC+leErdFlvSLgBpFSsz41ntZmgpxAib86SH0J8oHPbG
3FBZvO5riD5YGl8grui+jGsjkgowA5C9iTXa3K8Spwk0VPaF2tYSTr7fa4h69yAV
Jaq+mRksQZThp89W9m+rdDfJlClOD56UgfWjTJYSztTtB5bYan5kIFaxietbgInS
t7AQ8PRb4+Wgx7ioyBpiLimQsAzmo875pH6C4EMe8+mKAP1xzWUUfb0rDnm1bAaG
P8L3QOn1AgMBAAECggEAPy8BJcVWoCnwq+DWezj2IOURm7WwqD+ZDd/0pnote8K6
4795qMwZao6d3ZgetdnQEHjqgBS+tpey74iqpeXFN6E0ztFQT0Sl4UoWizjVnuaZ
MyHhvS5UeAJ/MGKFROxVg0RWBRw3QO9kpoYqs1Gy3UKzO6FfCl3BVwF2vixoL0Da
tG7AiaWU/lpn9a2nNmUee0jQAE4zAR4Miv2FiCefE8zzVPrX8wm6b6bE+Km/no1V
cDK0A2fBlRUidt7HI49/ocLvWl8CB8FNRDbMu0xUAGNV5kr7TwigUz9hqagFqKVN
2oTqKmXge9y4lhi/u1cTMGToP8KLEWCNegIWISsLKQKBgQD/R2zZaKvVdeZHbzod
/UN9oPPl3TW/wiQvTF0HSYx4uYYG/1MSUOeMUsCstfX+8VLR/XClyBEUJQmfXvRV
QMS/yXtnDooMft8U+oLcITQA2qqIEKC/wvErVZFuGCnuWh2Z1KGnLBM0/Mxz7ud1
7jJ+bNiYHWR59pE9f1kwXoAvlwKBgQD2CKtJW+xAiofQDn3wGngGrtMy59h7pnSD
A+gHFVqpMYqi8BnAmuNlKS2K5znu5oGVsLLntqbJ5t+T1lCvZrT0biI8CBoLOxrn
VmaAuaJTzwOTmJJ02Qn/dm0nFbn5C1d7zFwMCr/oGMrmQkm0WS8cidq1C48UORyh
XsIrCwnkUwKBgQCaV8zwBeEexpHcTtuMljvgERhlukFtFyxZjIoShd1wgHsQb/8B
6/iTVtU3lyyMX8v7OoiJM1VgIKSYvwhrIyXR7ze1L4030N2ACZZlEY4nlg3VBniq
eGroEGxFbEat3b5X679xG6zhNJdI4QEAxGuzFIxALEU7mGBoFj8Oh5RpMQKBgG/N
LrDuUaRejyrPexEhpgs7ZIPMcUZ3NJjYrJaTcJhUB/DU9I7ek5jDpotpWZ0jKB2y
pwm+qXo0LMMMb6vVG0O7zFjFQbh6ylX3oCq8sHQvLSvj+CGbAv0Qfrd1GwZ9zepW
yjk6pUw9/+20j7Ohl1P7nOQKdaE19rmpysgugvc7AoGBAPkkzdvae+RrjZ4vb+Qf
PUzqk4LAhhbIm30fTAAo4Gv2u6DETPWHG3veQdRBY8m+cYUZu/ocqCTdTyX9LaP7
Wb821UQZCi+IDunI5ZRMk6r7yCbwdMJVbkKAIzN+ufKWtkNyq51U7/r7gn47mf+C
w11o8aDLkJBclJTIVIBVkBDC
-----END PRIVATE KEY-----
9 changes: 9 additions & 0 deletions backend/openedx_ai_extensions/mcp/public.pem
@@ -0,0 +1,9 @@
-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA9VdHoi4doU03VDQXW/Af
8OF9glcg8XkmF9vg6a+M0NTvBPsEGXKcNOfEulmN+A9g9MKtOt6VHldzHGsOYzPz
dl/82HHOE6v9DngvpXhK3RZb0i4AaRUrM+NZ7WZoKcQIm/Okh9CfKBz2xtxQWbzu
a4g+WBpfIK7ovoxrI5IKMAOQvYk12tyvEqcJNFT2hdrWEk6+32uIevcgFSWqvpkZ
LEGU4afPVvZvq3Q3yZQpTg+elIH1o0yWEs7U7QeW2Gp+ZCBWsYnrW4CJ0rewEPD0
W+PloMe4qMgaYi4pkLAM5qPO+aR+guBDHvPpigD9cc1lFH29Kw55tWwGhj/C90Dp
9QIDAQAB
-----END PUBLIC KEY-----
10 changes: 10 additions & 0 deletions backend/openedx_ai_extensions/mcp/run_server.py
@@ -0,0 +1,10 @@
#!/usr/bin/env python
"""
Run the FastMCP server with the streamable HTTP transport.
"""
from server import mcp


if __name__ == "__main__":
    mcp.run(transport="streamable-http")
48 changes: 48 additions & 0 deletions backend/openedx_ai_extensions/mcp/server.py
@@ -0,0 +1,48 @@
"""
MCP Server implementation using FastMCP for OpenEdx AI Extensions

This module provides a Model Context Protocol (MCP) server that exposes
tools and resources for AI assistants to interact with the OpenEdx system.
"""
import logging
from typing import Any
from fastmcp import FastMCP, Context
from starlette.responses import PlainTextResponse
from starlette.requests import Request
import random
from fastmcp.server.auth.providers.jwt import StaticTokenVerifier
from fastmcp.server.auth import BearerAuthProvider
from fastmcp.server.auth.providers.jwt import JWTVerifier
from fastmcp.server.dependencies import get_access_token, AccessToken
from fastmcp.server.middleware.logging import LoggingMiddleware
from fastmcp.server.middleware.error_handling import ErrorHandlingMiddleware

logger = logging.getLogger(__name__)

# Read public key for token validation
with open("public.pem", "r") as public_key_file:
public_key_content = public_key_file.read()

# Configure authentication provider
auth = BearerAuthProvider(
public_key=public_key_content,
issuer="https://<YOUR_ISSUER>.ngrok-free.app",
audience="dice_server"
)

mcp = FastMCP(name="dice_server", port=9001, auth=auth)
mcp.add_middleware(LoggingMiddleware())
mcp.add_middleware(ErrorHandlingMiddleware(
include_traceback=True,
transform_errors=True,
))


@mcp.tool()
def roll_dice(n_dice: int, context: Context) -> list[int]:
"""Roll `n_dice` 6-sided dice and return the results."""
auth_user = get_access_token().client_id

if auth_user == "<YOUR_TEST_USER_ID>":
return [6 for _ in range(n_dice)]
return [random.randint(1, 6) for _ in range(n_dice)]