
Commit a377396

docs: add readme
1 parent e1c45e7 commit a377396

1 file changed: 135 additions & 0 deletions
# Model Context Protocol (MCP) - Proof of Concept

This directory contains a **proof of concept (MVP)** demonstrating how Model Context Protocol (MCP) works. This is **NOT** the full OpenEdx AI Extensions integration yet, but rather a working example to understand MCP concepts and test the infrastructure.

## Overview

This MVP demonstrates the MCP architecture and workflow using a simple dice-rolling example. The implementation consists of three main components:

1. **`server.py`** - A FastMCP server with an example tool (`roll_dice`)
2. **`run_server.py`** - Script to run the MCP server in HTTP mode
3. **`client_example.py`** - Example client showing how to interact with the MCP server

**Note:** The actual OpenEdx-specific tools and integration will be implemented in future iterations. This MVP focuses on validating the MCP infrastructure and communication patterns.

## Architecture

The current implementation uses:

- **FastMCP** - A framework for building MCP servers
- **Streamable HTTP** transport - Allows the server to be exposed via HTTP
- **LiteLLM** - For integrating the MCP server with language models

### Example Tool: `roll_dice`

The server currently implements a simple example tool that rolls dice. **This is a demonstration tool only** to show how MCP tools work. This pattern will later be extended to implement OpenEdx-specific operations like course management, user administration, and content creation.
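
As a rough illustration of the pattern, a FastMCP tool definition for `roll_dice` might look like the sketch below. The server name, tool signature, and die size are illustrative assumptions, not necessarily what `server.py` uses:

```python
# Minimal sketch of a FastMCP server exposing a `roll_dice` tool.
# The server name, tool signature, and die size are illustrative assumptions.
import random

from fastmcp import FastMCP

mcp = FastMCP("Dice Server")


@mcp.tool()
def roll_dice(n_dice: int = 1) -> list[int]:
    """Roll `n_dice` six-sided dice and return the individual results."""
    return [random.randint(1, 6) for _ in range(n_dice)]
```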
## Setup

### Prerequisites

Install the required dependencies:

```bash
pip install fastmcp litellm openai
```

### Environment Variables

Set your OpenAI API key:

```bash
export OPENAI_API_KEY="your_openai_api_key_here"
```

Or update it directly in `client_example.py`.
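
If you go the in-code route, a minimal sketch is shown below; the exact variable handling in `client_example.py` may differ:

```python
import os

# Hypothetical inline alternative to `export OPENAI_API_KEY=...`.
# Avoid committing real keys to version control.
os.environ["OPENAI_API_KEY"] = "your_openai_api_key_here"
```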
## Running the Server

### Step 1: Start the MCP Server

Run the server locally on port 9001:

```bash
python run_server.py
```

The server will start and listen on `http://127.0.0.1:9001/mcp`.
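
For reference, `run_server.py` is expected to do little more than start the FastMCP server over the streamable HTTP transport. A minimal sketch, assuming FastMCP's `run()` accepts `transport`, `host`, and `port` keyword arguments:

```python
# Minimal sketch of run_server.py: serve the FastMCP server over streamable HTTP
# at http://127.0.0.1:9001/mcp. The keyword arguments are assumptions.
from server import mcp  # the FastMCP instance defined in server.py

if __name__ == "__main__":
    mcp.run(transport="streamable-http", host="127.0.0.1", port=9001)
```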
57+
58+
### Step 2: Expose the Server with ngrok
59+
60+
Since the MCP protocol requires a publicly accessible endpoint for certain use cases, you need to expose your local server using ngrok:
61+
62+
```bash
63+
# Install ngrok if you haven't already
64+
# Visit https://ngrok.com/ to download and set up
65+
66+
# Expose port 9001
67+
ngrok http 9001
68+
```
69+
70+
ngrok will provide you with a public URL like:
71+
```
72+
https://abc123.ngrok-free.app
73+
```
74+
75+
**Important**: Copy the ngrok URL (including the subdomain) as you'll need it for the client configuration.
### Step 3: Update the Client Configuration

Edit `client_example.py` and update the `server_url` with your ngrok URL:

```python
tools=[
    {
        "type": "mcp",
        "server_label": "dice_server",
        "server_url": "https://<your_ngrok_subdomain>.ngrok-free.app/mcp/",
        "require_approval": "never",
    },
],
```

Replace `<your_ngrok_subdomain>` with your actual ngrok subdomain (e.g., `abc123.ngrok-free.app`).
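
For context, the `tools` entry above is passed to the model call inside `client_example.py`. A hedged sketch of what that call could look like with LiteLLM's Responses API support (the model name and prompt are illustrative, not taken from the script):

```python
# Hypothetical sketch of the model call in client_example.py.
# The model name and prompt are illustrative assumptions.
import litellm

response = litellm.responses(
    model="openai/gpt-4.1",
    input="Roll two dice and tell me the results.",
    tools=[
        {
            "type": "mcp",
            "server_label": "dice_server",
            "server_url": "https://<your_ngrok_subdomain>.ngrok-free.app/mcp/",
            "require_approval": "never",
        },
    ],
)
print("Response:", response)
```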
### Step 4: Run the Client Example

In a new terminal (while the server and ngrok are still running):

```bash
python client_example.py
```

## Testing Workflow

Here's the complete workflow for testing:

1. **Terminal 1** - Start the MCP server:

   ```bash
   cd backend/openedx_ai_extensions/mcp
   python run_server.py
   ```

2. **Terminal 2** - Expose with ngrok:

   ```bash
   ngrok http 9001
   ```

   Copy the ngrok URL from the output.

3. **Terminal 3** - Run the client:

   ```bash
   # Update client_example.py with your ngrok URL first
   python client_example.py
   ```

## Expected Output

When running the client, you should see:

1. List of available tools from the MCP server
2. The AI model response after using the `roll_dice` tool

Example:

```
Available resources: ['roll_dice']
Response: <LiteLLM response object with dice roll results>
```
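
The tool listing in the first line of the example output can also be reproduced by querying the server directly. One hedged way to do that with FastMCP's async client, assuming the server from Step 1 is still running locally:

```python
# Hypothetical sketch: list the tools exposed by the running MCP server
# using fastmcp's async client against the local URL from Step 1.
import asyncio

from fastmcp import Client


async def main() -> None:
    async with Client("http://127.0.0.1:9001/mcp") as client:
        tools = await client.list_tools()
        print("Available resources:", [tool.name for tool in tools])


asyncio.run(main())
```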
