# React Integration Guide
This guide shows how to integrate your React frontend with the MCP server using the HTTP bridge.
## 🚀 Quick Start
### 1. Start the MCP HTTP Bridge
```bash
cd Z:\Code\MCP
python http_bridge.py
```
The HTTP API will be available at: `http://localhost:8000`
### 2. API Endpoints
#### Health Check
```http
GET http://localhost:8000/health
```
#### Chat with LLM
```http
POST http://localhost:8000/api/chat
Content-Type: application/json

{
  "message": "Hello, how are you?",
  "model": "mistral:latest",
  "temperature": 0.7,
  "max_tokens": 1000
}
```
#### Store Memory
```http
POST http://localhost:8000/api/memory/store
Content-Type: application/json

{
  "conversation_id": "chat_123",
  "content": "User asked about AI capabilities",
  "metadata": {"topic": "AI"},
  "role": "user",
  "importance": 1.0
}
```
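The React service later in this guide fills in the same defaults (`role: 'user'`, `importance: 1.0`, empty metadata). A small client-side builder makes those defaults explicit before posting (a sketch — `buildMemoryPayload` is a hypothetical helper; the field names match the request body above):

```javascript
// Build a /api/memory/store request body, applying the endpoint's defaults.
// buildMemoryPayload is a hypothetical helper, not part of the bridge API.
function buildMemoryPayload(conversationId, content, { metadata = {}, role = 'user', importance = 1.0 } = {}) {
  return {
    conversation_id: conversationId,
    content,
    metadata,
    role,
    importance,
  };
}

const payload = buildMemoryPayload('chat_123', 'User asked about AI capabilities', {
  metadata: { topic: 'AI' },
});
console.log(JSON.stringify(payload));
```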
#### Get Memory
```http
POST http://localhost:8000/api/memory/get
Content-Type: application/json

{
  "conversation_id": "chat_123",
  "limit": 10,
  "min_importance": 0.0
}
```
#### Get Available Models
```http
GET http://localhost:8000/api/models
```
#### Search Memory
```http
GET http://localhost:8000/api/memory/search?query=AI&limit=5
```
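Unlike the memory endpoints above, search takes its parameters in the query string rather than a JSON body. `URLSearchParams` handles the encoding (a sketch — `buildSearchQuery` is a hypothetical helper; the parameter names match the endpoint above):

```javascript
// Build the search URL; conversation_id is optional and only appended when set.
// buildSearchQuery is a hypothetical helper, not part of the bridge API.
function buildSearchQuery(query, { conversationId = null, limit = 10 } = {}) {
  const params = new URLSearchParams({ query, limit: String(limit) });
  if (conversationId) params.append('conversation_id', conversationId);
  return `/api/memory/search?${params}`;
}

console.log(buildSearchQuery('AI', { limit: 5 }));
// → /api/memory/search?query=AI&limit=5
```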
## 📱 React Integration
### Install Dependencies
```bash
npm install axios
# or
yarn add axios
```
### API Service (React)
Create `src/services/mcpApi.js`:
```javascript
import axios from 'axios';

const MCP_BASE_URL = 'http://localhost:8000';

class MCPApiService {
  constructor() {
    this.client = axios.create({
      baseURL: MCP_BASE_URL,
      headers: {
        'Content-Type': 'application/json',
      },
    });
  }

  // Health check
  async checkHealth() {
    const response = await this.client.get('/health');
    return response.data;
  }

  // Chat with LLM (?? rather than || so explicit 0 values are respected)
  async chat(message, model = 'mistral:latest', options = {}) {
    const response = await this.client.post('/api/chat', {
      message,
      model,
      temperature: options.temperature ?? 0.7,
      max_tokens: options.max_tokens ?? 1000,
    });
    return response.data;
  }

  // Memory operations
  async storeMemory(conversationId, content, metadata = {}, role = 'user', importance = 1.0) {
    const response = await this.client.post('/api/memory/store', {
      conversation_id: conversationId,
      content,
      metadata,
      role,
      importance,
    });
    return response.data;
  }

  async getMemory(conversationId, limit = null, minImportance = 0.0) {
    const response = await this.client.post('/api/memory/get', {
      conversation_id: conversationId,
      limit,
      min_importance: minImportance,
    });
    return response.data;
  }

  async searchMemory(query, conversationId = null, limit = 10) {
    const params = new URLSearchParams({ query, limit });
    if (conversationId) params.append('conversation_id', conversationId);
    const response = await this.client.get(`/api/memory/search?${params}`);
    return response.data;
  }

  // Get available models
  async getModels() {
    const response = await this.client.get('/api/models');
    return response.data;
  }

  // Echo test
  async echo(text) {
    const response = await this.client.post('/api/echo', { text });
    return response.data;
  }
}

export default new MCPApiService();
```
### React Hook for MCP
Create `src/hooks/useMCP.js`:
```javascript
import { useState, useEffect, useCallback } from 'react';
import mcpApi from '../services/mcpApi';

export const useMCP = () => {
  const [isConnected, setIsConnected] = useState(false);
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState(null);
  // Keyed by provider, e.g. { ollama: [...] }
  const [availableModels, setAvailableModels] = useState({});

  const checkConnection = useCallback(async () => {
    try {
      await mcpApi.checkHealth();
      setIsConnected(true);
      setError(null);
      // Load available models
      const models = await mcpApi.getModels();
      setAvailableModels(models.models);
    } catch (err) {
      setIsConnected(false);
      setError("Cannot connect to MCP server. Make sure it's running on localhost:8000");
    }
  }, []);

  // Check connection on mount
  useEffect(() => {
    checkConnection();
  }, [checkConnection]);

  const chat = useCallback(async (message, model, options) => {
    setIsLoading(true);
    setError(null);
    try {
      return await mcpApi.chat(message, model, options);
    } catch (err) {
      setError(err.response?.data?.detail || 'Chat failed');
      throw err;
    } finally {
      setIsLoading(false);
    }
  }, []);

  // Forward role too, so assistant messages are not stored as 'user'
  const storeMemory = useCallback(async (conversationId, content, metadata, role) => {
    try {
      return await mcpApi.storeMemory(conversationId, content, metadata, role);
    } catch (err) {
      setError(err.response?.data?.detail || 'Memory storage failed');
      throw err;
    }
  }, []);

  const getMemory = useCallback(async (conversationId, limit) => {
    try {
      return await mcpApi.getMemory(conversationId, limit);
    } catch (err) {
      setError(err.response?.data?.detail || 'Memory retrieval failed');
      throw err;
    }
  }, []);

  return {
    isConnected,
    isLoading,
    error,
    availableModels,
    checkConnection,
    chat,
    storeMemory,
    getMemory,
    mcpApi,
  };
};
```
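The bridge used in this guide returns memory records newest-first, which is why the component below reverses them before rendering. That transformation is pure and easy to check in isolation (a sketch — `memoriesToMessages` is a hypothetical helper mirroring the mapping used in the component; the record shape matches the store/get examples above):

```javascript
// Map memory records (assumed newest-first) to chronological chat messages,
// keeping only the fields the UI renders.
function memoriesToMessages(memories) {
  return memories
    .map(({ role, content, timestamp }) => ({ role, content, timestamp }))
    .reverse();
}

const records = [
  { role: 'assistant', content: 'Hi there!', timestamp: '2024-01-01T00:00:02Z', importance: 1.0 },
  { role: 'user', content: 'Hello', timestamp: '2024-01-01T00:00:01Z', importance: 1.0 },
];
console.log(memoriesToMessages(records)[0].content); // → Hello
```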
### Example React Component
Create `src/components/MCPChat.jsx`:
```jsx
import React, { useState, useEffect } from 'react';
import { useMCP } from '../hooks/useMCP';

const MCPChat = () => {
  const { isConnected, isLoading, error, availableModels, chat, storeMemory, getMemory } = useMCP();
  const [message, setMessage] = useState('');
  const [chatHistory, setChatHistory] = useState([]);
  const [selectedModel, setSelectedModel] = useState('mistral:latest');
  const [conversationId] = useState(`chat_${Date.now()}`);

  // Load conversation memory on mount
  useEffect(() => {
    if (isConnected) {
      loadMemory();
    }
  }, [isConnected]);

  const loadMemory = async () => {
    try {
      const memory = await getMemory(conversationId, 20);
      const messages = memory.memories.map(m => ({
        role: m.role,
        content: m.content,
        timestamp: m.timestamp,
      }));
      setChatHistory(messages.reverse());
    } catch (err) {
      console.error('Failed to load memory:', err);
    }
  };

  const sendMessage = async () => {
    if (!message.trim() || isLoading) return;
    const userMessage = { role: 'user', content: message, timestamp: new Date().toISOString() };
    setChatHistory(prev => [...prev, userMessage]);

    // Store user message in memory
    await storeMemory(conversationId, message, { type: 'user_message' }, 'user');

    try {
      const response = await chat(message, selectedModel);
      const assistantMessage = {
        role: 'assistant',
        content: response.response,
        timestamp: new Date().toISOString(),
        model: response.model,
      };
      setChatHistory(prev => [...prev, assistantMessage]);

      // Store assistant response in memory
      await storeMemory(conversationId, response.response, {
        type: 'assistant_response',
        model: response.model,
        provider: response.provider,
      }, 'assistant');
    } catch (err) {
      console.error('Chat failed:', err);
    }
    setMessage('');
  };

  if (!isConnected) {
    return (
      <div className="mcp-error">
        <h3>MCP Server Not Connected</h3>
        <p>{error || 'Make sure the MCP HTTP bridge is running on localhost:8000'}</p>
        <button onClick={() => window.location.reload()}>Retry</button>
      </div>
    );
  }

  return (
    <div className="mcp-chat">
      <div className="chat-header">
        <h2>MCP Chat with Ollama</h2>
        <select
          value={selectedModel}
          onChange={(e) => setSelectedModel(e.target.value)}
          className="model-select"
        >
          {availableModels.ollama?.map(model => (
            <option key={model} value={model}>{model}</option>
          ))}
        </select>
      </div>
      <div className="chat-messages">
        {chatHistory.map((msg, index) => (
          <div key={index} className={`message ${msg.role}`}>
            <div className="message-role">{msg.role}</div>
            <div className="message-content">{msg.content}</div>
            {msg.model && <div className="message-model">Model: {msg.model}</div>}
          </div>
        ))}
      </div>
      <div className="chat-input">
        <input
          type="text"
          value={message}
          onChange={(e) => setMessage(e.target.value)}
          onKeyDown={(e) => e.key === 'Enter' && sendMessage()}
          placeholder="Type your message..."
          disabled={isLoading}
        />
        <button onClick={sendMessage} disabled={isLoading || !message.trim()}>
          {isLoading ? 'Sending...' : 'Send'}
        </button>
      </div>
      {error && <div className="error-message">{error}</div>}
    </div>
  );
};

export default MCPChat;
```
### CSS Styles
Add to your CSS file:
```css
.mcp-chat {
  max-width: 800px;
  margin: 0 auto;
  padding: 20px;
  border: 1px solid #ddd;
  border-radius: 8px;
}

.chat-header {
  display: flex;
  justify-content: space-between;
  align-items: center;
  margin-bottom: 20px;
}

.model-select {
  padding: 5px 10px;
  border: 1px solid #ddd;
  border-radius: 4px;
}

.chat-messages {
  height: 400px;
  overflow-y: auto;
  border: 1px solid #eee;
  padding: 10px;
  margin-bottom: 20px;
  background: #f9f9f9;
}

.message {
  margin-bottom: 15px;
  padding: 10px;
  border-radius: 8px;
}

.message.user {
  background: #e3f2fd;
  margin-left: 50px;
}

.message.assistant {
  background: #f3e5f5;
  margin-right: 50px;
}

.message-role {
  font-weight: bold;
  font-size: 0.9em;
  margin-bottom: 5px;
  text-transform: capitalize;
}

.message-content {
  line-height: 1.4;
}

.message-model {
  font-size: 0.8em;
  color: #666;
  margin-top: 5px;
}

.chat-input {
  display: flex;
  gap: 10px;
}

.chat-input input {
  flex: 1;
  padding: 10px;
  border: 1px solid #ddd;
  border-radius: 4px;
}

.chat-input button {
  padding: 10px 20px;
  background: #007bff;
  color: white;
  border: none;
  border-radius: 4px;
  cursor: pointer;
}

.chat-input button:disabled {
  background: #ccc;
  cursor: not-allowed;
}

.error-message {
  color: #d32f2f;
  background: #ffebee;
  padding: 10px;
  border-radius: 4px;
  margin-top: 10px;
}

.mcp-error {
  text-align: center;
  padding: 40px;
  background: #fff3cd;
  border: 1px solid #ffeaa7;
  border-radius: 8px;
}
```
## 🔄 Development Workflow
### Terminal Setup
1. **Terminal 1** - MCP Server:
```bash
cd Z:\Code\MCP
python http_bridge.py
```
2. **Terminal 2** - React App:
```bash
cd /path/to/your/react/app
npm start
```
3. **Terminal 3** - Ollama (if not running as service):
```bash
ollama serve
```
### Testing the Integration
1. Start the MCP HTTP bridge
2. Navigate to `http://localhost:8000/docs` to see the API documentation
3. Test endpoints with curl or your React app
4. Use the React components to build your chat interface
## 🎯 Usage in Your React App
Simply import and use the MCP components:
```jsx
import MCPChat from './components/MCPChat';
function App() {
return (
<div className="App">
<h1>My App with MCP Integration</h1>
<MCPChat />
</div>
);
}
```
## 🔧 Customization
- Modify the `MCPApiService` to add new endpoints
- Extend the `useMCP` hook for additional functionality
- Create specialized components for different use cases
- Add error handling and retry logic as needed
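For the last point, a generic wrapper is a reasonable starting place. The sketch below retries a failed async call with exponential backoff (`withRetry`, the attempt count, and the delay values are illustrative, not part of the bridge):

```javascript
// Retry an async operation with exponential backoff: wait baseDelayMs,
// then 2x, 4x, ... between attempts; rethrow the last error on failure.
async function withRetry(fn, { attempts = 3, baseDelayMs = 200 } = {}) {
  let lastError;
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (i < attempts - 1) {
        await new Promise(resolve => setTimeout(resolve, baseDelayMs * 2 ** i));
      }
    }
  }
  throw lastError;
}

// Usage: wrap any service call, e.g. withRetry(() => mcpApi.checkHealth()).
```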
## 📚 Additional Resources
- FastAPI docs: http://localhost:8000/docs
- Health check: http://localhost:8000/health
- Available models: http://localhost:8000/api/models
Your React app can now communicate seamlessly with your MCP server and Ollama models!