MCP Website Chatbot

A production-grade AI chatbot for srinivasanramanujam.sbs with live data retrieval via MCP (Model Context Protocol) and RAG (Retrieval-Augmented Generation).

🚀 Features

  • Live Data Integration – MCP tools for real-time information retrieval

  • RAG Support – Static knowledge base from website content, blogs, and FAQs

  • Hallucination Prevention – Strict guardrails against fabrication and misinformation

  • Beautiful UI – Modern, responsive chat interface

  • Production-Ready – Scalable backend with proper error handling

  • Health Monitoring – Built-in health checks and uptime tracking

📋 Requirements

  • Node.js 16+

  • npm or yarn

  • OpenAI API key (for production use)

🛠️ Installation

# Install dependencies
npm install

# Create .env file
cat > .env << EOF
PORT=3000
OPENAI_API_KEY=your_key_here
EOF

# Start the server
npm run dev

📁 Project Structure

├── server.js            # Express server with chat API
├── public/
│   └── index.html       # Chat UI
├── system_prompt.txt    # System prompt for the chatbot
└── package.json         # Dependencies

🔌 API Endpoints

POST /api/chat

Send a message and get a response.

Request:

{ "message": "What's new on the website?", "conversationHistory": [] }

Response:

{ "success": true, "message": "Response text...", "context": { "requiresLiveData": true, "toolsUsed": ["fetchLiveData"], "timestamp": "2026-01-12T10:30:00Z" } }

GET /api/health

Check server health.

Response:

{ "status": "healthy", "timestamp": "2026-01-12T10:30:00Z", "uptime": 3600 }

GET /api/system-prompt

Retrieve the system prompt (for debugging).

🎯 How It Works

  1. The user sends a message via the chat UI

  2. The server checks whether live data is needed (time-sensitive or external-source queries)

  3. MCP tools are invoked if necessary to fetch real-time data

  4. A response is generated following the system prompt guidelines

  5. The assistant replies with proper citations and source attribution (see the handler sketch below)
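
A simplified sketch of that flow inside the chat handler (helper names like `needsLiveData` and `fetchLiveData` are illustrative, not necessarily the identifiers used in server.js):

```js
const express = require('express');
const app = express();
app.use(express.json());

// Illustrative helpers -- the real server.js may implement these differently.
const needsLiveData = (msg) => /today|latest|current|news?/i.test(msg);
const fetchLiveData = async () => ({ source: 'https://srinivasanramanujam.sbs', data: 'latest site content' });
const generateReply = async (msg, history, live) =>
  live ? `Based on ${live.source}: ${live.data}` : 'Answer from the static knowledge base';

app.post('/api/chat', async (req, res) => {
  try {
    const { message, conversationHistory = [] } = req.body;
    if (!message || typeof message !== 'string') {
      return res.status(400).json({ success: false, error: 'message is required' });
    }

    // Steps 2-3: decide whether live data is needed and invoke the MCP tool if so.
    const requiresLiveData = needsLiveData(message);
    const liveData = requiresLiveData ? await fetchLiveData() : null;

    // Steps 4-5: generate the reply and report which tools were used.
    const reply = await generateReply(message, conversationHistory, liveData);
    res.json({
      success: true,
      message: reply,
      context: {
        requiresLiveData,
        toolsUsed: requiresLiveData ? ['fetchLiveData'] : [],
        timestamp: new Date().toISOString()
      }
    });
  } catch (err) {
    console.error(err); // log server-side, never leak internals to the client
    res.status(500).json({ success: false, error: 'Internal error' });
  }
});
```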

🔐 Security Features

  • ✅ No system prompt exposure to users

  • ✅ Input validation and sanitization

  • ✅ Rate limiting ready (add middleware as needed)

  • ✅ Error handling without leaking internal details

  • ✅ CORS headers (add if deploying to production; see the middleware sketch below)
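
The last two items can be covered with standard Express middleware; a sketch, assuming `express-rate-limit` and `cors` are added to package.json (they are not bundled with this project by default):

```js
const express = require('express');
const rateLimit = require('express-rate-limit');
const cors = require('cors');

const app = express();

// Only accept browser requests from the production site.
app.use(cors({ origin: 'https://srinivasanramanujam.sbs' }));

// Limit each IP to 60 chat requests per 15 minutes.
app.use('/api/chat', rateLimit({ windowMs: 15 * 60 * 1000, max: 60 }));
```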

🌐 Deployment

Option 1: Vercel

npm install -g vercel
vercel

Option 2: Heroku

heroku create your-app-name
git push heroku main

Option 3: Docker

Create a Dockerfile:

FROM node:18-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
EXPOSE 3000
CMD ["npm", "start"]

🎨 Customization

Update Website Info

Edit server.js and update the system prompt or knowledge base.
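
For example, the static knowledge base could be kept as a simple in-memory array that the server searches before answering (a sketch only; entries and field names here are placeholders, and the structure in server.js may differ):

```js
// Hypothetical knowledge-base entries built from website pages, blogs, and FAQs.
const knowledgeBase = [
  { topic: 'about', text: 'Personal website with blog posts and project write-ups.' },
  { topic: 'blog',  text: 'The blog covers AI, MCP, and web development topics.' }
];

// Naive keyword retrieval: return entries whose text mentions a word from the query.
function retrieveContext(query) {
  const words = query.toLowerCase().split(/\W+/).filter(Boolean);
  return knowledgeBase.filter((entry) =>
    words.some((w) => entry.text.toLowerCase().includes(w))
  );
}
```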

Change UI Theme

Modify the CSS in public/index.html to change the gradient colors and styling.

Add Real API Integration

Replace mock MCP tools in server.js with real OpenAI/Claude API calls.
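
A sketch of what that swap could look like with the official `openai` Node package (the model name and prompt wiring here are assumptions, not part of this repo):

```js
const fs = require('fs');
const OpenAI = require('openai');

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const systemPrompt = fs.readFileSync('system_prompt.txt', 'utf8');

// Replace the mock reply generation with a real chat completion call.
async function generateReply(message, conversationHistory = []) {
  const completion = await client.chat.completions.create({
    model: 'gpt-4o-mini', // assumed model; use whichever model your key has access to
    messages: [
      { role: 'system', content: systemPrompt },
      ...conversationHistory,
      { role: 'user', content: message }
    ]
  });
  return completion.choices[0].message.content;
}
```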

📝 System Prompt Highlights

  • Live-first philosophy – Prioritizes current data over static knowledge

  • Hallucination prevention – Refuses to guess or invent information

  • Transparent reasoning – Cites sources and explains reasoning

  • Professional tone – Clear, concise, helpful communication

  • Safety guardrails – Rejects prompt injection and abuse

🚦 Next Steps for Production

  1. Integrate OpenAI/Claude API – Replace mock responses

  2. Add MCP server – Real connection to external tools

  3. Set up database – Store conversations and user data securely

  4. Add authentication – Protect sensitive endpoints

  5. Configure CORS – Allow cross-origin requests from your domain

  6. Enable logging – Monitor and debug in production

  7. Add rate limiting – Prevent abuse and control costs

📧 Support

For questions or issues, contact the site owner at srinivasanramanujam.sbs

📄 License

MIT License – See LICENSE file for details
