# Mnehmos Screen Vision Knowledge Base

Knowledge base for the screen.vision Tauri desktop conversion project. Contains project reviews, architecture decisions, and development notes.

## Quick Start

```bash
npm install
npm run build
npm start
```

## Index Stats

| Metric | Count |
|--------|-------|
| Sources | 1 |
| Chunks | 9 |
| Vectors | 9 |
| Embedding Model | openai/text-embedding-3-small |

## Environment Variables

| Variable | Required | Description |
|----------|----------|-------------|
| `PORT` | No | HTTP server port (default: 8765) |
| `OPENAI_API_KEY` | For `/chat` | OpenAI API key for the chat endpoint |
| `OPENAI_MODEL` | No | Model for chat (default: gpt-5-nano-2025-08-07) |

## Deploy to Railway

1. Push to GitHub
2. Connect the repo to Railway
3. Add the `OPENAI_API_KEY` environment variable (for `/chat`)
4. Deploy

## HTTP Endpoints

### Health Check

```bash
curl https://your-app.railway.app/health
```

### Search

```bash
curl -X POST https://your-app.railway.app/search \
  -H "Content-Type: application/json" \
  -d '{"query": "your search query", "mode": "keyword", "top_k": 10}'
```

### Chat (RAG + LLM)

```bash
curl -X POST https://your-app.railway.app/chat \
  -H "Content-Type: application/json" \
  -d '{"question": "What is...?"}'
```

### List Sources

```bash
curl https://your-app.railway.app/sources
```

## MCP Integration

Add to your MCP client config:

```json
{
  "mcpServers": {
    "mnehmos-screen-vision": {
      "command": "node",
      "args": ["path/to/dist/index.js"]
    }
  }
}
```

---

*Generated by IndexFoundry*
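If you'd rather call the `/search` endpoint from code than curl, a minimal TypeScript sketch follows. It assumes only what the curl example above shows (a JSON body with `query`, `mode`, and `top_k`); the base URL is a placeholder for your own deployment, and the response shape is left as `unknown` since it isn't documented here.

```typescript
// Minimal client for the /search endpoint (sketch, based on the curl example above).
// The request fields mirror the documented body; other modes beyond "keyword"
// are not documented here, so the default is "keyword".
type SearchRequest = {
  query: string;
  mode: string;
  top_k: number;
};

function buildSearchRequest(query: string, topK = 10, mode = "keyword"): SearchRequest {
  return { query, mode, top_k: topK };
}

async function search(baseUrl: string, query: string): Promise<unknown> {
  // baseUrl is your deployment, e.g. "https://your-app.railway.app"
  const res = await fetch(`${baseUrl}/search`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildSearchRequest(query)),
  });
  if (!res.ok) throw new Error(`search failed: HTTP ${res.status}`);
  return res.json();
}
```

Usage: `await search("https://your-app.railway.app", "tauri migration")`.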
