Glama

RAGandLLM-MCP

by iijimam
pyproject.toml

```toml
[project]
name = "ragandllm-mcp"
version = "0.1.0"
description = "Let's enjoy RAG and LLM!"
readme = "README.md"
requires-python = ">=3.12"
dependencies = [
    "mcp>=1.12.0",
]

[[project.authors]]
name = "mihokoiijima"
email = "miijima@intersystems.com"

[build-system]
requires = [
    "hatchling",
]
build-backend = "hatchling.build"

[project.scripts]
ragandllm-mcp = "ragandllm_mcp:main"
```
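The `[project.scripts]` table above registers a console command named `ragandllm-mcp` that resolves to the spec `ragandllm_mcp:main`: the part before the colon is the importable module, the part after is the callable the installer invokes. The package's actual module contents are not shown in this listing; the sketch below only illustrates how such a spec is split, using a hypothetical `parse_entry_point` helper.

```python
def parse_entry_point(spec: str) -> tuple[str, str]:
    """Split a console-script spec like 'module:attr' into (module, attribute)."""
    module, _, attr = spec.partition(":")
    return module, attr

# The spec from this project's [project.scripts] table:
module, attr = parse_entry_point("ragandllm_mcp:main")
print(module, attr)  # ragandllm_mcp main
```

An installer generates a small launcher that imports `module` and calls `attr()` when the `ragandllm-mcp` command is run.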

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/iijimam/RAGandLLM-MCP'
```
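The same lookup can be done from Python. This is a minimal sketch, assuming only the endpoint shown in the curl example; the shape of the JSON response is not documented here, so the `fetch_server` helper simply returns the decoded payload.

```python
import json
import urllib.request

BASE = "https://glama.ai/api/mcp/v1/servers"

def server_url(owner: str, repo: str) -> str:
    """Build the directory-API URL for one server, mirroring the curl example."""
    return f"{BASE}/{owner}/{repo}"

def fetch_server(owner: str, repo: str) -> dict:
    """Fetch a server's metadata from the directory API (response assumed JSON)."""
    with urllib.request.urlopen(server_url(owner, repo)) as resp:
        return json.load(resp)

print(server_url("iijimam", "RAGandLLM-MCP"))
```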

If you have feedback or need assistance with the MCP directory API, please join our Discord server.