LitSynth MCP Server

pyproject.toml

[project]
name = "ai-research-assistant"
version = "0.1.0"
description = "The AI Research Assistant is an intelligent research companion designed to help researchers quickly find, filter, and summarize the latest scientific papers. By leveraging Mistral LLMs, Hugging Face embeddings, and a FastMCP server, this tool enables semantic search across papers from arXiv and datasets/models from Hugging Face Hub."
readme = "README.md"
requires-python = ">=3.11"
dependencies = [
    "datasets>=4.0.0",
    "fastmcp>=2.12.2",
    "feedparser>=6.0.11",
    "huggingface-hub>=0.34.4",
    "requests>=2.32.5",
    "sentence-transformers>=5.1.0",
]
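To illustrate how the dependencies above fit together, here is a minimal, hypothetical sketch of a FastMCP tool that fetches arXiv entries with feedparser and ranks them against a query using sentence-transformers embeddings. The tool name, embedding model, and result limits are assumptions for illustration and are not taken from the LitSynth source.

    from urllib.parse import quote

    import feedparser
    from fastmcp import FastMCP
    from sentence_transformers import SentenceTransformer, util

    mcp = FastMCP("ai-research-assistant")
    # Any Hugging Face embedding model could be used here; this one is an assumption.
    model = SentenceTransformer("all-MiniLM-L6-v2")

    @mcp.tool
    def search_arxiv(query: str, max_results: int = 5) -> list[dict]:
        """Semantic search over arXiv papers matching the query."""
        feed = feedparser.parse(
            "http://export.arxiv.org/api/query?"
            f"search_query=all:{quote(query)}&max_results=25"
        )
        papers = [{"title": e.title, "summary": e.summary, "link": e.link}
                  for e in feed.entries]
        if not papers:
            return []
        # Embed the query and the abstracts, then rank by cosine similarity.
        query_emb = model.encode(query, convert_to_tensor=True)
        doc_embs = model.encode([p["summary"] for p in papers],
                                convert_to_tensor=True)
        scores = util.cos_sim(query_emb, doc_embs)[0]
        ranked = sorted(zip(papers, scores.tolist()),
                        key=lambda x: x[1], reverse=True)
        return [paper for paper, _ in ranked[:max_results]]

    if __name__ == "__main__":
        mcp.run()

Running the script starts the MCP server over stdio, so an MCP client can call search_arxiv as a tool.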

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/RayaneChCh-dev/LitSynth-MCP-Server'
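The same lookup can be done from Python with the requests library already listed in the project dependencies. This sketch assumes only that the endpoint returns JSON; it does not assume any particular response fields.

    import requests

    # Fetch this server's directory entry, mirroring the curl command above.
    resp = requests.get(
        "https://glama.ai/api/mcp/v1/servers/RayaneChCh-dev/LitSynth-MCP-Server"
    )
    resp.raise_for_status()
    print(resp.json())  # server metadata as returned by the directory API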

If you have feedback or need assistance with the MCP directory API, please join our Discord server.