
Model Context Protocol Server

by arkeodev

Search Engine with RAG and MCP

A powerful search engine that combines LangChain, Model Context Protocol (MCP), Retrieval-Augmented Generation (RAG), and Ollama to create an agentic AI system capable of searching the web, retrieving information, and providing relevant answers.

Features

  • Web search capabilities using the Exa API
  • Web content retrieval using FireCrawl
  • RAG (Retrieval-Augmented Generation) for more relevant information extraction
  • MCP (Model Context Protocol) server for standardized tool invocation
  • Support for both local LLMs via Ollama and cloud-based LLMs via OpenAI
  • Flexible architecture supporting direct search, agent-based search, or server mode
  • Comprehensive error handling and graceful fallbacks
  • Python 3.13+ with type hints
  • Asynchronous processing for efficient web operations

Architecture

This project integrates several key components:

  1. Search Module: Uses Exa API to search the web and FireCrawl to retrieve content
  2. RAG Module: Embeds documents, chunks them, and stores them in a FAISS vector store
  3. MCP Server: Provides a standardized protocol for tool invocation
  4. Agent: LangChain-based agent that uses the search and RAG capabilities
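The RAG module's chunk → embed → store → retrieve flow (component 2 above) can be illustrated with a deliberately simplified, dependency-free sketch. The real project uses FAISS and proper embedding models; every name below (`ToyVectorStore`, the bag-of-words `embed`, the chunk sizes) is illustrative only:

```python
import math
from collections import Counter


def chunk(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows."""
    step = size - overlap
    return [text[start:start + size]
            for start in range(0, max(len(text) - overlap, 1), step)]


def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding' standing in for a real embedding model."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class ToyVectorStore:
    """Minimal stand-in for the FAISS store: embed chunks, rank by similarity."""

    def __init__(self) -> None:
        self.entries: list[tuple[str, Counter]] = []

    def add_document(self, text: str) -> None:
        for c in chunk(text):
            self.entries.append((c, embed(c)))

    def search(self, query: str, k: int = 3) -> list[str]:
        q = embed(query)
        ranked = sorted(self.entries, key=lambda e: cosine(q, e[1]), reverse=True)
        return [text for text, _ in ranked[:k]]
```

The shape is the same as the production pipeline: documents are split into overlapping chunks, each chunk is embedded once at indexing time, and a query is embedded and matched against the stored vectors at search time.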

Project Structure

```
search-engine-with-rag-and-mcp/
├── LICENSE                  # MIT License
├── README.md                # Project documentation
├── data/                    # Data directories
├── docs/                    # Documentation
│   └── env_template.md      # Environment variables documentation
├── logs/                    # Log files directory (auto-created)
├── src/                     # Main package (source code)
│   ├── __init__.py
│   ├── core/                # Core functionality
│   │   ├── __init__.py
│   │   ├── main.py          # Main entry point
│   │   ├── search.py        # Web search module
│   │   ├── rag.py           # RAG implementation
│   │   ├── agent.py         # LangChain agent
│   │   └── mcp_server.py    # MCP server implementation
│   └── utils/               # Utility modules
│       ├── __init__.py
│       ├── env.py           # Environment variable loading
│       └── logger.py        # Logging configuration
├── pyproject.toml           # Poetry configuration
├── requirements.txt         # Project dependencies
└── tests/                   # Test directory
```

Getting Started

Prerequisites

  • Python 3.13+
  • Poetry (optional, for development)
  • API keys for Exa and FireCrawl
  • (Optional) Ollama installed locally
  • (Optional) OpenAI API key

Installation

  1. Clone the repository

```
git clone https://github.com/yourusername/search-engine-with-rag-and-mcp.git
cd search-engine-with-rag-and-mcp
```

  2. Install dependencies

```
# Using pip
pip install -r requirements.txt

# Or using poetry
poetry install
```

  3. Create a .env file (use docs/env_template.md as a reference)

Usage

The application has three main modes of operation:

1. Direct Search Mode (Default)

```
# Using pip
python -m src.core.main "your search query"

# Or using poetry
poetry run python -m src.core.main "your search query"
```

2. Agent Mode

```
python -m src.core.main --agent "your search query"
```

3. MCP Server Mode

```
python -m src.core.main --server
```

You can also specify custom host and port:

```
python -m src.core.main --server --host 0.0.0.0 --port 8080
```
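The three modes and the `--host`/`--port` flags suggest a dispatcher roughly like the following sketch. This is a guess at the shape of `src/core/main.py`, not its actual code, and the default host/port values are assumptions:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical reconstruction of the CLI described above."""
    parser = argparse.ArgumentParser(prog="search-engine")
    parser.add_argument("query", nargs="?", help="search query (direct/agent modes)")
    parser.add_argument("--agent", action="store_true", help="run the LangChain agent")
    parser.add_argument("--server", action="store_true", help="start the MCP server")
    parser.add_argument("--host", default="127.0.0.1", help="server bind address")
    parser.add_argument("--port", type=int, default=8000, help="server port")
    return parser


def pick_mode(args: argparse.Namespace) -> str:
    """--server wins over --agent; a bare query means direct search."""
    if args.server:
        return "server"
    return "agent" if args.agent else "direct"
```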

Using Ollama (Optional)

To use Ollama for local embeddings and LLM capabilities:

  1. Install Ollama: https://ollama.ai/
  2. Pull a model:

```
ollama pull mistral:latest
```

  3. Set the appropriate environment variables in your .env file:

```
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=mistral:latest
```
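Reading these variables might look like the sketch below. The fallback defaults mirror the values shown above; the helper names are illustrative, though `/api/generate` is Ollama's real REST generation endpoint:

```python
import os


def ollama_config() -> tuple[str, str]:
    """Read Ollama settings from the environment, with the documented defaults."""
    base_url = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
    model = os.environ.get("OLLAMA_MODEL", "mistral:latest")
    # Normalize a trailing slash so endpoint paths join cleanly.
    return base_url.rstrip("/"), model


def generate_endpoint(base_url: str) -> str:
    """Ollama serves text generation at POST {base_url}/api/generate."""
    return f"{base_url}/api/generate"
```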

Development

This project follows these best practices:

  • Code formatting: Black and isort for consistent code style
  • Type checking: mypy for static type checking
  • Linting: flake8 for code quality
  • Testing: pytest for unit and integration tests
  • Environment Management: python-dotenv for managing environment variables
  • Logging: Structured logging to both console and file

License

This project is licensed under the MIT License - see the LICENSE file for details.


