Sequential Questioning MCP Server

A specialized server that enables LLMs (Large Language Models) to gather specific information through sequential questioning. This project implements the MCP (Model Context Protocol) standard for seamless integration with LLM clients.
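As an illustration, MCP clients typically invoke a server's tools with a JSON-RPC 2.0 tools/call request. The sketch below builds such a request with Python's standard library; the tool name sequential_questioning and its argument names are illustrative assumptions, not this server's confirmed schema.

```python
import json

# Minimal sketch of an MCP JSON-RPC 2.0 "tools/call" request.
# The tool name and argument names are illustrative assumptions,
# not the server's confirmed schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "sequential_questioning",        # hypothetical tool name
        "arguments": {
            "topic": "deployment requirements",  # what to ask about
            "previous_answers": [],              # answers gathered so far
        },
    },
}

payload = json.dumps(request)
print(payload)
```

A client would send this payload over the MCP transport (stdio or HTTP) and receive the next question in the JSON-RPC response.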

Project Status

🎉 Version 1.0.0 Released 🎉

The Sequential Questioning MCP Server is now complete and ready for production deployment. All planned features have been implemented, tested, and documented.

Features

  • Sequential Questioning Engine: Generates contextually appropriate follow-up questions based on previous responses
  • MCP Protocol Support: Full implementation of the MCP specification for integration with LLMs
  • Robust API: RESTful API with comprehensive validation and error handling
  • Vector Database Integration: Efficient storage and retrieval of question patterns
  • Comprehensive Monitoring: Performance metrics and observability with Prometheus and Grafana
  • Production-Ready Deployment: Kubernetes deployment configuration with multi-environment support
  • High Availability: Horizontal Pod Autoscaler and Pod Disruption Budget for production reliability
  • Security: Network policies to restrict traffic and secure the application
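To make the first feature concrete, here is a deliberately simplified sketch of how a sequential questioning engine can derive a follow-up question from previous responses. The function name and heuristics are illustrative assumptions, not the production engine's logic.

```python
def next_question(previous_answers: list[str]) -> str:
    """Toy sequential-questioning step: choose the next question
    based on what has already been answered. The rules here are
    illustrative assumptions, not the production engine's logic."""
    if not previous_answers:
        # Nothing gathered yet: open with a broad question.
        return "What problem are you trying to solve?"
    last = previous_answers[-1].lower()
    if "not sure" in last or "don't know" in last:
        # Vague answer: ask for something concrete before moving on.
        return "Could you describe a concrete example of that?"
    # Otherwise drill one level deeper into the latest answer.
    return f"You mentioned: '{previous_answers[-1]}'. What constraints apply to that?"

# Example exchange
print(next_question([]))
print(next_question(["We need to deploy a web app"]))
```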

Documentation

Getting Started

Prerequisites

  • Python 3.10+
  • Docker and Docker Compose (for local development)
  • Kubernetes cluster (for production deployment)
  • PostgreSQL 15.4+
  • Access to a Qdrant instance

Quick Start

The easiest way to get started is to use our initialization script:

./scripts/initialize_app.sh

This script will:

  1. Check if Docker is running
  2. Start all necessary containers with Docker Compose
  3. Run database migrations automatically
  4. Provide information on how to access the application

The application will be available at http://localhost:8001
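Once the containers are up, you can verify the service with a plain HTTP request. The sketch below constructs one with Python's standard library; the /health path is an assumed convention — check the running service's API docs for the actual endpoint.

```python
import urllib.request

BASE_URL = "http://localhost:8001"

# The /health path is an assumed convention; consult the running
# service's API docs for the actual endpoint.
req = urllib.request.Request(f"{BASE_URL}/health", method="GET")

print(req.full_url)  # http://localhost:8001/health

# Against a running instance you would then do:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status, resp.read())
```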

Local Development

  1. Clone the repository
    git clone https://github.com/your-organization/sequential-questioning.git
    cd sequential-questioning
  2. Install dependencies
    pip install -e ".[dev]"
  3. Set up environment variables
    cp .env.example .env
    # Edit the .env file with your configuration
  4. Run the development server
    uvicorn app.main:app --reload

Docker Deployment

docker-compose up -d

Database Setup

If you're starting the application manually, don't forget to run the database migrations:

export DATABASE_URL="postgresql://postgres:postgres@localhost:5432/postgres"
bash scripts/run_migrations.sh

Kubernetes Deployment

  1. Development Environment
    kubectl apply -k k8s/overlays/dev
  2. Staging Environment
    kubectl apply -k k8s/overlays/staging
  3. Production Environment
    kubectl apply -k k8s/overlays/prod

See the Final Deployment Plan and Operational Runbook for detailed instructions.

Monitoring

Access Prometheus and Grafana dashboards for monitoring:

kubectl port-forward -n monitoring svc/prometheus 9090:9090
kubectl port-forward -n monitoring svc/grafana 3000:3000

CI/CD Pipeline

Automated CI/CD pipeline with GitHub Actions:

  • Continuous Integration: Linting, type checking, and testing
  • Continuous Deployment: Automated deployments to dev, staging, and production
  • Deployment Verification: Automated checks post-deployment

Testing

Run the test suite:

pytest

Run performance tests:

python -m tests.performance.test_sequential_questioning_load

Troubleshooting

Database Tables Not Created

If the application is running but the database tables don't exist:

  1. Make sure the database container is running
  2. Run the database migrations manually:
    export DATABASE_URL="postgresql://postgres:postgres@localhost:5432/postgres"
    bash scripts/run_migrations.sh

Pydantic Version Compatibility

If you encounter the error pydantic.errors.PydanticImportError: BaseSettings has been moved to the pydantic-settings package, ensure that:

  1. The pydantic-settings package is included in your dependencies
  2. You're importing BaseSettings from pydantic_settings instead of directly from pydantic

This project uses Pydantic v2.x which moved BaseSettings to a separate package.
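A minimal sketch of the v2-compatible import, with a stub fallback so the snippet runs even where pydantic-settings is not installed. The AppSettings fields and defaults are illustrative assumptions about this project's configuration.

```python
# Pydantic v2 moved BaseSettings into the separate pydantic-settings package;
# under v2, `from pydantic import BaseSettings` raises PydanticImportError.
try:
    from pydantic_settings import BaseSettings  # pip install pydantic-settings
except ImportError:
    # Stub fallback so this sketch runs without the package installed.
    class BaseSettings:
        pass

class AppSettings(BaseSettings):
    # Field names and defaults are illustrative assumptions.
    database_url: str = "postgresql://postgres:postgres@localhost:5432/postgres"
    app_port: int = 8001

settings = AppSettings()
print(settings.app_port)
```

With the real BaseSettings, values can also be overridden by environment variables (e.g. DATABASE_URL), which is how the .env file feeds the application.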

Contributing

Contributions are welcome! Please see CONTRIBUTING.md for guidelines.

License

MIT License

Contact

For support or inquiries, contact support@example.com
