Brightsy MCP Server

This is a Model Context Protocol (MCP) server that connects to a Brightsy AI agent.

Installation

npm install

Usage

To start the server:

npm start -- --agent-id=<your-agent-id> --api-key=<your-api-key>

Or with positional arguments:

npm start -- <your-agent-id> <your-api-key> [tool-name] [message]

You can also provide an initial message to be sent to the agent:

npm start -- --agent-id=<your-agent-id> --api-key=<your-api-key> --message="Hello, agent!"

Customizing the Tool Name

By default, the MCP server registers a tool named "brightsy". You can customize this name using the --tool-name parameter:

npm start -- --agent-id=<your-agent-id> --api-key=<your-api-key> --tool-name=<custom-tool-name>

You can also set the tool name as the third positional argument:

npm start -- <your-agent-id> <your-api-key> <custom-tool-name>

Or using the BRIGHTSY_TOOL_NAME environment variable:

export BRIGHTSY_TOOL_NAME=custom-tool-name
npm start -- --agent-id=<your-agent-id> --api-key=<your-api-key>
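
For reference, the sketch below shows one plausible way these three sources could be combined. It is an illustration only; the function name resolveToolName and the exact precedence are assumptions, not taken from this server's source:

// Hypothetical sketch: resolve the tool name from a --tool-name flag, the
// third positional argument, the BRIGHTSY_TOOL_NAME environment variable,
// or the default, in that assumed order. The real server may differ.
function resolveToolName(argv: string[], env: NodeJS.ProcessEnv): string {
  const flag = argv.find((a) => a.startsWith("--tool-name="));
  if (flag) return flag.split("=")[1];

  const positional = argv.filter((a) => !a.startsWith("--"));
  if (positional.length >= 3) return positional[2]; // <agent-id> <api-key> <tool-name>

  return env.BRIGHTSY_TOOL_NAME ?? "brightsy";
}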

Environment Variables

The following environment variables can be used to configure the server:

  • BRIGHTSY_AGENT_ID: The agent ID to use (alternative to command line argument)
  • BRIGHTSY_API_KEY: The API key to use (alternative to command line argument)
  • BRIGHTSY_TOOL_NAME: The tool name to register (default: "brightsy")

Testing the agent_proxy Tool

The agent_proxy tool allows you to proxy requests to a Brightsy AI agent. To test this tool, you can use the provided test scripts.

Prerequisites

Before running the tests, set the following environment variables:

export AGENT_ID=your-agent-id
export API_KEY=your-api-key

# Optional: customize the tool name for testing
export TOOL_NAME=custom-tool-name

Alternatively, you can pass these values as command-line arguments:

# Using named arguments
npm run test:cli -- --agent-id=your-agent-id --api-key=your-api-key --tool-name=custom-tool-name

# Using positional arguments
npm run test:cli -- your-agent-id your-api-key custom-tool-name

Running the Tests

To run all tests:

npm test

To run specific tests:

# Test using the command line interface
npm run test:cli

# Test using the direct MCP protocol
npm run test:direct

Test Scripts

  1. Command Line Test (test-agent-proxy.ts): Tests the agent_proxy tool by running the MCP server with a test message.
  2. Direct MCP Protocol Test (test-direct.ts): Tests the agent_proxy tool by sending a properly formatted MCP request directly to the server.
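
For orientation, a direct protocol test generally writes a JSON-RPC tools/call request to the server's stdin. The sketch below shows the general shape of such a request; the exact payload used by test-direct.ts is assumed, not copied from it:

// Hedged sketch of an MCP "tools/call" request (newline-delimited JSON-RPC).
// Field names follow the MCP specification; the concrete values are examples.
const callToolRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "brightsy", // or the custom tool name you configured
    arguments: {
      messages: [{ role: "user", content: "Hello, agent!" }],
    },
  },
};

console.log(JSON.stringify(callToolRequest));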

How the Tool Works

The MCP server registers a tool (named "brightsy" by default) that forwards requests to an OpenAI-compatible AI agent and returns the response. It takes a messages parameter, which is an array of message objects with role and content properties.
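
A rough sketch of that argument shape follows; the type names are illustrative, and the role values are an assumption based on typical OpenAI-style messages:

// Illustrative types only; the server's actual schema may differ.
interface BrightsyMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface BrightsyToolArgs {
  messages: BrightsyMessage[];
}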

Example usage in an MCP client:

// Using the default tool name
const response = await client.callTool("brightsy", {
  messages: [
    { role: "user", content: "Hello, can you help me with a simple task?" }
  ]
});

// Or, if a custom tool name is configured
const customResponse = await client.callTool("custom-tool-name", {
  messages: [
    { role: "user", content: "Hello, can you help me with a simple task?" }
  ]
});

The response will contain the agent's reply in the content field.
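
Continuing the example above, MCP tool results typically carry a content array of typed items, so a text reply can be read roughly as follows (the exact item shape returned by this server is assumed):

// Assumed result shape: an array of { type: "text", text: string } items.
for (const item of response.content) {
  if (item.type === "text") {
    console.log(item.text);
  }
}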
