ProtoLinkAI MCP Server
ProtoLinkAI is a standardized tool-wrapping framework for implementing and managing diverse tools in a unified way. It is designed to help developers quickly integrate and launch tool-based use cases.
Key Features
- Standardized Wrapping: Provides an abstraction layer for building tools using the MCP protocol.
- Flexible Use Cases: Easily add or remove tools to fit your specific requirements.
- Out-of-the-Box Tools: Includes pre-built tools for common scenarios:
  - Twitter Management: Automate tweeting, replying, and managing Twitter interactions.
  - Crypto: Get the latest cryptocurrency prices.
  - ElizaOS Integration: Seamlessly connect and interact with ElizaOS for enhanced automation.
  - Time utilities
  - Weather information (API)
  - Dictionary lookups
  - Calculator for mathematical expressions
  - Currency exchange (API)
  - Stocks Data: Access real-time and historical stock market information.
  - [WIP] News: Retrieve the latest news headlines.
Tech Stack
- Python: Core programming language
- MCP Framework: Communication protocol
- Docker: Containerization
What is MCP?
The Model Context Protocol (MCP) is an open standard for context sharing and management across AI models and systems. Think of it as the language AI agents use to interact seamlessly.
Here's why MCP matters:
- Standardization: MCP defines how context can be shared across models, enabling interoperability.
- Scalability: It's built to handle large-scale AI systems with high throughput.
- Security: Robust authentication and fine-grained access control.
- Flexibility: Works across diverse systems and AI architectures.
Installation
Install via PyPI
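The original install command is missing here; the package name below is an assumption based on the directory used in the Development section (`mcpagentai`), so verify the published name on PyPI before installing:

```shell
# Assumed package name; verify the published name on PyPI
pip install mcpagentai
```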
Usage
Run Locally
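A minimal sketch of a local run, assuming the entry point is the `server.py` file mentioned in the tool-selection tutorial below; adjust the command to your installation:

```shell
# Hypothetical invocation; the actual entry point may differ
python server.py
```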
Run in Docker
- Build the Docker image (Docker image names must be lowercase):
  ```shell
  docker build -t protolinkai .
  ```
- Run the container:
  ```shell
  docker run -i --rm protolinkai
  ```
Twitter Integration
ProtoLinkAI offers robust Twitter integration, allowing you to automate tweeting, replying, and managing Twitter interactions. This section provides detailed instructions on configuring and using the Twitter integration, either via Docker or via a .env file together with scripts/run_agent.sh.
Docker Environment Variables for Twitter Integration
When running ProtoLinkAI within Docker, it's essential to configure environment variables for Twitter integration. These variables are divided into two categories:
1. Agent Node Client Credentials
These credentials are used by the Node.js client within the agent for managing Twitter interactions.
2. Tweepy (Twitter API v2) Credentials
These credentials are utilized by Tweepy for interacting with Twitter's API v2.
Running ProtoLinkAI with Docker
- Build the Docker image:
  ```shell
  docker build -t protolinkai .
  ```
- Run the container:
  ```shell
  docker run -i --rm protolinkai
  ```
Running ProtoLinkAI with .env + scripts/run_agent.sh
Setting Up Environment Variables
Create a .env file in the root directory of your project and add the following environment variables:
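The original list of variables is not included here; the names below are illustrative, not prescriptive, covering the two credential categories described above. Use the variable names your version of ProtoLinkAI actually reads:

```shell
# Agent Node client credentials (illustrative names)
TWITTER_USERNAME=your_username
TWITTER_PASSWORD=your_password
TWITTER_EMAIL=you@example.com

# Tweepy (Twitter API v2) credentials (illustrative names)
TWITTER_API_KEY=your_api_key
TWITTER_API_SECRET=your_api_secret
TWITTER_ACCESS_TOKEN=your_access_token
TWITTER_ACCESS_TOKEN_SECRET=your_access_token_secret
TWITTER_BEARER_TOKEN=your_bearer_token
```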
Running the Agent
- Make the script executable:
  ```shell
  chmod +x scripts/run_agent.sh
  ```
- Run the agent:
  ```shell
  bash scripts/run_agent.sh
  ```
Summary
You can configure ProtoLinkAI to run with Twitter integration either by using Docker or by setting up environment variables in a .env file and running the scripts/run_agent.sh script.
This flexibility allows you to choose the method that best fits your deployment environment.
ElizaOS Integration
1. Directly Use Eliza Agents from ProtoLinkAI
This approach allows you to use Eliza Agents without running the Eliza Framework in the background. It simplifies the setup by embedding Eliza functionality directly within ProtoLinkAI.
Steps:
- Configure ProtoLinkAI to Use the Eliza MCP Agent:
  In your Python code, add the Eliza MCP Agent to the MultiToolAgent:
  ```python
  from ProtoLink.core.multi_tool_agent import MultiToolAgent
  from ProtoLink.tools.eliza_mcp_agent import eliza_mcp_agent

  multi_tool_agent = MultiToolAgent([
      # ... other agents
      eliza_mcp_agent,
  ])
  ```
Advantages:
- Simplified Setup: No need to manage separate background processes.
- Easier Monitoring: All functionalities are encapsulated within ProtoLinkAI.
- Highlight Feature: Emphasizes the flexibility of ProtoLinkAI in integrating various tools seamlessly.
2. Run the Eliza Framework from ProtoLinkAI
This method involves running the Eliza Framework as a separate background process alongside ProtoLinkAI.
Steps:
- Start the Eliza Framework:
  ```shell
  bash src/ProtoLinkai/tools/eliza/scripts/run.sh
  ```
- Monitor Eliza processes:
  ```shell
  bash src/ProtoLinkai/tools/eliza/scripts/monitor.sh
  ```
- Configure ProtoLinkAI to Use the Eliza Agent:
  In your Python code, add the Eliza Agent to the MultiToolAgent:
  ```python
  from ProtoLink.core.multi_tool_agent import MultiToolAgent
  from ProtoLink.tools.eliza_agent import eliza_agent

  multi_tool_agent = MultiToolAgent([
      # ... other agents
      eliza_agent,
  ])
  ```
Tutorial: Selecting Specific Tools
You can configure ProtoLinkAI to run only certain tools by modifying the agent configuration on your server or by updating the server.py file to load only the desired agents. For example:
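A sketch along these lines, following the import style of the ElizaOS examples below; the specific tool module names here are hypothetical, so substitute the agents actually shipped with your version:

```python
from ProtoLink.core.multi_tool_agent import MultiToolAgent
# Hypothetical tool modules; use the agents your version actually provides
from ProtoLink.tools.time_agent import time_agent
from ProtoLink.tools.weather_agent import weather_agent

# Only the selected tools are exposed by the server
multi_tool_agent = MultiToolAgent([
    time_agent,
    weather_agent,
])
```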
Integration Example: Claude Desktop Configuration
You can integrate ProtoLinkAI with Claude Desktop using the following configuration (claude_desktop_config.json); note that the path to a local ElizaOS repository is an optional argument:
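A hypothetical entry might look like the following; the server name, command, and both paths are placeholders to adapt to your setup, and the second argument (the local ElizaOS repository) is optional:

```json
{
  "mcpServers": {
    "protolinkai": {
      "command": "python",
      "args": [
        "/path/to/server.py",
        "/path/to/local/eliza"
      ]
    }
  }
}
```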
Development
- Clone this repository:
  ```shell
  git clone https://github.com/StevenROyola/ProtoLink.git
  cd ProtoLink
  ```
- (Optional) Create a virtual environment:
  ```shell
  python3 -m venv .venv
  source .venv/bin/activate
  ```
- Install dependencies:
  ```shell
  pip install -e .
  ```
- Build the package:
  ```shell
  python -m build
  ```
License: MIT
Enjoy!
This server provides a standardized framework using the Model Context Protocol (MCP) to seamlessly integrate and manage diverse tools, enabling features like Twitter automation, cryptocurrency updates, and ElizaOS interaction.