
Stata-MCP


Looking for others?

  • Trace DID: fetch the latest information about DID (Difference-in-Differences) here. A Chinese translation by Sepine Tam and the StataMCP-Team is now available 🎉
  • Jupyter Lab usage (important: requires Stata 17+): here
  • NBER-MCP & AER-MCP 🔧 under construction
  • Econometrics-Agent
  • TexIV: a machine-learning-driven framework that transforms text data into usable variables for empirical research
  • A VS Code / Cursor integration is available here. Confused about which to use? 💡 See the difference

💡 Quick Start

Standard config requirements: make sure Stata is installed at its default path and that the Stata CLI (on macOS and Linux) exists.

The standard config JSON is shown below; you can customize it by adding environment variables.

{
  "mcpServers": {
    "stata-mcp": {
      "command": "uvx",
      "args": ["stata-mcp"]
    }
  }
}
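As a sketch of the "adding environment variables" customization mentioned above: MCP client configs accept an `env` object per server entry, which the server process inherits at startup. The variable name `stata_cli` and the path below are hypothetical placeholders — check the Usage guide for the variables stata-mcp actually reads and for the path on your machine.

```json
{
  "mcpServers": {
    "stata-mcp": {
      "command": "uvx",
      "args": ["stata-mcp"],
      "env": {
        "stata_cli": "/Applications/Stata/StataMP.app/Contents/MacOS/stata-mp"
      }
    }
  }
}
```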

For more detailed usage information, visit the Usage guide.

For advanced usage, visit the Advanced guide.

Prerequisites

  • uv - Package installer and virtual environment manager
  • Claude, Cline, ChatWise, or other LLM service
  • Stata License
  • Your API-KEY from LLM

Installation

With the new version, you no longer need to install the stata-mcp package separately; just run the following commands to check whether your computer can use stata-mcp.

uvx stata-mcp --usable
uvx stata-mcp --version

If you want to use it locally, you can install it via pip or download the source code.

Download via pip

pip install stata-mcp

Download source code and compile

git clone https://github.com/sepinetam/stata-mcp.git
cd stata-mcp
uv build

Then you can find the built stata-mcp wheel in the dist directory. You can run it directly with uvx or add it to your PATH.

For example:

uvx /path/to/your/whl/stata_mcp-1.6.2-py3-none-any.whl  # this is the wheel file name; change it to match your version
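If you want your MCP client to use the locally built wheel rather than the published package, you can point uvx at the wheel path inside the config as well. This is a sketch under the same assumptions as the standard config above; adjust the path and version to match your build.

```json
{
  "mcpServers": {
    "stata-mcp": {
      "command": "uvx",
      "args": ["/path/to/your/whl/stata_mcp-1.6.2-py3-none-any.whl"]
    }
  }
}
```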

📝 Documentation

💡 Questions

🚀 Roadmap

  • macOS support
  • Windows support
  • Additional LLM integrations
  • Performance optimizations

⚠️ Disclaimer

This project is for research purposes only. I am not responsible for any damage caused by this project. Please ensure you have proper licensing to use Stata.

For more information, refer to the Statement.

🐛 Report Issues

If you encounter any bugs or have feature requests, please open an issue.

📄 License

Apache License 2.0

📚 Citation

If you use Stata-MCP in your research, please cite this repository using one of the following formats:

BibTeX

@software{sepinetam2025stata,
  author = {Song Tan},
  title = {Stata-MCP: Let LLM help you achieve your regression analysis with Stata},
  year = {2025},
  url = {https://github.com/sepinetam/stata-mcp},
  version = {1.6.2}
}

APA

Song Tan. (2025). Stata-MCP: Let LLM help you achieve your regression analysis with Stata (Version 1.6.0) [Computer software]. https://github.com/sepinetam/stata-mcp

Chicago

Song Tan. 2025. "Stata-MCP: Let LLM help you achieve your regression analysis with Stata." Version 1.6.0. https://github.com/sepinetam/stata-mcp.

📬 Contact

Email: sepinetam@gmail.com

Or contribute directly by submitting a Pull Request! We welcome contributions of all kinds, from bug fixes to new features.

❤️ Acknowledgements

The author sincerely thanks the official Stata team for their support and for authorizing a Stata license for test development.

✨ Star History

hybrid server

The server is able to function both locally and remotely, depending on the configuration or use case.

An MCP server that lets Large Language Models interact with Stata software to perform regression analysis and other statistical operations.
