Hybrid server
The server can run both locally and remotely, depending on the configuration or use case.
Integrations
Enables execution of code in Docker containers, allowing LLMs to run isolated code environments for data processing or computation tasks.
Provides video processing capabilities including transcription and frame extraction to prepare video content for LLM analysis.
Allows querying and analyzing repository data using Git tools, enabling statistical analysis of commits and other Git operations.
GenAIScript
Prompting is Coding
Programmatically assemble prompts for LLMs using JavaScript. Orchestrate LLMs, tools, and data in code.
- JavaScript toolbox to work with prompts
- Abstraction to make it easy and productive
- Seamless Visual Studio Code integration or flexible command line
- Built-in support for GitHub Copilot and GitHub Models, OpenAI, Azure OpenAI, Anthropic, and more
- 📄 Read the ONLINE DOCUMENTATION at microsoft.github.io/genaiscript
- 📝 Read the blog for the latest news
- 📺 Watch Mr. Maeda's Cozy AI Kitchen
- 📺 Watch an interview on YouTube with nickyt
Hello world
Say you want to create an LLM script that generates a 'hello world' poem. You can write the following script:
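A minimal sketch of such a script (GenAIScript files conventionally use a .genai.mjs or .genai.mts extension):

```js
// poem.genai.mjs
// the $ template tag appends this text to the prompt sent to the configured LLM
$`Write a 'hello world' poem.`
```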
The $ function is a template tag that creates a prompt. The prompt is then sent to the LLM you configured, which generates the poem.
Let's make it more interesting by adding files, data and structured output. Say you want to include a file in the prompt, and then save the output in a file. You can write the following script:
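A sketch along those lines, assuming a data.txt file exists in the workspace:

```js
// read a file from the workspace
const file = await workspace.readText("data.txt")

// include the file content in the prompt in a context-friendly way
def("DATA", file)

// the task: ask for structured output written to data.json
$`Analyze DATA and extract data in JSON in data.json.`
```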
The def function includes the content of the file and optimizes it if necessary for the target LLM. GenAIScript also parses the LLM output and will extract the data.json file automatically.
🚀 Quickstart Guide
Get started quickly by installing the Visual Studio Code Extension or using the command line.
✨ Features
🎨 Stylized JavaScript & TypeScript
Build prompts programmatically using JavaScript or TypeScript.
🚀 Fast Development Loop
Edit, Debug, Run, and Test your scripts in Visual Studio Code or with the command line.
🔗 Reuse and Share Scripts
Scripts are files! They can be versioned, shared, and forked.
📋 Data Schemas
Define, validate, and repair data using schemas. Zod support built in.
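For instance, a schema can be declared with defSchema and referenced from the prompt; a minimal sketch (the schema name and fields are illustrative):

```js
// declare a JSON schema that the generated data must conform to
const schema = defSchema("CITY_SCHEMA", {
    type: "array",
    description: "A list of cities with population information.",
    items: {
        type: "object",
        properties: {
            name: { type: "string", description: "The name of the city" },
            population: { type: "number", description: "The population of the city" },
        },
        required: ["name", "population"],
    },
})

def("DATA", env.files)
$`Extract the cities mentioned in DATA as JSON using the ${schema} schema.`
```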
📄 Ingest Text from PDFs, DOCX, ...
📊 Ingest Tables from CSV, XLSX, ...
Manipulate tabular data from CSV, XLSX, ...
📝 Generate Files
Extract files and diff from the LLM output. Preview changes in Refactoring UI.
🔍 File Search
Grep or fuzz search files.
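A sketch of a grep-style search from a script (the pattern is illustrative; option names may vary slightly across versions):

```js
// search the workspace for files containing TODO markers
const { files } = await workspace.grep(/TODO/)

// add the matching files to the prompt context
def("TODOS", files)
$`Summarize the TODO items found in TODOS.`
```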
Classify
Classify text, images or a mix of all.
LLM Tools
Register JavaScript functions as tools (with fallback for models that don't support tools). Model Context Protocol (MCP) tools are also supported.
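A sketch of a tool registration (the weather endpoint is a placeholder, not a real API):

```js
// register a JavaScript function the LLM can call as a tool
defTool(
    "current_weather",
    "Query the current weather for a city",
    {
        type: "object",
        properties: {
            city: { type: "string", description: "Name of the city" },
        },
        required: ["city"],
    },
    async ({ city }) => {
        // placeholder endpoint; substitute a real weather API
        const res = await fetch(`https://example.com/weather?city=${encodeURIComponent(city)}`)
        return await res.text()
    }
)

$`What is the weather in Seattle?`
```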
LLM Agents
Register JavaScript functions as tools and combine tools + prompt into agents.
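A sketch of an agent declaration, loosely modeled on the built-in git agent (the exact options are assumptions):

```js
// wrap the git tools and a system prompt into an agent
defAgent(
    "git",
    "Query the current repository using git to answer questions",
    `You are a helpful agent that uses git tools to query the current repository.
Answer the question in QUERY.`,
    { tools: ["git"] }
)
```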
You can then use the agent as a tool from another script.
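A sketch of such a caller script (the tools wiring follows the documented convention but may differ by version):

```js
// load the agents as tools and let the LLM decide when to call them
script({ tools: "agent" })
$`Do a statistical analysis of the last commits.`
```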
See the git agent source.
🔍 RAG Built-in
🐙 GitHub Models and GitHub Copilot
Run models through GitHub Models or GitHub Copilot.
💻 Local Models
Run your scripts with open-source models, like Phi-3, using Ollama or LocalAI.
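A script can pin a local model through its configuration; a minimal sketch assuming an Ollama install with Phi-3 pulled:

```js
// run this script against a local Ollama model instead of a hosted provider
script({ model: "ollama:phi3" })
$`Write a haiku about local inference.`
```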
🐍 Code Interpreter
Let the LLM run code in a sandboxed execution environment.
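A sketch of enabling the interpreter through the built-in tool (the tool identifier is taken from the documentation and may differ by version):

```js
// let the LLM write and execute Python in a sandboxed interpreter
script({ tools: ["python_code_interpreter"] })
def("DATA", env.files)
$`Compute the mean of the numeric columns in DATA using Python.`
```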
🐳 Containers
Run code in Docker containers.
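A sketch of starting a container and running a command in it (image and command are illustrative):

```js
// start a disposable container and execute a command inside it
const container = await host.container({ image: "python:alpine" })
const version = await container.exec("python", ["--version"])
console.log(version.stdout)
```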
Video processing
Transcribe and screenshot your videos so that you can feed them efficiently into your LLM requests.
🧩 LLM Composition
Run LLMs to build your LLM prompts.
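A sketch of an inner prompt used to build an outer prompt, following the runPrompt pattern:

```js
// summarize each file with an inner LLM call, then summarize the summaries
for (const file of env.files) {
    const { text } = await runPrompt((_) => {
        _.def("FILE", file)
        _.$`Summarize the FILE. Be concise.`
    })
    def("SUMMARY", text)
}
$`Summarize all the summaries in SUMMARY.`
```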
🅿️ Prompty support
Run your Prompty files as well!
Pluggable Secret Scanning
Scan your chats for secrets using secret scanning.
⚙ Automate with CLI or API
Automate using the CLI or API.
Safety First!
GenAIScript provides built-in Responsible AI system prompts and Azure Content Safety support to validate content safety.
💬 Pull Request Reviews
Integrate into your Pull Requests checks through comments, reviews, or description updates. Supports GitHub Actions and Azure DevOps pipelines.
⭐ Tests and Evals
Build reliable prompts using tests and evals powered by promptfoo.
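Tests can be declared in the script metadata; a minimal sketch (file name and facts are illustrative):

```js
script({
    title: "Penguin data analysis",
    tests: {
        files: "penguins.csv",
        facts: "The report includes population figures for each penguin species",
    },
})
def("DATA", env.files)
$`Analyze DATA and write a short report.`
```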
LLM friendly docs
The entire content of the docs is rendered at https://microsoft.github.io/genaiscript/llms-full.txt in markdown. Feed it directly into your favorite RAG system.
If you are an LLM crawler, add the .md suffix to any documentation URL to get raw markdown content. For example, https://microsoft.github.io/genaiscript/guides/prompt-as-code.md (note the .md extension).
Contributing
We accept contributions! Check out the CONTRIBUTING page for details and developer setup.
Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.
GenAIScript is a JavaScript runtime dedicated to building reliable, automatable LLM scripts. Every GenAIScript can be exposed as an MCP server automatically.