
DriftOS MCP Server

Official
by DriftOS

Server Configuration

Describes the environment variables used to configure the server.

Name              Required  Description                           Default
DRIFTOS_API_URL   No        The URL of the DriftOS backend API    http://localhost:3000
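
As a rough illustration of how this variable is supplied, the sketch below starts the server over stdio from a TypeScript MCP client and passes DRIFTOS_API_URL through the transport environment. The launch command ("node dist/index.js") and client name are assumptions, not the documented entry point; the tool-call examples further down reuse this client.

  import { Client } from "@modelcontextprotocol/sdk/client/index.js";
  import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

  // Launch the DriftOS MCP server over stdio. The command and args are placeholder
  // assumptions; substitute the actual entry point for your installation.
  const transport = new StdioClientTransport({
    command: "node",
    args: ["dist/index.js"],
    env: { DRIFTOS_API_URL: "http://localhost:3000" }, // points at the DriftOS backend API
  });

  const client = new Client({ name: "driftos-example", version: "0.1.0" }, { capabilities: {} });
  await client.connect(transport);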

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

driftos_route_message

Route a message to the appropriate conversation branch using semantic drift detection.

Returns one of three actions:

  • BRANCH: New topic detected, creates a new branch

  • STAY: Message continues the current topic

  • ROUTE: Returns to a previously discussed topic

Args:

  • conversation_id (string): Unique identifier for the conversation

  • content (string): The message content to route

  • role ('user' | 'assistant'): Who sent the message (default: 'user')

Returns: { "action": "BRANCH" | "STAY" | "ROUTE", "branchId": string, "branchTopic": string, "confidence": number, "isNewBranch": boolean }

Example:

  • "I want to buy a house in London" -> BRANCH (new topic)

  • "What areas have good schools?" -> STAY (same topic)

  • "Back to houses - what about mortgage rates?" -> ROUTE (returns to previous branch)

driftos_get_context

Get assembled context for a conversation branch, including messages and facts from related branches.

This is what you pass to an LLM instead of the entire conversation history. Returns only the relevant messages from the current branch plus accumulated facts.

Args:

  • branch_id (string): The branch ID to get context for (returned from route_message)

Returns: { "branchId": string, "branchTopic": string, "messages": [ { "role": "user" | "assistant", "content": string } ], "allFacts": [ { "branchTopic": string, "isCurrent": boolean, "facts": [{ "key": string, "value": string, "confidence": number }] } ] }

Use this to build focused LLM context windows instead of dumping entire conversation history.
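
Continuing the same sketch, the branch ID returned by driftos_route_message can be used to assemble a focused context; parsing the text content block as JSON is again an assumption.

  // Fetch only the context relevant to the active branch.
  const ctxResult = await client.callTool({
    name: "driftos_get_context",
    arguments: { branch_id: routing.branchId },
  });
  const ctx = JSON.parse((ctxResult.content as any)[0].text);

  // Branch messages plus accumulated facts replace the full conversation history.
  const factCount = ctx.allFacts.reduce((n: number, g: any) => n + g.facts.length, 0);
  console.log(`Branch "${ctx.branchTopic}": ${ctx.messages.length} messages, ${factCount} facts`);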

driftos_build_prompt

Build a ready-to-use prompt for LLM calls with context and facts.

Args:

  • branch_id (string): The branch ID to build prompt for

  • system_prompt (string, optional): Custom system prompt prefix

Returns: { "system": string, // Full system prompt with topic and facts "messages": [{ "role": string, "content": string }] // Conversation messages }

Use this to get a complete prompt ready for OpenAI, Anthropic, or other chat completion API calls.
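
As a hedged example, the returned system and messages fields can be forwarded directly to a chat completion API. The OpenAI client and model name below are illustrative assumptions; any provider that accepts a system prompt plus a message array works the same way.

  import OpenAI from "openai";

  const promptResult = await client.callTool({
    name: "driftos_build_prompt",
    arguments: { branch_id: routing.branchId, system_prompt: "You are a helpful assistant." },
  });
  const prompt = JSON.parse((promptResult.content as any)[0].text);

  // Forward the assembled prompt to a chat completion API (model name is illustrative).
  const openai = new OpenAI();
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "system", content: prompt.system }, ...prompt.messages],
  });
  console.log(completion.choices[0].message.content);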

driftos_list_branches

List all branches in a conversation with their topics and message counts.

Use this to understand the structure of a conversation and see what topics have been discussed.

Args:

  • conversation_id (string): Unique identifier for the conversation

Returns: [ { "id": string, "topic": string, "messageCount": number, "isActive": boolean } ]

driftos_get_facts

Get extracted facts from a specific branch.

Args:

  • branch_id (string): The branch ID to get facts for

Returns: [{ "key": string, "value": string, "confidence": number }]

driftos_extract_facts

Trigger fact extraction for a branch. Use when you want to explicitly extract facts from the current conversation state.

Args:

  • branch_id (string): The branch ID to extract facts from

Returns: { "facts": [{ "key": string, "value": string }] }

Prompts

Interactive templates invoked by user choice


No prompts

Resources

Contextual data attached and managed by the client


No resources

MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/DriftOS/driftos-mcp-server'

If you have feedback or need assistance with the MCP directory API, please join our Discord server.