Glama

Server Configuration

Describes the environment variables required to run the server.

Name | Required | Description | Default
LM_STUDIO_URL | No | LM Studio API endpoint URL for visual analysis | http://localhost:1234/v1/chat/completions
OBSBOT_DEVICE | No | Camera device path | /dev/video0
OBSBOT_VL_MODEL | No | Vision-language model name for LM Studio | qwen2.5-vl-7b-instruct
OBSBOT_OUTPUT_DIR | No | Directory for captured images | /tmp/obsbot
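These variables can be exported in the shell before starting the server; all four are optional and fall back to the defaults listed above. A minimal sketch:

```shell
# All four variables are optional; the values shown are the documented defaults.
export LM_STUDIO_URL="http://localhost:1234/v1/chat/completions"  # LM Studio endpoint
export OBSBOT_DEVICE="/dev/video0"                                # camera device path
export OBSBOT_VL_MODEL="qwen2.5-vl-7b-instruct"                   # vision-language model
export OBSBOT_OUTPUT_DIR="/tmp/obsbot"                            # snapshot directory
```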

Capabilities

Server capabilities have not been inspected yet.

Tools

Functions exposed to the LLM to take actions

Name | Description
get_gimbal_position | Get current camera gimbal position (pan, tilt, zoom) with human-readable directions
control_gimbal | Control camera gimbal. Pan: NEGATIVE=RIGHT, POSITIVE=LEFT. Tilt: POSITIVE=UP, NEGATIVE=DOWN.
center_camera | Return camera to center position (pan=0, tilt=0, zoom=0)
take_snapshot | Capture camera snapshot with optional LM Studio visual analysis
look_and_analyze | Move camera to position and take analyzed snapshot
scan_area | Systematically scan area with multiple snapshots
check_system_status | Check camera system health and LM Studio availability
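Note the inverted pan convention on control_gimbal: a negative pan value moves the camera RIGHT. A tool invocation follows the standard MCP `tools/call` JSON-RPC shape; the argument names `pan` and `tilt` below are illustrative assumptions, not taken from this server's published schema:

```shell
# Hypothetical tools/call payload for control_gimbal.
# Per the sign convention above: "pan": -30 pans RIGHT, "tilt": 10 tilts UP.
# Argument names are assumed, not confirmed from the server's tool schema.
payload='{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "control_gimbal",
    "arguments": { "pan": -30, "tilt": 10 }
  }
}'
echo "$payload"
```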

Prompts

Interactive templates invoked by user choice

No prompts

Resources

Contextual data attached and managed by the client

No resources


MCP directory API

We provide all the information about MCP servers via our MCP API.

curl -X GET 'https://glama.ai/api/mcp/v1/servers/Radar105/obsbot-camera-mcp'
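The raw JSON response is easier to read when piped through a pretty-printer; the `python3 -m json.tool` step below is just one option, and the response fields themselves are not documented on this page:

```shell
# Build the documented directory-API URL for this server, then fetch it.
# The endpoint path is taken from the curl example above.
server="Radar105/obsbot-camera-mcp"
url="https://glama.ai/api/mcp/v1/servers/${server}"
echo "$url"
# curl -s "$url" | python3 -m json.tool   # pretty-print the JSON reply
```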

If you have feedback or need assistance with the MCP directory API, please join our Discord server.