Server Configuration

Describes the environment variables used to configure the server. All are optional and fall back to the defaults below.

| Name | Required | Description | Default |
|------|----------|-------------|---------|
| `LM_STUDIO_URL` | No | LM Studio API endpoint URL for visual analysis | `http://localhost:1234/v1/chat/completions` |
| `OBSBOT_DEVICE` | No | Camera device path | `/dev/video0` |
| `OBSBOT_VL_MODEL` | No | Vision-language model name for LM Studio | `qwen2.5-vl-7b-instruct` |
| `OBSBOT_OUTPUT_DIR` | No | Directory for captured images | `/tmp/obsbot` |
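The variables above can be set in the shell before launching the server. A minimal sketch (the values shown are simply the documented defaults made explicit; adjust them for your setup):

```shell
# Optional overrides; each variable falls back to the default in the
# table above when left unset.
export LM_STUDIO_URL="http://localhost:1234/v1/chat/completions"  # LM Studio chat endpoint
export OBSBOT_DEVICE="/dev/video0"                                # camera device path
export OBSBOT_VL_MODEL="qwen2.5-vl-7b-instruct"                   # vision-language model
export OBSBOT_OUTPUT_DIR="/tmp/obsbot"                            # where snapshots are written
```

In practice you would only export the variables whose defaults do not match your environment, e.g. a different `/dev/videoN` device.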

Tools

Functions exposed to the LLM for taking actions.

| Name | Description |
|------|-------------|
| `get_gimbal_position` | Get the current camera gimbal position (pan, tilt, zoom) with human-readable directions |
| `control_gimbal` | Control the camera gimbal. Pan: negative = right, positive = left. Tilt: positive = up, negative = down. |
| `center_camera` | Return the camera to center position (pan=0, tilt=0, zoom=0) |
| `take_snapshot` | Capture a camera snapshot, with optional LM Studio visual analysis |
| `look_and_analyze` | Move the camera to a position and take an analyzed snapshot |
| `scan_area` | Systematically scan an area with multiple snapshots |
| `check_system_status` | Check camera system health and LM Studio availability |
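MCP tools are invoked over JSON-RPC with a `tools/call` request. The sketch below builds such a request for `control_gimbal`, illustrating the pan sign convention noted above; the argument names (`pan`, `tilt`) are assumptions — consult the tool's actual input schema before use:

```shell
# Hypothetical tools/call payload for control_gimbal.
# Pan is negative here, which pans the camera RIGHT per the convention above.
REQUEST=$(cat <<'EOF'
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "control_gimbal",
    "arguments": { "pan": -30, "tilt": 10 }
  }
}
EOF
)
echo "$REQUEST"
```

A client would send this payload to the server over its MCP transport (typically stdio) rather than printing it; the `echo` here is only to show the request shape.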

Prompts

Interactive templates invoked by user choice

No prompts.

Resources

Contextual data attached and managed by the client

No resources.

MCP directory API

We provide all the information about MCP servers via our MCP API.

```shell
curl -X GET 'https://glama.ai/api/mcp/v1/servers/Radar105/obsbot-camera-mcp'
```

If you have feedback or need assistance with the MCP directory API, please join our Discord server.